
Pattern bindings for existential constructors


While writing Haskell as a programmer with prior exposure to Lisp, I noticed something odd that I failed to understand.

This compiles fine:

{-# LANGUAGE NamedFieldPuns #-}
{-# LANGUAGE ExistentialQuantification #-}
data Foo = forall a. Show a => Foo { getFoo :: a }

showfoo :: Foo -> String
showfoo Foo{getFoo} = do
  show getFoo

whereas this fails:

{-# LANGUAGE NamedFieldPuns #-}
{-# LANGUAGE ExistentialQuantification #-}
data Foo = forall a. Show a => Foo { getFoo :: a }

showfoo :: Foo -> String
showfoo foo = do
  let Foo{getFoo} = foo
  show getFoo

To me it's not obvious why the second snippet fails.

The question would be:

Am I missing something, or does this behaviour stem from the fact that Haskell is not homoiconic?

My reasoning is, given that:

  1. Haskell needs to implement record pattern matching as a compiler extension, because of its choice to use syntax rather than data.

  2. Matching in a function head and matching in a let clause are two separate special cases.

it is difficult to understand those special cases, as they can neither be implemented nor looked up directly in the language itself.

As a result, consistent behaviour throughout the language is not guaranteed, especially in combination with additional compiler extensions, as in this example.

PS: compiler error:

error:
    • My brain just exploded
      I can't handle pattern bindings for existential or GADT data constructors.
      Instead, use a case-expression, or do-notation, to unpack the constructor.
    • In the pattern: Foo {getFoo}
      In a pattern binding: Foo {getFoo} = foo
      In the expression:
        do { let Foo {getFoo} = foo;
             show getFoo }

Edit: a different compiler version gives this error for the same problem:

* Couldn't match expected type `p' with actual type `a'
    because type variable `a' would escape its scope
  This (rigid, skolem) type variable is bound by
    a pattern with constructor: Foo :: forall a. Show a => a -> Foo
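For reference, the case-expression form that the first error message recommends does compile; a minimal sketch (showfoo' is just an illustrative name):

{-# LANGUAGE NamedFieldPuns #-}
{-# LANGUAGE ExistentialQuantification #-}
data Foo = forall a. Show a => Foo { getFoo :: a }

-- Unpacking inside a case alternative keeps the existential type
-- variable local to the match, so GHC accepts this variant.
showfoo' :: Foo -> String
showfoo' foo = case foo of
  Foo{getFoo} -> show getFoo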

Solution

  • I thought about this a bit, and although the behaviour seems odd at first, after some thinking I guess one can justify it roughly as follows:

    Say I take your second (failing) example and after some massaging and value replacements I reduce it to this:

    {-# LANGUAGE ExistentialQuantification #-}
    data Foo = forall a. Show a => Foo { getFoo :: a }
    
    main :: IO ()
    main = do
        let Foo x = Foo (5 :: Int)
        putStrLn $ show x
    

    which produces the error:

    Couldn't match expected type ‘p’ with actual type ‘a’ because type variable ‘a’ would escape its scope

    If the pattern matching were allowed, what would the type of x be? Well, the type would of course be Int. However, the definition of Foo says that the type of the getFoo field is any type that is an instance of Show. An Int is an instance of Show, but it is not any type; it is one specific type. In that sense the actual, specific type of the value wrapped in that Foo would become "visible" (i.e. escape) and thus violate our explicit guarantee that forall a . Show a => ...
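    To make that concrete, here is a small sketch (assuming the same Foo declaration as above) in which the unpacking happens inside a case alternative, so the wrapped value is only ever used where its type is still abstract; this variant is accepted:

    main :: IO ()
    main = do
        -- the existential type never leaves the case alternative:
        -- x is bound and consumed by show in the same place
        putStrLn $ case Foo (5 :: Int) of
            Foo x -> show x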

    If we now look at a version of the code that works by using a pattern match in the function declaration:

    {-# LANGUAGE ExistentialQuantification #-}
    {-# LANGUAGE RecordWildCards #-}
    data Foo = forall a . Show a => Foo { getFoo :: !a }
    
    unfoo :: Foo -> String
    unfoo Foo{..} = show getFoo
    
    main :: IO ()
    main = do
        putStrLn . unfoo $ Foo (5 :: Int)
    

    Looking at the unfoo function, we see that there is nothing there claiming that the type inside the Foo is any specific type (an Int or otherwise). In the scope of that function all we have is the original guarantee that getFoo can be of any type which is an instance of Show. The actual type of the wrapped value remains hidden and unknowable, so there are no violations of any type guarantees and happiness ensues.
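    As a quick illustration of "hidden and unknowable": a hypothetical variant (the name leak is mine) that tried to hand the wrapped value back to the caller is rejected by GHC, because there is no honest type we could write for its result:

    -- Rejected by GHC: the signature claims the caller may pick 'a',
    -- but the value inside Foo has some fixed, hidden type.
    leak :: Foo -> a
    leak Foo{..} = getFoo

    The only thing unfoo can legitimately do with getFoo is use the Show instance that the constructor carries along.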

    PS: I forgot to mention that the Int bit was of course just an example. In your case, the type of the getFoo field inside of the foo value is of type a, but this is a specific (non-existential) type that GHC's type inference is referring to (and not the existential a in the type declaration). I only came up with an example using a specific Int type so that it would be easier and more intuitive to understand.