== Type signatures and ambiguity ==

It's quite common for people to write a function definition without a type signature, load it into GHCi, use <tt>:t</tt> to see what type it has, and then cut-and-paste that type into the source code as a type signature. Usually this works fine, but alas not always. Perhaps this is a deficiency in GHC, but here's one way it can happen:

<haskell>
{-# LANGUAGE MultiParamTypeClasses, AllowAmbiguousTypes #-}

class C a b where
  foo :: a -> b

konst :: a -> Bool
konst x = True

f :: (C a b) => a -> Bool
f x = konst (foo x)
</haskell>

If you compile this code with ghc-8.0.1, you'll get this error:

<pre>
Test.hs:10:14: error:
    • Could not deduce (C a a0) arising from a use of ‘foo’
      from the context: C a b
        bound by the type signature for:
                   f :: C a b => a -> Bool
        at Test2.hs:9:1-25
      The type variable ‘a0’ is ambiguous
      Relevant bindings include
        x :: a (bound at Test2.hs:10:3)
        f :: a -> Bool (bound at Test2.hs:10:1)
    • In the first argument of ‘konst’, namely ‘(foo x)’
      In the expression: konst (foo x)
      In an equation for ‘f’: f x = konst (foo x)
</pre>

What's going on? GHC knows, from the type signature, that <tt>x::a</tt>. Then applying <tt>foo</tt> means GHC must pick a return type for <tt>foo</tt>; the error message calls it <tt>a0</tt>, and GHC generates the type constraint <tt>(C a a0)</tt>. The function <tt>konst</tt> just discards its argument, ''so nothing further is known about <tt>a0</tt>''.

Now GHC has finished typechecking the right-hand side of <tt>f</tt>, so next it checks that the constraints needed in the RHS, namely <tt>(C a a0)</tt>, can be satisfied from the constraints provided by the type signature, namely <tt>(C a b)</tt>. Alas, there is nothing to tell GHC that <tt>b</tt> and <tt>a0</tt> should be identified; hence the complaint. (Probably you meant to put a functional dependency in the class declaration, thus

<haskell>
class C a b | a -> b where ...
</haskell>

but you didn't.)
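Here is a minimal, self-contained sketch of that fix. The <tt>Char</tt> instance below is purely illustrative and not part of the original example, and the dependency needs the <tt>FunctionalDependencies</tt> extension:

<haskell>
{-# LANGUAGE MultiParamTypeClasses, FunctionalDependencies #-}

class C a b | a -> b where
  foo :: a -> b

-- Illustrative instance: once 'a' is Char, 'b' can only be Bool.
instance C Char Bool where
  foo _ = True

konst :: a -> Bool
konst x = True

-- The fundep identifies the result type of 'foo x' with the 'b' in the
-- signature, so the wanted constraint is discharged by the given (C a b).
f :: C a b => a -> Bool
f x = konst (foo x)
</haskell>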

The surprise is that if you comment out the type signature for <tt>f</tt>, the module will load fine into GHCi! Furthermore, <tt>:t</tt> will report a type for <tt>f</tt> that is exactly the same as the type signature that was rejected!
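For example, a GHCi session along these lines (the prompt and module name are illustrative):

<pre>
*Main> :t f
f :: C a b => a -> Bool
</pre>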

Here's what's happening. Without the type signature, GHC picks a fresh type variable for <tt>x</tt>, say <tt>x::a</tt>. Then applying <tt>foo</tt> means GHC must pick a return type for <tt>foo</tt>, say <tt>b</tt>, and generates the type constraint <tt>(C a b)</tt>. The function <tt>konst</tt> just discards its argument, so nothing further is known about <tt>b</tt>. Finally, GHC gathers up all the constraints arising from the right-hand side, namely <tt>(C a b)</tt>, and puts them into the inferred type of <tt>f</tt>. So GHC ends up saying that <tt>f :: (C a b) => a -> Bool</tt>.

This is probably a very stupid type. Suppose you called <tt>f</tt> thus: <tt>(f 'a')</tt>. Then you'd get a constraint <tt>(C Char b)</tt> where nothing is known about <tt>b</tt>. If the instances of <tt>C</tt> constrain both type parameters, you'd be in trouble:

<haskell>
instance C Char Bool where ...
</haskell>

The call gives a <tt>(C Char b)</tt> constraint, with absolutely no way to fix <tt>b</tt> to be <tt>Bool</tt>, or indeed anything else. We're back to very much the same situation as before; it's just that the error is deferred until we call <tt>f</tt>, rather than when we define it.
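Here is a minimal sketch of that deferred failure; the instance body and the commented-out call are illustrative additions, not part of the original example:

<haskell>
{-# LANGUAGE MultiParamTypeClasses #-}

class C a b where
  foo :: a -> b

-- An instance that fixes both parameters.
instance C Char Bool where
  foo _ = True

konst :: a -> Bool
konst x = True

-- No signature: GHC infers  f :: C a b => a -> Bool  and accepts it.
f x = konst (foo x)

-- Uncommenting this is rejected: the call needs (C Char b0) for some
-- unknown b0, and nothing ever forces b0 to be Bool.
-- bad = f 'a'
</haskell>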

(However, notice that the call <tt>(f 'a')</tt> would be OK if there was an instance like:

<haskell>
instance C Char w where ...
</haskell>

Now the constraint <tt>(C Char b)</tt> matches the instance declaration, even though we know nothing about <tt>b</tt>.)
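A sketch of that variant; note that a bare type variable in an instance head needs <tt>FlexibleInstances</tt>, and the instance body here is just a placeholder:

<haskell>
{-# LANGUAGE MultiParamTypeClasses, FlexibleInstances #-}

class C a b where
  foo :: a -> b

-- A catch-all instance: any second parameter will do.
instance C Char w where
  foo _ = error "placeholder; never evaluated in this sketch"

konst :: a -> Bool
konst x = True

f x = konst (foo x)   -- inferred: f :: C a b => a -> Bool

ok :: Bool
ok = f 'a'            -- accepted: (C Char b) matches 'instance C Char w'
                      -- even though b is never fixed
</haskell>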

This behaviour isn't ideal. It really only arises in programs that are ambiguous anyway (that is, they could never really work), but it is undoubtedly confusing. But I don't know an easy way to improve it. Yet, anyway.