GHC/TypeSigsAndAmbiguity (HaskellWiki), new page by Simonpj, 2012-10-31

== Type signatures and ambiguity ==

It's quite common for people to write a function definition without a type signature, load it into GHCi, use <tt>:t</tt> to see what type it has, and then cut-and-paste that type into the source code as a type signature. Usually this works fine, but alas not always. Perhaps this is a deficiency in GHC, but here's one way it can happen:
<haskell>
class C a b where
  foo :: a -> b

konst :: a -> Bool
konst x = True

f :: (C a b) => a -> Bool
f x = konst (foo x)
</haskell>
If you compile this code, you'll get this error:
<pre>
Foo1.hs:12:13:
    Could not deduce (C a b1) from the context (C a b)
      arising from use of `foo' at Foo1.hs:12:13-17
    Possible fix: add (C a b1) to the type signature(s) for `f'
    In the first argument of `konst', namely `(foo x)'
    In the expression: konst (foo x)
    In the definition of `f': f x = konst (foo x)
</pre>
What's going on? GHC knows, from the type signature, that <tt>x :: a</tt>. Then applying <tt>foo</tt> means GHC must pick a return type for <tt>foo</tt>, say <tt>b1</tt>, and generates the type constraint <tt>(C a b1)</tt>. The function <tt>konst</tt> just discards its argument, ''so nothing further is known about <tt>b1</tt>''.

Now GHC has finished typechecking the right-hand side of <tt>f</tt>, so next it checks that the constraints ''needed'' by the RHS, namely <tt>(C a b1)</tt>, can be satisfied from the constraints ''provided'' by the type signature, namely <tt>(C a b)</tt>. Alas, there is nothing to tell GHC that <tt>b</tt> and <tt>b1</tt> should be identified; hence the complaint. (Probably you meant to put a functional dependency in the class declaration, thus
<haskell>
class C a b | a -> b where ...
</haskell>
but you didn't.)
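For reference, here is the whole example with the functional dependency added; the previously rejected signature for <tt>f</tt> now typechecks, because the dependency forces <tt>b</tt> and <tt>b1</tt> to be identified. (The <tt>C Char Bool</tt> instance and <tt>main</tt> are illustrative additions, just so the sketch is a complete runnable program.)
<haskell>
{-# LANGUAGE MultiParamTypeClasses, FunctionalDependencies #-}

-- The dependency "a -> b" says b is determined by a, so the
-- otherwise-unconstrained result type of foo is no longer ambiguous.
class C a b | a -> b where
  foo :: a -> b

konst :: a -> Bool
konst x = True

f :: (C a b) => a -> Bool   -- the signature that was rejected without the fundep
f x = konst (foo x)

-- Illustrative instance so f can actually be called:
instance C Char Bool where
  foo _ = True

main :: IO ()
main = print (f 'a')   -- prints True
</haskell>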

The surprise is that if you comment out the type signature for <tt>f</tt>, the module loads into GHCi just fine! Furthermore, <tt>:t</tt> reports a type for <tt>f</tt> that is exactly the same as the type signature that was rejected!

Here's what's happening. Without the type signature, GHC picks an arbitrary type for <tt>x</tt>, say <tt>x :: a</tt>. Then applying <tt>foo</tt> means GHC must pick a return type for <tt>foo</tt>, say <tt>b</tt>, and generates the type constraint <tt>(C a b)</tt>. The function <tt>konst</tt> just discards its argument, so nothing further is known about <tt>b</tt>. Finally, GHC gathers up all the constraints arising from the right-hand side, namely <tt>(C a b)</tt>, and puts them into the inferred type of <tt>f</tt>. So GHC ends up saying that <hask>f :: (C a b) => a -> Bool</hask>.

This is probably a very stupid type. Suppose you called <tt>f</tt> thus: <tt>(f 'a')</tt>. Then you'd get a constraint <tt>(C Char b)</tt> where nothing is known about <tt>b</tt>. If the instances of <tt>C</tt> constrain both type parameters, you'd be in trouble:
<haskell>
instance C Char Bool where ...
</haskell>
The call gives a <tt>(C Char b)</tt> constraint, with absolutely no way to fix <tt>b</tt> to be <tt>Bool</tt>, or indeed anything else. We're back to very much the same situation as before; it's just that the error is deferred until we call <tt>f</tt>, rather than raised when we define it.

(However, notice that the call <tt>(f 'a')</tt> would be OK if there were an instance like:
<haskell>
instance C Char w where ...
</haskell>
Now the constraint <tt>(C Char b)</tt> matches the instance declaration, even though we know nothing about <tt>b</tt>.)
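To see this concretely, here is a complete sketch of that situation (the instance body and <tt>main</tt> are illustrative additions). <tt>f</tt> is deliberately left without a signature, so GHC infers the ambiguous type, and the call <tt>(f 'a')</tt> is nevertheless accepted because <tt>(C Char b)</tt> matches the fully polymorphic instance head:
<haskell>
{-# LANGUAGE MultiParamTypeClasses, FlexibleInstances #-}

class C a b where
  foo :: a -> b

konst :: a -> Bool
konst x = True

-- No signature: GHC infers f :: C a b => a -> Bool.
f x = konst (foo x)

-- The second parameter is a bare type variable, so (C Char b)
-- matches no matter what b turns out to be.  (FlexibleInstances
-- permits the variable in the instance head.)
instance C Char w where
  foo _ = undefined   -- never evaluated: konst discards its argument

main :: IO ()
main = print (f 'a')   -- prints True; the undefined is never forced
</haskell>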

This behaviour isn't ideal. It really only arises in programs that are ambiguous anyway (that is, they could never really work), but it is undoubtedly confusing. I don't know an easy way to improve it. Yet, anyway.