Combinatory logic
Revision as of 14:28, 3 March 2006
Portals and other large-scale resources
Implementing CL
- Talks about it at haskell-cafe
- Lots of interpreters at John's Lambda Calculus and Combinatory Logic Playground
- CL++, a lazy-evaluating combinatory logic interpreter with some computer algebra services: e.g. it can reply to a question in a compact form instead of with a huge amount of parentheses and combinators. Unfortunately I did not write it directly in English, so all documentation, source code and libraries are in Hungarian. I want to rewrite it using more advanced Haskell programming concepts (e.g. monads or attribute grammars) and directly in English.
Programming in CL
I think many thoughts from John Hughes' Why Functional Programming Matters can be applied to programming in Combinatory Logic. And almost all concepts used in the Haskell world (catamorphisms etc.) help us a lot here too. Combinatory logic is a powerful and concise programming language. I wonder how functional logic programming could be done by using the concepts of Illative combinatory logic, too.
Datatypes
Continuation passing for polynomial datatypes
Direct product
Let us begin with the notion of the ordered pair. We know this construct well from defining operations for booleans and Church numbers. I think, in general, when defining datatypes in a continuation-passing way (e.g. Maybe or direct sum), operations on the so-defined datatypes often turn out to be well-definable.
We define it with
in λ-calculus and
in combinatory logic.
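The ordered-pair construct above can be sketched in Haskell, where a continuation-passing pair is a function awaiting a selector; the names pair, fstC and sndC are mine, not from the text:

```haskell
-- A continuation-passing ("Church") pair: the pair stores x and y by
-- waiting for a continuation f and applying it to both components.
pair :: a -> b -> (a -> b -> c) -> c
pair x y f = f x y

-- Projections just pass an appropriate selector continuation.
fstC :: ((a -> b -> a) -> c) -> c
fstC p = p (\x _ -> x)

sndC :: ((a -> b -> b) -> c) -> c
sndC p = p (\_ y -> y)
```

Here fstC (pair x y) reduces to x by nothing more than function application, which is the point of the continuation-passing encoding.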
A nice generalization scheme:
- as the construct can be generalized to any natural number (the concept of n-tuple; see Barendregt's Lambda Calculus)
- and in this generalized scheme corresponds to the 0 case, to the 1 case, and the ordered pair construct to the 2 case, as though defining
so we can write definition
or the same
in a more interesting way:
Is this generalizable? I do not know. I know an analogy in the case of , , , .
Direct sum
The notion of ordered pair mentioned above really enables us to deal with direct products. What about its dual concept? How do we make direct sums in Combinatory Logic? And after we have implemented it, how can we see that it is really the dual concept of the direct product?
A nice argument described on David Madore's Unlambda page gives us a continuation-passing-style solution. We expect reductions like
so we define
now we translate it from λ-calculus into combinatory logic:
Of course, we can recognize Haskell's Either (Left, Right).
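The continuation-passing direct sum can likewise be sketched in Haskell; left and right are my names for the injections, mirroring Haskell's Left and Right:

```haskell
-- Continuation-passing direct sum: a sum value takes a left-continuation
-- and a right-continuation and applies the one matching its injection.
left :: a -> (a -> c) -> (b -> c) -> c
left x l _ = l x

right :: b -> (a -> c) -> (b -> c) -> c
right y _ r = r y
```

Case analysis is just application in this encoding: left x l r reduces to l x, and right y l r reduces to r y.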
Maybe
Let us remember Haskell's maybe:
maybe :: a' -> (a -> a') -> Maybe a -> a'
maybe n j Nothing  = n
maybe n j (Just x) = j x
thinking of
- n as nothing-continuation
- j as just-continuation
In a continuation-passing style approach, if we want to implement something like the Maybe construct in λ-calculus, then we may expect the following reductions:
we know both of them well: one is just the constant combinator K, and we remember the other too from the direct sum:
thus their definition is
where both have a common definition.
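A Haskell sketch of the expected reductions, with nothingC and justC as my names for the continuation-passing encoding:

```haskell
-- Continuation-passing Maybe: nothingC returns the nothing-continuation n
-- unchanged (it is exactly the K combinator), while justC x feeds x to
-- the just-continuation j.
nothingC :: c -> (a -> c) -> c
nothingC n _ = n

justC :: a -> c -> (a -> c) -> c
justC x _ j = j x
```

For example, justC 41 0 (+ 1) reduces to (+ 1) 41, while nothingC 0 (+ 1) reduces to 0.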
Catamorphisms for recursive datatypes
List
Let us define the concept of list by its catamorphism (see Haskell's foldr):
a list (each concrete list) is a function taking two arguments
- a two-parameter function argument (cons-continuation)
- a zero-parameter function argument (nil-continuation)
and returns a value coming from a term built by applying cons-continuations and nil-continuations in the same shape as the corresponding list. E.g., having defined
the expression
reduces to
But how do we define cons and nil? In λ-calculus, we would like to see the following reductions:
Let us think of the variables as denoting the head, the tail, the cons-continuation, and the nil-continuation, respectively.
Thus, we could achieve this goal with the following definitions:
Using the formulating combinators described in Haskell B. Curry's Combinatory Logic I, we can translate these definitions into combinatory logic without any pain:
Of course we could use the two parameters in the opposite order, but I am not yet sure that it would provide an easier way.
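The cons/nil definitions and the oneTwoThree example can be checked with a Haskell sketch of Church-encoded lists (the names are mine):

```haskell
-- A Church-encoded list is its own foldr: it takes a cons-continuation c
-- and a nil-continuation n and rebuilds itself from them.
nil :: (a -> r -> r) -> r -> r
nil _ n = n

cons :: a -> ((a -> r -> r) -> r -> r) -> (a -> r -> r) -> r -> r
cons x xs c n = c x (xs c n)

-- The example list from the text.
oneTwoThree :: (Int -> r -> r) -> r -> r
oneTwoThree = cons 1 (cons 2 (cons 3 nil))
```

Applying it as oneTwoThree (+) 0 reduces to 1 + (2 + (3 + 0)), exactly the reduction shown above, while oneTwoThree (:) [] rebuilds the ordinary Haskell list [1,2,3].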
A little practice: let us define concat. In Haskell, we can do that by
concat = foldr (++) []
which corresponds in combinatory logic to reducing
Let us use the ordered pair (direct product) construct:
and if I use that nasty construct (see later):
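On the Haskell side, the definition of concat is exactly the fold given above; concat' is my name, chosen only to avoid clashing with the Prelude:

```haskell
-- concat as a right fold: replace every cons of the outer list by (++)
-- and its nil by [].
concat' :: [[a]] -> [a]
concat' = foldr (++) []
```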
Monads in Combinatory Logic?
Concrete monads
Maybe as a monad
return
Implementing the return monadic method for the Maybe monad is rather straightforward, both in Haskell and CL:
instance Monad Maybe where
    return = Just
    ...
map
Haskell:
instance Functor Maybe where
    fmap f = maybe Nothing (Just . f)
λ-calculus: expected reductions:
Definition:
Combinatory logic: we expect the same reduction here too
let us get rid of one parameter:
now we have the definition:
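The Haskell counterpart of this map for Maybe, written with the maybe catamorphism as in the instance above (mapMaybe' is my name):

```haskell
-- map for Maybe via the maybe catamorphism: Nothing stays Nothing,
-- Just x becomes Just (f x).
mapMaybe' :: (a -> b) -> Maybe a -> Maybe b
mapMaybe' f = maybe Nothing (Just . f)
```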
bind
Haskell:
instance Monad Maybe where
    p >>= f = maybe Nothing f p
λ-calculus: we expect
achieved by the definition
In combinatory logic the above expected reduction
getting rid of the outermost parameter
yielding definition
and of course
But the other way (starting with a better chosen parameter order) is much better:
yielding the much simpler and more efficient definition:
We know already that this can be seen as a member of the scheme of tuples. As the tuple construction is a usual guest at things like this (we shall meet it again at lists and at other Maybe-operations), let us express the above definition accordingly:
hoping that this will enable us some interesting generalization in the future.
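A Haskell sketch of the bind just derived, using the parameter order the text prefers (bindMaybe is my name):

```haskell
-- Monadic bind for Maybe via the maybe catamorphism: Nothing propagates,
-- Just x is fed to the continuation f.
bindMaybe :: Maybe a -> (a -> Maybe b) -> Maybe b
bindMaybe p f = maybe Nothing f p
```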
But why have we not made a braver generalization and expressed monadic bind from monadic join and map? Later, in the list monad, we shall see that it may be better to avoid this for the sake of deforestation. Here a perhaps similar problem appears: the problem of a superfluous construct.
join
We should think of changing the architecture if we suspect that we could avoid join and solve the problem with a simpler construct.
The list as a monad
Let us think of our list-operations as implementing monadic methods of the list monad. We can express this by definitions too, e.g.
we could name
Now let us see mapping a list, concatenating a list, binding a list. Mapping and binding have a common property: yielding nil for nil. I shall say these operations are centred: their definitions would contain a common subexpression. Thus I shall give a name to this subexpression:
Now let us define map and bind for lists:
now we see it was worth defining a common subexpression. But to tell the truth, it may be a trap: it breaks a symmetry. We should always define the cons part and the nil part of the foldr construct on the same level, always together. Modularization should be pointed in this direction, and not run forward into a dead end.
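For comparison, map for lists written foldr-style in Haskell, showing the "nil for nil" shape mentioned above (mapList is my name):

```haskell
-- map for lists as a right fold: the nil-continuation is []
-- ("nil for nil"); the cons-continuation applies f and re-conses.
mapList :: (a -> b) -> [a] -> [b]
mapList f = foldr (\a acc -> f a : acc) []
```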
Another remark: of course we can get the monadic bind for lists
But we used append here. How do we define it? It is surprisingly simple. Let us think how we would define it in Haskell by foldr, if it were not already defined as ++ in the Prelude:
In defining
(++) list1 list2
we can do it by foldr:
(++) []       list2 = list2
(++) (a : as) list2 = a : (++) as list2
thus
(++) list1 list2 = foldr (:) list2 list1
let us see how we should reduce its corresponding expression in Combinatory Logic:
thus
Thus, we have defined monadic bind for lists. I shall call this the deforested bind for lists. Of course, we could define it another way too: by concat and map, which corresponds to defining monadic bind from monadic map and monadic join. But I think that way would force my CL-interpreter to manage temporary lists, so I chose the deforested definition instead.
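Both steps can be sketched in Haskell: append via foldr as derived above, and the deforested bind that folds the results of f directly into the accumulator instead of building a temporary list of lists (append and bindList are my names):

```haskell
-- Append via foldr: rebuild list1 with (:) on top of list2.
append :: [a] -> [a] -> [a]
append list1 list2 = foldr (:) list2 list1

-- Deforested bind: each element's result list is appended straight onto
-- the accumulator, so no intermediate list of lists is ever built.
bindList :: [a] -> (a -> [b]) -> [b]
bindList xs f = foldr (\a acc -> append (f a) acc) [] xs
```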
Defining the other monadic operation: return for lists is easy:
instance Monad [] where
    return = (: [])
in Haskell -- we know,
(: [])
translates to
return = flip (:) []
so
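On the Haskell side, this return for lists can be sketched and checked directly (returnList is my name):

```haskell
-- return for the list monad: wrap a value into a singleton list.
returnList :: a -> [a]
returnList = (: [])     -- equivalently: flip (:) []
```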
How to AOP with monads in Combinatory Logic?
We have defined monadic list in CL. Of course we can make monadic Maybe, binary tree, Error monad with direct sum constructs...
But separation of concerns by monads is more than having a bunch of special monads. It requires other possibilities too: e.g. being able to use monads generically, so that generic code can be instantiated with any concrete monad.
Of course my simple CL interpreter does not know anything about type classes or overloading. But there is a rather restricted and static possibility provided by the concept of definition itself:
and if later we want to change the binding mode named A, e.g. from a failure-handling Maybe-like one to a more general indeterminism-handling list-like one, then we can do that simply by replacing the definition
with definition
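The "replace one definition" idea can be sketched in Haskell: generic code is written against the names bindA and returnA, and switching the binding mode means editing only those two definitions. All names here are mine, and the list-like mode is arbitrarily chosen as the active one:

```haskell
-- Active binding mode: list-like, indeterminism-handling.
-- A Maybe-like, failure-handling mode would instead define
--   bindA p f = maybe Nothing f p  and  returnA = Just
bindA :: [a] -> (a -> [b]) -> [b]
bindA xs f = concatMap f xs

returnA :: a -> [a]
returnA x = [x]

-- Generic client code: unchanged whichever binding mode is active.
twice :: Int -> [Int]
twice x = bindA (returnA x) (\y -> returnA (y * 2))
```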
Illative Combinatory Logic
Systems of Illative Combinatory Logic complete for first-order propositional and predicate calculus by Henk Barendregt, Martin Bunder, Wil Dekkers.
I think a combinator of Illative Combinatory Logic can be thought of as something analogous to dependent types: it seems to me that the dependent type construct of Epigram corresponds to it. I think, e.g., the following should correspond to each other:
My dream is making something in Illative Combinatory Logic. Maybe it could be a theoretical base for a functional logic language?