Recursive function theory

Introduction

Designed languages

  • Dr Matt Fairtlough's Minimal Programming Language (MIN) is not exactly a recursive function theory language, but it is also based on natural numbers, and its equivalence in power with the partial recursive functions is shown in its description.

Implementations

A Haskell implementation, among others (e.g. one written in Java), can be found in Dr Matt Fairtlough's lecture notes (see the bottom of the page).

Motivations

Well-known concepts are taken from [Mon:MathLog], but several new notations (only notations, not concepts) are introduced to reflect all the concepts described in [Mon:MathLog], and some simplifications are made (by allowing zero-arity generalizations). The plan is to achieve formalizations that will allow us, in the future, to incarnate the main concepts of recursive function theory in a toy programming language, so that we can play with it and some interesting concepts can be taught in an enjoyable way.

This page is only loosely related to Haskell (and to functional programming). It seems to me that (programming in) recursive function theory may be another world (e.g. currying is missing, too) -- although the lack of variables (even of formal parameters) can yield a feeling resembling pointfree style programming, or even combinatory logic.

But despite its weak (direct) relatedness to functional programming, this page may still be useful to someone in the future:

  • e.g. when writing about quines (self-printing programs or self-representing formulas): David Madore's Quines (self-replicating programs) page (http://www.madore.org/~david/computers/quine.html) uses recursive function theory to explain the theoretical roots of quines (e.g. the fixed point theorem);
  • or when writing about other general concepts of computer science.

Another slight connection of this topic to Haskell is that a Haskell implementation of the mentioned toy programming language may use

  • tricks with types, type arithmetic
  • or metaprogramming concepts, at worst preprocessing steps

because type-safe implementations of the operations described below do not look straightforward to me.
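
As an illustration of the first item, here is a minimal, hypothetical sketch (not the implementation referenced above) of how "type arithmetic" could track arities: a type-level natural indexes the curried type of an n-ary function over the naturals, so that arity mismatches become type errors.

  {-# LANGUAGE DataKinds, KindSignatures, TypeFamilies #-}

  module AritySketch where

  import Data.Kind (Type)
  import Numeric.Natural (Natural)

  -- Type-level arity index.
  data Nat = Z | S Nat

  -- The curried type of an n-ary function over the naturals.
  type family Arity (n :: Nat) :: Type where
    Arity 'Z     = Natural
    Arity ('S n) = Natural -> Arity n

  -- A zero-arity constant and the unary successor function,
  -- with their arities reflected in their types.
  zero :: Arity 'Z
  zero = 0

  suc :: Arity ('S 'Z)
  suc = (+ 1)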

Primitive recursive functions

Type system

Initial functions

Constant

This allows us to deal with a concept of zero in recursive function theory. In the literature (in [Mon:MathLog], too) this aim is achieved in another way: a unary constant zero function is defined instead. Is this approach superfluously overcomplicated? Can we avoid it and use the simpler but more indirect-looking zero-arity constant approach?

Are these approaches equivalent? Is the latter (simpler-looking) one as powerful as the former one? Could we define the unary constant zero function using the zero-arity approach? Let us try: compose the zero-arity constant with an empty list of unary inner functions (see the definition of composition somewhat below). This looks like it works, but it raises new questions: what about generalizing the operations (here: composition) to deal with zero-arity cases in an appropriate way? E.g. such degenerate compositions, whose results can be regarded as m-ary functions throwing all their arguments away and returning a constant.

Does it take a generalization to allow such cases, or can they be inferred? A practical approach to settling such questions: let us write a Haskell program which implements (at least partially) recursive function theory. Then we can see clearly which things have to be defined and which are consequences. I think the zero-arity constant construct is a rather straightforward thing.

Why all this can be important: it may be exactly the zero-arity constant that saves us from having to define the concept of zero in recursive function theory as a unary function -- the latter may be superfluous: if we need functions that throw away (some or all of) their arguments and return a constant, then we can combine them from the zero-arity constant, the successor function and composition, if we allow concepts like zero-arity composition (as sketched below).
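
To make the argument concrete, here is a minimal untyped Haskell sketch. The representation of an n-ary function as a function on argument lists (so arities are not checked) is an assumption of the sketch, not part of the theory's notation.

  import Numeric.Natural (Natural)

  -- An n-ary function, with its argument tuple passed as a list.
  type Fn = [Natural] -> Natural

  -- The zero-arity constant: it ignores its (empty) argument tuple.
  zeroC :: Fn
  zeroC _ = 0

  -- Composition, generalized so that the list of inner functions may be empty.
  compose :: Fn -> [Fn] -> Fn
  compose f gs xs = f (map ($ xs) gs)

  -- The literature's unary constant zero function, recovered by composing the
  -- zero-arity constant with an empty list of inner functions: the result
  -- throws its argument away and returns 0.
  zeroFn1 :: Fn
  zeroFn1 = compose zeroC []

For example, zeroFn1 [5] evaluates to 0; more generally, compose zeroC [] behaves as an m-ary constant zero for any m.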

Successor function

Projection functions

For every arity m and every index i with 1 ≤ i ≤ m, the m-ary projection returns its i-th argument unchanged.
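
In the same hypothetical list-based sketch as above (1-based indexing is an assumption made here):

  import Numeric.Natural (Natural)

  -- An n-ary function, with its argument tuple passed as a list.
  type Fn = [Natural] -> Natural

  -- The m-ary projection returning the i-th argument (counting from 1),
  -- e.g. proj 2 [3, 7, 1] == 7.
  proj :: Int -> Fn
  proj i xs = xs !! (i - 1)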

Operations

Composition

This resembles one of the combinators of Combinatory Logic (as described in [HasFeyCr:CombLog1, 171]). If we prefer to avoid the notion of the nested tuple, we can use a more homogeneous style (somewhat resembling currying). The underbrace used in that notation should not mislead us -- it does not denote any real bracketing; it is only meant to remind us of the original, tuple-based form.
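
The remark about avoiding explicit tuples can be echoed in the list-based sketch used above: composition can be written without ever naming the argument tuple, which gives exactly the pointfree feeling mentioned in the motivation section.

  import Numeric.Natural (Natural)

  -- An n-ary function, with its argument tuple passed as a list.
  type Fn = [Natural] -> Natural

  -- Composition of an outer function with a list of inner functions, written
  -- pointfree: in the function (reader) monad, sequence feeds the same
  -- argument tuple to every inner function.
  compose :: Fn -> [Fn] -> Fn
  compose f gs = f . sequence gs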

Primitive recursion

The last equation resembles one of the combinators of Combinatory Logic (as described in [HasFeyCr:CombLog1, 169]).
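
In the same hypothetical list-based sketch, the primitive recursion operator might look as follows; the ordering of the step function's arguments (previous result, predecessor, remaining arguments) is an assumption of the sketch, not the page's notation.

  import Numeric.Natural (Natural)

  -- An n-ary function, with its argument tuple passed as a list.
  type Fn = [Natural] -> Natural

  -- Primitive recursion: from an m-ary base case f and an (m+2)-ary step case g,
  -- build the (m+1)-ary function that recurses on its first argument.
  primRec :: Fn -> Fn -> Fn
  primRec f _ (0 : xs) = f xs
  primRec f g (n : xs) = g (primRec f g (n - 1 : xs) : (n - 1) : xs)
  primRec _ _ []       = error "primRec: missing recursion argument"

For example, primRec head (\(r : _) -> r + 1) implements addition; applied to [3, 4] it yields 7.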

General recursive functions

Everything seen above, and the new concepts:

Type system

See the definition of being special in [Mon:MathLog, 45]. This property ensures that minimalization does not lead us out of the world of total functions. Its definition is a rather straightforward formalization of this expectation.

It resembles the concept of an inverse -- more exactly, the existence part of it.
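
The "existence part" can be illustrated in the list-based sketch, though only as a search: being special itself is not decidable, so the best a program can do is look for a witness at one particular argument tuple.

  import Numeric.Natural (Natural)

  -- An n-ary function, with its argument tuple passed as a list.
  type Fn = [Natural] -> Natural

  -- Searches for a root of f in its first argument, the remaining arguments
  -- being fixed.  A special function has such a root for every xs, so for
  -- special functions the search always terminates; in general it may diverge.
  rootExists :: Fn -> [Natural] -> Bool
  rootExists f xs = any (\n -> f (n : xs) == 0) [0 ..]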


Operations

Minimalization

Minimalization does not lead us out of the world of total functions if we use it only on special functions -- the property of being special is defined exactly for this purpose [Mon:MathLog, 45]. As we can see, minimalization is a concept somewhat resembling the concept of an inverse.

Existence of the required minimum value of the set -- a necessary and sufficient condition for this is that the set is never empty. And this is equivalent to the statement that the function in question is special.
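
A sketch of the operation in the list-based representation used above (only an illustration: for non-special functions the search below diverges instead of being undefined in a detectable way):

  import Numeric.Natural (Natural)

  -- An n-ary function, with its argument tuple passed as a list.
  type Fn = [Natural] -> Natural

  -- Minimalization: the least n such that f, applied to n prepended to the
  -- remaining arguments, yields 0.  For a special f the searched set is never
  -- empty, so the head is well defined.
  mu :: Fn -> Fn
  mu f xs = head [ n | n <- [0 ..], f (n : xs) == 0 ]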

Partial recursive functions

Everything seen above, but new constructs are provided, too.

Type system

Question: does it make sense to define the zero-arity case for partial functions in another way than by simply reusing the total one? In other words, should there be a genuinely partial constant -- a zero-arity construct that may also be undefined -- or not?

Operations

Their definitions are straightforward extensions of the corresponding definitions for total functions.

Remark: these operations take partial functions as arguments, but they are total operations themselves in the sense that they always yield a result -- at worst the empty function (as an ultimate partial function).
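
As a sketch of what such an extension could look like, partiality can be modelled with Maybe (an assumption made here for illustration; genuine non-termination of partial recursive functions is not captured by this):

  import Numeric.Natural (Natural)

  -- A partial n-ary function: Nothing marks tuples where it is undefined.
  type PartFn = [Natural] -> Maybe Natural

  -- Composition extended to partial operands: the result is defined exactly
  -- where all the inner functions and then the outer function are defined.
  composeP :: PartFn -> [PartFn] -> PartFn
  composeP f gs xs = mapM ($ xs) gs >>= f

  -- The operation itself is total: it always yields some partial function,
  -- at worst the empty (nowhere defined) one.
  emptyFn :: PartFn
  emptyFn _ = Nothing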

Bibliography

[HasFeyCr:CombLog1]
Curry, Haskell B.; Feys, Robert; Craig, William: Combinatory Logic. Volume I. North-Holland Publishing Company, Amsterdam, 1958.
[Mon:MathLog]
Monk, J. Donald: Mathematical Logic. Springer-Verlag, New York, Heidelberg, Berlin, 1976.