From HaskellWiki
Revision as of 15:38, 3 May 2007

The Arrow class lives in the base package:

 import Control.Arrow


1 Practice

Reasons why it may be worth solving a specific problem with arrows (instead of monads) can be read in a message from Daan Leijen.

2 Library

3 Examples

Various concepts follow here, each of which can be seen as a concrete instance of the arrow concept. Not all of them provide links to Haskell-related materials: some are included only to keep the material self-contained (e.g. section #Automaton gives links only to the finite state concept itself).

3.1 Function

Arrow operations on plain functions
are rather straightforward. For implementations of these
and related concepts, see Prelude extensions#Tuples.
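For the function arrow (->), the arrow operations reduce to simple plumbing on values and pairs; a minimal sketch (the helper names swapPair, stats and incFirst are made up for illustration):

```haskell
import Control.Arrow

-- For (->), arr is the identity embedding, first applies a function
-- to the first component of a pair, and (&&&) duplicates the input
-- and feeds it to both functions.

swapPair :: (a, b) -> (b, a)
swapPair = snd &&& fst        -- build a pair from both projections

stats :: [Double] -> (Double, Int)
stats = sum &&& length        -- total and count computed from one input

incFirst :: (Int, String) -> (Int, String)
incFirst = first (+ 1)        -- touch only the first component
-- swapPair (1, "a") == ("a", 1)
-- stats [1, 2, 3]   == (6.0, 3)
-- incFirst (1, "x") == (2, "x")
```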

3.2 Parser

The reasons why the arrow concept can solve important problems in designing a parser library are explained in Generalising Monads to Arrows written by John Hughes.

A good example of such arrow parsers can be seen in A New Notation for Arrows written by Ross Paterson: figures 2, 4 and 6 (pages 3, 5 and 6):

\mathrm{expr} ::= \mathrm{term}\;\mathrm{exprTail}
\mathrm{exprTail} ::= \mathbf{PLUS}\;\mathrm{term}\;\mathrm{exprTail}\;\vert\;\mathbf{MINUS}\;\mathrm{term}\;\mathrm{exprTail}\;\vert\;\epsilon

is represented with arrow parsers this way:

 data Expr = Plus Expr Expr | Minus Expr Expr | ...
 expr :: ParseArrow () Expr
 expr = proc () -> do
         t <- term -< ()
         exprTail -< t
 exprTail :: ParseArrow Expr Expr
 exprTail = proc e -> do
         symbol PLUS -< ()
         t <- term -< ()
         exprTail -< Plus e t
    <+> do
         symbol MINUS -< ()
         t <- term -< ()
         exprTail -< Minus e t
    <+> returnA -< e

An arrow parser library: PArrows written by Einar Karttunen.

Another arrow parser implementation: LLParser.hs written by Antti-Juhani Kaijanaho (I read the reference to it in Shae Erisson's blog / journal).

The funny thing which took me a long time to understand about arrow parsers is their sort of differential approach, in contrast to the well-known parser approaches. (In some sense well-known parsers are differential too: they manage state transitions whose states are the remaining input streams. But arrow parsers seem to me differential in another sense: in the way they consume and produce values, their input and output.)

The idea of borrowing this image from mathematical analysis comes from another topic: Integrals and derivatives, an article on version control systems written by Martin Pool, uses a similar image.

Arrows and Computation written by Ross Paterson (pages 2, 6, 7) and ProdArrows -- Arrows for Fudgets written by Magnus Carlsson (page 9) mention that computation (e.g. state) is threaded through the operands of the &&& operation. Even the mere definition

 p &&& q = arr dup >>> first p >>> second q

(where dup x = (x, x)) shows that the order of the computation (of the side effects) is important when using &&&, and this can be exemplified very well with parser arrows. See an example found in PArrows written by Einar Karttunen:
 -- | Match zero or more occurrences of the given parser.
 many :: MD i o -> MD i [o]
 many = MStar
 -- | Match one or more occurrences of the given parser.
 many1 :: MD i o -> MD i [o]
 many1 x = (x &&& MStar x) >>> pure (\(b,bs) -> (b:bs))
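The sequencing performed by &&& (left operand first, then right) can be observed directly with a Kleisli arrow over a state monad; a self-contained sketch, with a minimal State monad defined inline (so only base is needed) and a made-up counter arrow next:

```haskell
import Control.Arrow

-- A minimal state monad, defined inline to avoid extra dependencies.
newtype State s a = State { runState :: s -> (a, s) }

instance Functor (State s) where
  fmap f (State g) = State $ \s -> let (a, s') = g s in (f a, s')

instance Applicative (State s) where
  pure a = State $ \s -> (a, s)
  State f <*> State g = State $ \s ->
    let (h, s')  = f s
        (a, s'') = g s'
    in (h a, s'')

instance Monad (State s) where
  State g >>= k = State $ \s -> let (a, s') = g s in runState (k a) s'

-- An arrow that ignores its input and yields the next counter value.
next :: Kleisli (State Int) () Int
next = Kleisli $ \() -> State $ \n -> (n, n + 1)

-- (&&&) sequences its operands: the left `next` runs before the right one,
-- so the left copy sees the smaller counter.
demo :: ((Int, Int), Int)
demo = runState (runKleisli (next &&& next) ()) 0
-- demo == ((0, 1), 2)
```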
The definition of the between parser combinator shows another example of the importance of the order in which the computation (e.g. the side effects) takes place when using &&&:
 between :: MD i t -> MD t close -> MD t o -> MD i o
 between open close real = open >>> (real &&& close) >>^ fst

A more complicated example (from the same module):

 -- | Match one or more occurrences of the given parser separated by the separator.
 sepBy1 :: MD i o -> MD i o' -> MD i [o]
 sepBy1 p s = (many (p &&& s >>^ fst) &&& p) >>^ (\(bs,b) -> bs++[b])
This makes it clear that the order of the effects of the operands of the &&& operation can be important. But let us also mention a counterexample: nondeterministic-function arrows, or more generally the various implementations of binary relation arrows, have no such sequencing of effect order. Let us see this fact on the mere mathematical concept of binary relations (not minding how it is implemented):
\left(\rho \mathbin{\&\&\&} \sigma\right) x \left\langle y_0, y_1\right\rangle \Leftrightarrow x \rho y_0 \land x \sigma y_1
\left(\rho \mathbin{\vert\vert\vert} \sigma\right) \left(i:x\right) y \Leftrightarrow \begin{cases}x\rho y & \text{if } i = 0\\x\sigma y & \text{if } i = 1\end{cases}
The picture illustrating &&& in the Programming:Haskell_arrows article of Wikibooks suggests exactly such a view: the order of side effects can be unimportant for some arrow instances, and the symmetry of the figure reflects this. In general, however, the figure should use a notation that shows side effects threaded through in sequence.

3.3 Stream processor

The Lazy K programming language is an interesting esoteric language (from the family of pure, lazy functional languages) whose I/O concept is based on streams.

Arrows are also useful for grasping the concept of stream processors.

3.3.1 Functional I/O, graphical user interfaces

3.3.2 Dataflow languages

Arrows and Computation written by Ross Paterson mentions how to mimic dataflow programming in (lazy) functional languages. See more on Lucid's own HaskellWiki page: Lucid.
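A dataflow operator such as Lucid's fby ("followed by") can be mimicked with a stream-function arrow along the lines of Paterson's papers; a hedged sketch, where SF, fby and sumWithPrev are assumed names, and first assumes length-preserving stream functions (zip truncates otherwise):

```haskell
import Prelude hiding (id, (.))
import Control.Category
import Control.Arrow

-- A synchronous stream function: a whole input stream to a whole
-- output stream, one output per input.
newtype SF a b = SF { runSF :: [a] -> [b] }

instance Category SF where
  id = SF (\xs -> xs)
  SF g . SF f = SF (\xs -> g (f xs))

instance Arrow SF where
  arr f = SF (map f)
  first (SF f) = SF $ \xps ->
    let (xs, ys) = unzip xps
    in zip (f xs) ys          -- assumes f preserves the stream length

-- Lucid-style fby: the output starts with the given value,
-- then echoes the input stream one step late.
fby :: a -> SF a a
fby x = SF (x :)

-- A dataflow-style definition: each output is the current input
-- plus the previous one (0 before the stream starts).
sumWithPrev :: SF Int Int
sumWithPrev = (id &&& fby 0) >>> arr (uncurry (+))
-- runSF sumWithPrev [1, 2, 3] == [1, 3, 5]
```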

4 Automaton

To see what the concept itself means, see the Wikipedia articles Finite state machine and also Automata theory.

How these concepts can be implemented using arrows is shown in the introductory articles on arrows mentioned above.
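A minimal self-contained sketch of an automaton arrow in the style of those articles (the names Auto, runAuto and toggle are illustrative): each step consumes one input and yields an output together with a successor automaton, so the state lives in the closure.

```haskell
import Prelude hiding (id, (.))
import Control.Category
import Control.Arrow

-- A Mealy-style automaton: consume one input, emit one output,
-- and step to a successor automaton.
newtype Auto a b = Auto { stepAuto :: a -> (b, Auto a b) }

instance Category Auto where
  id = Auto (\a -> (a, id))
  Auto g . Auto f = Auto $ \a ->
    let (b, f') = f a
        (c, g') = g b
    in (c, g' . f')

instance Arrow Auto where
  arr f = Auto (\a -> (f a, arr f))
  first (Auto f) = Auto $ \(a, c) ->
    let (b, f') = f a in ((b, c), first f')

-- Run an automaton over a list of inputs, collecting the outputs.
runAuto :: Auto a b -> [a] -> [b]
runAuto _ []       = []
runAuto a (x : xs) = let (y, a') = stepAuto a x in y : runAuto a' xs

-- A two-state machine: the output alternates between False and True.
toggle :: Auto () Bool
toggle = go False
  where go s = Auto (\() -> (s, go (not s)))
-- runAuto toggle [(), (), ()] == [False, True, False]
```

The arrows package provides this construction in full generality as Control.Arrow.Transformer.Automaton.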

5 External links

6 See also