Arrow

The Arrow class is provided by the base package:

 import Control.Arrow

Overview

Arrows, or Freyd-categories, are a generalization of Monads.

"They can do everything monads can do, and more. They are roughly comparable to monads with a static component." However "Arrows do have some problems". (need a more useful comparison)

For an introduction, see #External links.

Library

Examples

Various concepts follow here, which can be seen as concrete examples covered by the arrow concept. Not all of them link to Haskell-related materials: some are included only to keep the material self-contained (e.g. the #Automaton section links only to the finite state concept itself).

Practice

Reasons why it may be worth solving a specific problem with arrows (instead of monads) can be read in a message from Daan Leijen. But Leijen's post is rather old (2000); arrows are now significantly easier to understand and use than they were back then. E.g. his example might be rewritten

test = proc _ -> do
           question <- ask -< "what is the question ?"
           answer   <- ask -< question
           returnA -< ("the answer to '" ++ question ++ "' is " ++ answer)

(or something vaguely like that).
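
A runnable version of this sketch (an illustration, not Leijen's original code) can model ask as a Kleisli arrow over IO that prints its input as a prompt and reads the user's reply:

 {-# LANGUAGE Arrows #-}
 import Control.Arrow

 -- hypothetical 'ask' arrow: print the prompt, read the answer
 ask :: Kleisli IO String String
 ask = Kleisli (\prompt -> putStrLn prompt >> getLine)

 test :: Kleisli IO () String
 test = proc _ -> do
            question <- ask -< "what is the question ?"
            answer   <- ask -< question
            returnA -< ("the answer to '" ++ question ++ "' is " ++ answer)

 main :: IO ()
 main = runKleisli test () >>= putStrLn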

Function

Arrow operations arr and >>> are rather straightforward. For implementing first and related concepts, see Prelude extensions#Tuples.
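
For plain functions the instance amounts to ordinary composition and tupling. The following illustrative definitions (the my* names are made up here; the real instance lives in Control.Arrow) mirror its behaviour:

 -- 'arr' is the identity on functions
 myArr :: (b -> c) -> (b -> c)
 myArr f = f

 -- (>>>) is left-to-right function composition
 myCompose :: (a -> b) -> (b -> c) -> (a -> c)
 myCompose f g = g . f

 -- 'first' applies the function to the first component and leaves
 -- the second component untouched
 myFirst :: (b -> c) -> (b, d) -> (c, d)
 myFirst f (x, y) = (f x, y)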

Parser

The reasons why the arrow concept can solve important design questions in a parser library are explained in Generalising Monads to Arrows written by John Hughes.

A good example of such arrow parsers can be seen in A New Notation for Arrows written by Ross Paterson: figures 2, 4 and 6 (pages 3, 5 and 6). The expression grammar used there (an expression is a term followed by a tail of PLUS term or MINUS term continuations, possibly empty) is represented with arrow parsers this way:

 data Expr = Plus Expr Expr | Minus Expr Expr | ...

 expr :: ParseArrow () Expr
 expr = proc () -> do
         t <- term -< ()
         exprTail -< t

 exprTail :: ParseArrow Expr Expr
 exprTail = proc e -> do
         symbol PLUS -< ()
         t <- term   -< ()
         exprTail -< Plus e t
    <+> do
         symbol MINUS -< ()
         t <- term    -< ()
         exprTail -< Minus e t
    <+> returnA -< e

An arrow parser library: PArrows written by Einar Karttunen.

Another arrow parser implementation: LLParser.hs written by Antti-Juhani Kaijanaho (I read the reference to it in Shae Erisson's blog / journal).

The thing that took me a long time to understand about arrow parsers is their sort of differential approach, in contrast to the well-known parser approaches. (In some sense the well-known parsers are differential too: they manage state transitions where the states are the remaining input streams. But arrow parsers seem differential to me in another sense: in the way they consume and produce values -- their input and output.)

The idea of borrowing this image from mathematical analysis comes from another topic: the version control systems article Integrals and derivatives written by Martin Pool uses a similar image.

Arrows and Computation written by Ross Paterson (pages 2, 6, 7) and ProdArrows -- Arrows for Fudgets written by Magnus Carlsson (page 9) mention that the computation (e.g. state) is threaded through the operands of the &&& operation. Indeed, even the definition of the &&& operation

 p &&& q = arr dup >>> first p >>> second q
   where dup x = (x, x)

shows that the order of the computation (of the side effects) matters when using &&&, and this can be exemplified very well with parser arrows. See an example found in PArrows written by Einar Karttunen (module Text.ParserCombinators.PArrow.Combinator):

 -- | Match zero or more occurrences of the given parser.
 many :: MD i o -> MD i [o]
 many = MStar

 -- | Match one or more occurrences of the given parser.
 many1 :: MD i o -> MD i [o]
 many1 x = (x &&& MStar x) >>> pure (\(b,bs) -> (b:bs))

The definition of the between parser combinator shows another example of why the order in which the computations (e.g. the side effects) take place matters when using the &&& operation:

 between :: MD i t -> MD t close -> MD t o -> MD i o
 between open close real = open >>> (real &&& close) >>^ fst

A more complicated example (from the same module):

 -- | Match one or more occurrences of the given parser separated by the separator.
 sepBy1 :: MD i o -> MD i o' -> MD i [o]
 sepBy1 p s = (many (p &&& s >>^ fst) &&& p) >>^ (\(bs,b) -> bs++[b])
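
The same sequencing can be made directly visible outside of parsers with Kleisli IO arrows (an illustrative sketch, not PArrows code): in the fanout p &&& q, the effects of p happen before those of q.

 import Control.Arrow

 -- a Kleisli IO arrow that announces when it runs
 noisy :: String -> Kleisli IO Int Int
 noisy name = Kleisli (\x -> putStrLn (name ++ " runs") >> return (x + 1))

 -- prints "p runs" before "q runs", then yields (1,1):
 -- the fanout is built as arr dup >>> first p >>> second q
 demo :: IO (Int, Int)
 demo = runKleisli (noisy "p" &&& noisy "q") 0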

This makes it clear that the order in which the effects of the operands of &&& take place can be important. But let us also mention a counterexample: for nondeterministic function arrows, or more generally for the various implementations of binary relation arrows, there is no such sequencing of effects. We can see this on the mere mathematical concept of binary relations (not minding how it is implemented): the fanout &&& and fanin ||| of two relations can be defined as

 r &&& s  =  { (x, (y, z)) | (x, y) ∈ r and (x, z) ∈ s }
 r ||| s  =  { (Left x, z) | (x, z) ∈ r }  ∪  { (Right y, z) | (y, z) ∈ s }

Both are plain set comprehensions: neither operand's "effects" come before the other's.

The picture illustrating *** in the Haskell/Understanding arrows article of Wikibooks suggests exactly such a view: the order of side effects can be unimportant for some arrow instances, and the symmetry of the figure reflects this. In general, however, the figure would need a notation for threading the side effects through in sequence.

Stream processor

The Lazy K programming language is an interesting esoteric language (from the family of pure, lazy functional languages) whose I/O concept is based on streams.

Arrows are also useful for grasping the concept of stream processors; see the introductory articles mentioned above and #External links for details.
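
As a rough, self-contained sketch of the idea (assumed names, not a particular library's API): a stream processor is a machine that either emits an output or waits for the next input. Lifting a pure function and serial composition already show the arrow-like shape; a full Arrow instance (in particular first) needs input buffering and is more involved.

 -- a stream processor either outputs a value or demands the next input
 data SP a b = Put b (SP a b) | Get (a -> SP a b)

 -- lift a pure function (this plays the role of 'arr')
 liftSP :: (a -> b) -> SP a b
 liftSP f = Get (\a -> Put (f a) (liftSP f))

 -- serial composition (this plays the role of '>>>'): outputs of the
 -- first processor become the inputs of the second one
 composeSP :: SP a b -> SP b c -> SP a c
 composeSP sp1         (Put c sp2) = Put c (composeSP sp1 sp2)
 composeSP (Put b sp1) (Get f)     = composeSP sp1 (f b)
 composeSP (Get g)     sp2         = Get (\a -> composeSP (g a) sp2)

 -- run a stream processor over a finite input stream
 runSP :: SP a b -> [a] -> [b]
 runSP (Put b sp) as     = b : runSP sp as
 runSP (Get f)    (a:as) = runSP (f a) as
 runSP (Get _)    []     = []

For example, runSP (composeSP (liftSP (+1)) (liftSP (*2))) [1,2,3] yields [4,6,8].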

Functional I/O, graphical user interfaces

Dataflow languages

Arrows and Computation written by Ross Paterson mentions how to mimic dataflow programming in (lazy) functional languages. See more on Lucid's own HaskellWiki page: Lucid.
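
One common way to mimic a dataflow program (a sketch with assumed names, in the spirit of stream-function arrows) is an arrow that maps whole input streams to output streams; a one-step delay then serves as the basic dataflow building block:

 import Control.Arrow
 import Control.Category
 import Prelude hiding (id, (.))

 -- a dataflow node maps a whole input stream to an output stream
 newtype SF a b = SF { runSF :: [a] -> [b] }

 instance Category SF where
   id = SF id
   SF g . SF f = SF (g . f)

 instance Arrow SF where
   arr f = SF (map f)
   first (SF f) = SF $ \xys ->
     let (xs, ys) = unzip xys
     in zip (f xs) ys

 -- a one-step delay: the basic dataflow building block
 delay :: a -> SF a a
 delay x = SF (x:)

 -- differences between consecutive stream elements, dataflow style
 diff :: SF Int Int
 diff = (returnA &&& delay 0) >>> arr (\(cur, prev) -> cur - prev)

For example, runSF diff [1,4,9,16] yields [1,3,5,7]: each output is the difference between the current input and the previous one (with 0 before the first).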

Automaton

To see what the concept itself means, see the Wikipedia articles Finite state machine and also Automata theory.

How these concepts can be implemented using arrows is shown in the introductory articles on arrows mentioned above.
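
A self-contained sketch of the idea (assumed names; Ross Paterson's arrows package provides a similar Automaton arrow transformer): an automaton arrow consumes one input per step and yields an output together with the machine to use for the next step.

 import Control.Arrow
 import Control.Category
 import Prelude hiding (id, (.))

 -- on each input the machine yields an output and its next state
 newtype Auto a b = Auto { stepAuto :: a -> (b, Auto a b) }

 instance Category Auto where
   id = Auto (\a -> (a, id))
   Auto g . Auto f = Auto $ \a ->
     let (b, f') = f a
         (c, g') = g b
     in (c, g' . f')

 instance Arrow Auto where
   arr f = Auto (\a -> (f a, arr f))
   first (Auto f) = Auto $ \(a, d) ->
     let (b, f') = f a
     in ((b, d), first f')

 -- run an automaton over a list of inputs, collecting its outputs
 runAuto :: Auto a b -> [a] -> [b]
 runAuto _        []     = []
 runAuto (Auto f) (a:as) = let (b, auto') = f a in b : runAuto auto' as

 -- example: a running total, i.e. an accumulator automaton
 total :: Auto Int Int
 total = go 0
   where go acc = Auto (\i -> let acc' = acc + i in (acc', go acc'))

For example, runAuto total [1,1,1] yields [1,2,3].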

Haskell XML Toolbox

HXT is an example of a real application that uses arrows.
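
As a taste of that arrow style, here is a small sketch (hedged from memory of the Text.XML.HXT.Core API; consult the HXT documentation for the authoritative details) that collects the href attributes of all <a> elements of a document:

 import Text.XML.HXT.Core

 -- parse the document and extract every href attribute of an <a> element
 linkTargets :: FilePath -> IO [String]
 linkTargets file =
   runX ( readDocument [] file
          >>> deep (isElem >>> hasName "a")
          >>> getAttrValue "href" )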

External links


See also