Applications and libraries/Linguistics
1 Portals and other huge resources
Jan van Eijck's page contains a wealth of material on logic and language:
- computational linguistics
- logics (e.g. dynamic epistemic modelling)
The Haskell NLP project provides a mailing list for Haskellers doing NLP work, as well as a community wiki and darcs repository. Come join us!
Natural Language Processing for The Working Programmer is a book that provides an introduction to Haskell and NLP.
There are many Haskell resources, too.
2 Tools and libraries
- Cypher is one of the first programs available that generates metadata representations of natural-language input. It produces RDF graph and SeRQL query representations of sentences, clauses, phrases and questions. The Cypher framework provides a set of robust definition languages for extending and creating grammars and lexicons; the specifications are designed so that even a novice can quickly build transcoders for processing highly complex sentences and phrases of any natural language, and cover any vocabulary.
- GenI is a surface realiser for Tree Adjoining Grammars. Surface realisation can be seen as the last stage in a natural language generation pipeline. GenI in particular takes an FB-LTAG grammar and an input semantics (a conjunction of first order terms), and produces the set of sentences associated with the input semantics by the grammar. See also Eric Kow's recent publications on it.
- Grammatical Framework (GF) is a compiler and grammatical programming environment written entirely in Haskell, with an interactive interpreter and two GUI interfaces, one written in Fudgets and the other in Java. GF grammars are written in a subset of Haskell and compile into an internal GF format that may be used as embedded parsers in Haskell, as parsers in Java (with an embedded Java interpreter gfc2java.jar), and subsequently converted to applets (Gramlets). (GF-Haskell to Java translation is performed through an Open Agent Architecture--the original .NET, see GF OAA.) The GF grammatical formalism handles linguistic entities (morphemes, etc.) using type theory: an approach especially suited to machine translation of controlled natural languages. The Grammar Resource Library, a set of basic grammars for Danish, English, Finnish, French, German, Italian, Norwegian, Russian, Spanish and Swedish, is available as a separate download. GF has been used to translate a fragment of C code to the JVM (see GFCC (PDF document)).
- Functional Morphology is a toolkit for morphology development. It has been used for Swedish, Spanish, Urdu and more.
- Saxophone is a fun translator from German to the Saxon dialect. It is part of the ParallelWeb project, which aims at translating Web pages including all of their links.
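The core idea behind Functional Morphology, treating paradigms as ordinary Haskell functions from a dictionary form to a full inflection table, can be sketched as follows (the names and the English example are illustrative assumptions, not the toolkit's actual API):

```haskell
-- Illustrative sketch only: not the Functional Morphology toolkit's API.
data Number = Sg | Pl deriving (Show, Eq, Enum, Bounded)

-- A hypothetical paradigm for regular English nouns.
regularNoun :: String -> Number -> String
regularNoun stem Sg = stem
regularNoun stem Pl
  | not (null stem) && last stem `elem` "sxz" = stem ++ "es"
  | otherwise                                 = stem ++ "s"

-- Realise a paradigm as a full inflection table.
inflectionTable :: (Number -> String) -> [(Number, String)]
inflectionTable f = [ (n, f n) | n <- [minBound .. maxBound] ]
```

For example, `inflectionTable (regularNoun "fox")` yields `[(Sg,"fox"),(Pl,"foxes")]`.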
2.1 Data interfaces
- nlp-scores provides scoring functions commonly used for evaluation in NLP and IR
- brillig is an incomplete implementation of Brill's transformation-based rule tagger (see also the chapter in Natural Language Processing for the Working Programmer)
- morfette is a tool for supervised learning of inflectional morphology (tags and lemmas); see also delta-h
- HaLeX is a library of datatypes and functions implemented in Haskell that allows us to model, manipulate and animate regular languages as finite state machines (originally for educational purposes)
- bytestring-trie is an efficient and fast way of storing and looking up strings as keys, assigned to values of some arbitrary type
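The finite-state view that HaLeX takes of regular languages can be illustrated with a tiny DFA sketch (the names are assumptions for illustration, not HaLeX's actual API):

```haskell
-- Minimal DFA sketch (illustrative, not HaLeX's API): a start state,
-- an acceptance predicate and a transition function.
data Dfa st = Dfa { start  :: st
                  , accept :: st -> Bool
                  , step   :: st -> Char -> st }

-- Run the machine over the input and test the final state.
runDfa :: Dfa st -> String -> Bool
runDfa (Dfa s0 acc delta) = acc . foldl delta s0

-- Example: strings over {'a','b'} containing an even number of 'a's.
evenAs :: Dfa Bool
evenAs = Dfa { start  = True
             , accept = id
             , step   = \e c -> if c == 'a' then not e else e }
```

HaLeX goes much further (conversions between regular expressions, NFAs and DFAs, plus graphical animation), but the state-as-value, transition-as-function encoding is the same in spirit.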
2.5 Learning algorithms
- sequor is a sequence labeler based on Collins's sequence perceptron, useful for Part-of-Speech tagging
- progressive is a multilabel classification model which learns sequentially (online)
- lda is an online Gibbs sampler for Latent Dirichlet Allocation, useful for probabilistic soft word-class induction; see also colada
- vowpal-utils is a wrapper around vw (Utilities for interpreting models produced by Vowpal Wabbit)
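The mistake-driven update at the heart of a model like sequor's can be sketched in a few lines. This is a toy binary perceptron with assumed names, not sequor's code; Collins's sequence perceptron generalises the same update to whole label sequences:

```haskell
-- Toy binary perceptron (illustrative; a sequence perceptron applies
-- the same idea to entire tag sequences instead of single labels).
type Weights = [Double]

-- Classify by the sign of the dot product.
predict :: Weights -> [Double] -> Int
predict w x = if sum (zipWith (*) w x) > 0 then 1 else -1

-- On a mistake, nudge the weights towards the correct label.
update :: Weights -> ([Double], Int) -> Weights
update w (x, y)
  | predict w x == y = w
  | otherwise        = zipWith (\wi xi -> wi + fromIntegral y * xi) w x

-- Online training is a fold of updates over the labelled examples.
train :: Weights -> [([Double], Int)] -> Weights
train = foldl update
</imports>
```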
2.6 Machine translation
- hs-gizapp is a wrapper around GIZA++
- Grammatical Framework is used for machine translation, among many other things
3 Natural language processing and combinatory logic
Combinatory logic has contributed to the development of powerful theories in linguistics.
3.1 Applicative universal grammar
It now has its own HaskellWiki page.
3.2 Categorial grammar
A general summary of modern semantic theories developed over the past century is provided by Logical Aspects of Computational Linguistics: an introduction.
Gary Hardegree's portal-rich page provides a lot of materials on logic and linguistics, among them
- The Axiomatic Theory of Truth, covering concepts such as truth, quotation, paradoxes and the liar paradox
- Courses ranging from the introductory level to developed topics, e.g. Basic Categorial Grammar.
On natural languages relating to combinatory logic, see also
- Mark Steedman's Does Grammar Make Use of Bound Variables?
- Mark Hepple: The Grammar and Processing of Order and Dependency: a Categorial Approach
3.3 Type-Logical Grammar
Matteo Capelletti's home page contains a parser based on the Non-associative Lambek calculus. It supports hypothetical reasoning and Montague style semantics.
3.4 Tree Adjoining Grammar
- See GenI, mentioned above.
4 Game theoretic semantics
Game theoretic semantics presents an interesting concept of truth, approached differently from Tarski's. Its connections to computer science and programming languages are described in Wikipedia's Game semantics article. Merlijn Sevenster's Game theoretical semantics and -logic is a good introductory text as well.
Chiaki Ohkura's article The Semantics of Metaphor in the Game Theoretic Semantics with at Least Two Coordination Equilibria attempts to capture the concept of metaphor.
4.1 Relatedness to linear logic
The Wikipedia article also mentions the relatedness of game theoretic semantics to linear logic. Philip Wadler's page on linear logic describes the topic and its connections to many concepts relevant to Haskell. His A taste of linear logic can serve as an introductory article.
5 Parsing natural languages
5.1 Parsing Natural Language with X-SAIGA parser
The goal of the X-SAIGA project is to create algorithms and implementations that enable language processors (recognizers, parsers, interpreters, translators, etc.) to be constructed as modular and efficient embedded executable specifications of grammars. Syntax analysis is done with a set of parser combinators that overcome some long-standing limitations:
- simple implementations of parser combinators require exponential time and space when parsing an ambiguous context-free grammar;
- like any top-down recursive-descent parser, conventional parser combinators do not terminate on a left-recursive grammar (e.g. s ::= s *> s *> term 'x' | empty).
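To make the second limitation concrete, here is a minimal list-of-successes recognizer sketch (an illustration only, not the X-SAIGA combinators): transcribing the left-recursive grammar directly loops forever, while an equivalent right-recursive form terminates.

```haskell
-- A recognizer maps an input string to the list of possible remainders.
type Rec = String -> [String]

term :: Char -> Rec
term c (x:xs) | x == c = [xs]
term _ _               = []

(<+>) :: Rec -> Rec -> Rec   -- alternation
(p <+> q) s = p s ++ q s

(.>) :: Rec -> Rec -> Rec    -- sequencing
(p .> q) s = concatMap q (p s)

epsilon :: Rec
epsilon s = [s]

-- A direct transcription of  s ::= s s 'x' | empty  loops forever,
-- because s calls itself before consuming any input:
--   sLeft = (sLeft .> sLeft .> term 'x') <+> epsilon
-- An equivalent right-recursive grammar for the same language
-- (any number of 'x's) terminates:
sRight :: Rec
sRight = (term 'x' .> sRight) <+> epsilon
```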
As part of the X-SAIGA project's syntax analysis work, Frost and Hafiz (2006) described a recognition algorithm that accommodates ambiguous grammars with direct left-recursive rules. The algorithm curtails the otherwise ever-growing left-recursive parse by imposing depth restrictions. Frost, Hafiz and Callaghan (2007) extended it to a complete parsing algorithm that accommodates indirect as well as direct left-recursion in polynomial time, and that generates compact polynomial-size representations of the potentially exponential number of parse trees for highly ambiguous grammars. The extended algorithm accommodates indirect left-recursion by comparing its 'computed context' with the 'current context'. The same authors also described an implementation of a set of parser combinators in Haskell based on the same algorithm at PADL '08. The X-SAIGA site has more on the algorithms, implementation details and experimental results.
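The depth-restriction idea can be illustrated with a toy left-recursive recognizer (a sketch under stated assumptions, not the Frost/Hafiz algorithm itself): a left-recursive branch is cut off once its depth exceeds the remaining input length plus one, since no successful parse can need more left-recursive steps than there are tokens left to consume.

```haskell
-- Toy illustration of depth curtailment; not the published algorithm.
termD :: Char -> String -> [String]
termD c (x:xs) | x == c = [xs]
termD _ _               = []

-- s ::= s 'x' | empty, transcribed left-recursively but with a depth
-- bound that cuts the otherwise infinite left-recursive descent.
sD :: Int -> String -> [String]
sD d s
  | d > length s + 1 = []                                 -- curtail
  | otherwise        = concatMap (termD 'x') (sD (d + 1) s) ++ [s]

-- The input is recognized if some parse consumes it entirely.
recognise :: String -> Bool
recognise inp = "" `elem` sD 0 inp
```

The real algorithm combines this curtailment with memoization to achieve polynomial time; this sketch only shows why the depth bound restores termination.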
5.2 Monadic Compositional Parsing
Gordon J. Pace: Monadic Compositional Parsing with Context Using Maltese as a Case Study, see its context too.
- A Survey on the Use of Haskell in Natural-Language Processing (report by Richard A. Frost). It is also part of the Haskell Communities and Activities Report, Eleventh edition, November 30, 2006.
- A History of Haskell: Being Lazy With Class (2007) has a section (11.5 on page 40) with material contributed by Paul Callaghan on applications of Haskell to natural-language processing. Perhaps there are projects mentioned there that are not (yet) listed here on this page, so take a look.
- From Aarne Ranta's homepage
- Natural Language Technology, with (among others) online course slides. They give great insight; for example, see the slide example that discusses dependent types and the Curry-Howard isomorphism in a linguistic context.
- The Zen Computational Linguistics Toolkit has tools for efficiently processing linguistic data structures, like trees and automata. It's written in literate OCaml, though a Haskell port shouldn't be very hard to do.
- The natural language processing blog written by Hal Daume III.
7 Specific topics
Lojban, an artificial language (see the separate HaskellWiki page on it, with references). “Lojban was not designed primarily to be an international language, however, but rather as a linguistic tool for studying and understanding language. Its linguistic and computer applications make Lojban unique among international languages...” (NC:WhLoj, page 15 par 1)
7.2 Continuations in natural languages
- Barker, Chris: Continuations in Natural Language (pdf), 2004
- Nicholas, Nick and Cowan, John (ed.): What is Lojban? Logical Language Group, 2003. Available also online.
- Frost, Richard: Realization of natural language interfaces using lazy functional programming (pdf), 2006
- Frost, Richard; Hafiz, Rahmatullah and Callaghan, Paul: Parser Combinators for Ambiguous Left-Recursive Grammars. Proceedings of the 10th International Symposium on Practical Aspects of Declarative Languages (PADL), ACM-SIGPLAN. January 2008, San Francisco, USA.
- Frost, Richard; Hafiz, Rahmatullah and Callaghan, Paul: X-SAIGA website - eXecutable SpecificAtIons of GrAmmars.