Concurrency with oracles

From HaskellWiki
Revision as of 22:49, 18 March 2021

Oracles, defined

An oracle is a value that can be viewed as having the ability to predict, for example:

  • which of two computations will finish first;
  • which input event will arrive first;
  • whether a computation will finish before an input event arrives.

In practice this apparent predictive power is the result of the careful use of outside information (possibly obtained through external effects). By seeming to contain the prediction, the oracle preserves the referential transparency of a language while still allowing the expression of computations whose outcomes depend on execution time and arrival time.
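The idea can be made concrete with a minimal sketch (all names here are hypothetical, not from any established library): an oracle is an ordinary value that already contains the outcome of a timing-dependent choice, so consulting it is a pure function.

```haskell
-- Hypothetical sketch: an oracle records which of two computations "wins".
-- Because the choice is part of the value, every use of it is pure.
data Choice = First | Second deriving (Show, Eq)

newtype Oracle = Oracle Choice

-- Selecting between two results consults the oracle rather than the clock;
-- the same oracle always yields the same outcome.
choose :: Oracle -> a -> a -> a
choose (Oracle First)  x _ = x
choose (Oracle Second) _ y = y
```

At run time the `Choice` inside each oracle would be supplied from outside (e.g. by the scheduler, reflecting which computation actually finished first); within the program it behaves like any other constant.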

Solutions tend to involve infinite trees of oracles, so you can pull one out whenever you need one and pass an infinite subtree to future computations. Of course, once an oracle has been used, it cannot be reused: referential transparency demands that the outcome of applying the oracle is fixed.
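Such a tree can be sketched as follows (the types and names are illustrative assumptions; a real oracle value would be filled in by the run-time system, here we use a plain type parameter):

```haskell
-- Hypothetical sketch: an infinite binary tree of oracle values.
data Tree a = Node a (Tree a) (Tree a)

-- Peel off the root oracle and hand the two infinite subtrees to the two
-- subcomputations; this way each oracle is consumed exactly once.
split :: Tree a -> (a, Tree a, Tree a)
split (Node o l r) = (o, l, r)

-- A deterministic stand-in tree, useful for pure testing: laziness lets
-- us tie the knot and obtain an infinite structure.
constTree :: a -> Tree a
constTree o = t where t = Node o t t
```

Passing disjoint subtrees to independent subcomputations is what guarantees that no oracle is ever consulted twice.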

Connections to concurrency

On page 32 of 60 in Tackling the Awkward Squad, Simon Peyton Jones introduces non-determinism as a consequence of adding concurrency to the operational semantics he provides for I/O. As shown by Peter Dybjer, Herbert Sander and Mieke Massink, the thoughtful use of oracles can help to recover referential transparency in models of concurrency, even though those models are non-deterministic.
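To illustrate how oracles tame this kind of non-determinism, consider merging two streams of events (a sketch with hypothetical names, not the formulation used by any of the cited authors): each oracle decides which stream delivers its next element first. At run time the oracles would be supplied by the scheduler; given any fixed sequence of oracles, the merge is an ordinary deterministic function.

```haskell
-- Hypothetical sketch: merge two event streams, with one Bool-valued
-- oracle per step deciding which stream delivers first.
merge :: [Bool] -> [a] -> [a] -> [a]
merge _ [] ys = ys
merge _ xs [] = xs
merge [] xs ys = xs ++ ys          -- out of oracles: fall back to a fixed order
merge (o:os) (x:xs) (y:ys)
  | o         = x : merge os xs (y:ys)
  | otherwise = y : merge os (x:xs) ys
```

For example, `merge [True, False, True] [1, 2] [10, 20]` evaluates to `[1, 10, 2, 20]`; the apparent race between the two streams has been reduced to an ordinary argument.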
