Results 1–10 of 30
Lazy functional state threads
 In the ACM SIGPLAN Conference on Programming Language Design and Implementation
, 1994
Abstract

Cited by 114 (11 self)
Some algorithms make critical internal use of updatable state, even though their external specification is purely functional. Based on earlier work on monads, we present a way of securely encapsulating stateful computations that manipulate multiple, named, mutable objects, in the context of a non-strict, purely-functional language. The security of the encapsulation is assured by the type system, using parametricity. Intriguingly, this parametricity requires the provision of a (single) constant with a rank-2 polymorphic type.
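The design described in this abstract survives in GHC as the `ST` monad. A minimal sketch of the idea, using the standard `Control.Monad.ST` and `Data.STRef` API (the function name `sumST` is illustrative): the rank-2 type of `runST :: (forall s. ST s a) -> a` is the "single constant" that keeps the mutable reference from escaping, so the function is externally pure.

```haskell
import Control.Monad.ST (runST)
import Data.STRef (newSTRef, readSTRef, modifySTRef)

-- Internally imperative, externally pure: sum a list with a mutable
-- accumulator. runST's rank-2 type prevents the STRef from leaking out.
sumST :: Num a => [a] -> a
sumST xs = runST $ do
  acc <- newSTRef 0
  mapM_ (\x -> modifySTRef acc (+ x)) xs
  readSTRef acc
```

Because the state thread is sealed by the type system, `sumST` can be used anywhere an ordinary pure function can.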
How to Declare an Imperative
, 1995
Abstract

Cited by 103 (3 self)
How can we integrate interaction into a purely declarative language? This tutorial describes a solution to this problem based on a monad. The solution has been implemented in the functional language Haskell and the declarative language Escher. Comparisons are given to other approaches to interaction based on synchronous streams, continuations, linear logic, and side effects.
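The monadic solution the tutorial describes can be sketched as a datatype of interaction requests interpreted by a pure stream-processing function. This is a toy reconstruction, not the paper's definitions; the names `Interact` and `runInteract` are invented for illustration.

```haskell
-- A program is a value describing its I/O behaviour.
data Interact a
  = Done a
  | GetChar (Char -> Interact a)
  | PutChar Char (Interact a)

instance Functor Interact where
  fmap f (Done a)      = Done (f a)
  fmap f (GetChar k)   = GetChar (fmap f . k)
  fmap f (PutChar c m) = PutChar c (fmap f m)

instance Applicative Interact where
  pure = Done
  Done f      <*> m = fmap f m
  GetChar k   <*> m = GetChar ((<*> m) . k)
  PutChar c n <*> m = PutChar c (n <*> m)

instance Monad Interact where
  Done a      >>= f = f a
  GetChar k   >>= f = GetChar ((>>= f) . k)
  PutChar c m >>= f = PutChar c (m >>= f)

-- Pure interpreter: feed an input string, collect the output string.
runInteract :: Interact a -> String -> String
runInteract (Done _)      _      = ""
runInteract (GetChar k)   (c:cs) = runInteract (k c) cs
runInteract (GetChar _)   []     = ""
runInteract (PutChar c m) input  = c : runInteract m input

echo :: Interact ()
echo = GetChar (\c -> PutChar c (Done ()))
```

Sequencing with `>>=` hides the stream plumbing, which is the point of the monadic formulation.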
Merging Monads and Folds for Functional Programming
 In Advanced Functional Programming, LNCS 925
, 1995
Abstract

Cited by 50 (2 self)
These notes discuss the simultaneous use of generalised fold operators and monads to structure functional programs. Generalised fold operators structure programs after the decomposition of the value they consume. Monads structure programs after the computation of the value they produce. Our programs abstract both from the recursive processing of their input and from the side-effects in computing their output. We show how generalised monadic folds aid in calculating an efficient graph reduction engine from an inefficient specification. 1 Introduction Should I structure my program after the decomposition of the value it consumes or after the computation of the value it produces? Some [Bir89, Mee86, Mal90, Jeu90, MFP91] argue in favour of structuring programs after the decomposition of the value they consume. Such syntax-directed programs are written using a limited set of recursion functionals. These functionals, called catamorphisms or generalised fold operators, are naturally ...
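The combination the notes describe can be illustrated by a fold whose algebra runs in a monad, so the input's decomposition and the output's effects are structured at once. A small sketch (the type `Expr` and the names `foldExpr`, `evalM` are invented for illustration; `Maybe` stands in for the monad, making division partial):

```haskell
data Expr = Lit Integer | Add Expr Expr | Div Expr Expr

-- Generalised fold: one argument per constructor.
foldExpr :: (Integer -> r) -> (r -> r -> r) -> (r -> r -> r) -> Expr -> r
foldExpr lit add divi = go
  where
    go (Lit n)   = lit n
    go (Add a b) = add  (go a) (go b)
    go (Div a b) = divi (go a) (go b)

-- Monadic evaluation via the fold: effects are threaded by Maybe's bind.
evalM :: Expr -> Maybe Integer
evalM = foldExpr pure
                 (\a b -> (+) <$> a <*> b)
                 (\a b -> do x <- a
                             y <- b
                             if y == 0 then Nothing else Just (x `div` y))
```

The recursion scheme comes from the fold; the error propagation comes from the monad; `evalM` mentions neither explicitly.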
Recursive Monadic Bindings
, 2000
Abstract

Cited by 43 (4 self)
Monads have become a popular tool for dealing with computational effects in Haskell for two significant reasons: equational reasoning is retained even in the presence of effects; and program modularity is enhanced by hiding "plumbing" issues inside the monadic infrastructure. Unfortunately, not all the facilities provided by the underlying language are readily available for monadic computations. In particular, while recursive monadic computations can be defined directly using Haskell's built-in recursion capabilities, there is no natural way to express recursion over the values of monadic actions. Using examples, we illustrate why this is a problem, and we propose an extension to Haskell's do-notation to remedy the situation. It turns out that the structure of monadic value-recursion depends on the structure of the underlying monad. We propose an axiomatization of the recursion operation and provide a catalogue of definitions that satisfy our criteria.
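This line of work is what GHC's `RecursiveDo` (`mdo`) notation desugars to: the `mfix` operation of `Control.Monad.Fix`. A minimal sketch of value recursion, where the value produced by a `Maybe` action refers to itself while the action itself runs only once:

```haskell
import Control.Monad.Fix (mfix)

-- A circular list built inside the Maybe monad: the knot is tied on the
-- *value* (the list), not on the monadic action.
ones :: Maybe [Int]
ones = mfix (\xs -> Just (1 : xs))
```

Laziness is essential here: `xs` is used only under a constructor, so the fixed point is productive.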
A Functional Theory of Local Names
, 1994
Abstract

Cited by 31 (2 self)
λν is an extension of the λ-calculus with a binding construct for local names. The extension has properties analogous to classical λ-calculus and preserves all observational equivalences of λ. It is useful as a basis for modeling wide-spectrum languages that build on a functional core. 1 Introduction Recent years have given us a good deal of theoretical research on the interaction of imperative programming (exemplified by variable assignment) and functional programming (exemplified by higher-order functions) [3, 6, 19, 21, 24]. The common method of all these works is to propose a λ-calculus extended with imperative features and to carry out an exploration of the operational semantics of the new calculus. Based on our own experience in devising such an extended λ-calculus [13], the present work singles out the name, whose only observational property is its identity, as an essential component of any such extension. We present a simple extension of the pure λ-calculus with names; we show by ex...
Implicit and Explicit Parallel Programming in Haskell
, 1993
Abstract

Cited by 31 (1 self)
It has often been suggested that functional languages provide an excellent basis for programming parallel computer systems. This is largely a result of the lack of side effects, which makes it possible to evaluate the subexpressions of a given term without any risk of interference. On the other hand, the lack of side effects has also been seen as a weakness of functional languages, since it rules out many features of traditional imperative languages such as state, I/O and exceptions. These ideas can be simulated in a functional language but the resulting programs are sometimes unnatural and inefficient. On the bright side, recent work has shown how many of these features can be naturally incorporated into a functional language without compromising efficiency by expressing computations in terms of monads or continuations. Unfortunately, the "single-threading" implied by these techniques often destroys many opportunities for parallelism. In this paper, we describe a simple extension to the Haskell I/O monad that allows a form of explicit high-level concurrency. It is a simple matter to incorporate these features in a sequential implementation, and genuine parallelism can be obtained on a parallel machine. In addition, the inclusion of constructs for explicit concurrency enhances the use of Haskell as an executable specification language, since some programs are most naturally described as a composition of parallel processes. This research was supported by ARPA via a subcontract to Intermetrics, Inc.
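The style of explicit concurrency in the I/O monad described here is familiar today from Concurrent Haskell. A minimal sketch using the modern GHC API (`forkIO`, `MVar`), which descends from this line of work rather than being the paper's exact primitives; the name `childResult` is illustrative:

```haskell
import Control.Concurrent (forkIO)
import Control.Concurrent.MVar (newEmptyMVar, putMVar, takeMVar)

-- Fork a child thread in the I/O monad and communicate a result back
-- through an MVar; the parent blocks in takeMVar until the value arrives.
childResult :: IO Int
childResult = do
  box <- newEmptyMVar
  _ <- forkIO (putMVar box 42)
  takeMVar box
```

On a sequential implementation this is a cheap scheduling construct; on a parallel machine the forked thread can run genuinely in parallel.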
Structuring DepthFirst Search Algorithms in Haskell
, 1995
Abstract

Cited by 27 (0 self)
Depth-first search is the key to a wide variety of graph algorithms. In this paper we express depth-first search in a lazy functional language, obtaining a linear-time implementation. Unlike traditional imperative presentations, we use the structuring methods of functional languages to construct algorithms from individual reusable components. This style of algorithm construction turns out to be quite amenable to formal proof, which we exemplify through a calculational-style proof of a far-from-obvious strongly-connected components algorithm. Classifications: Computing Paradigms (functional programming); Environments, Implementations, and Experience (programming, graph algorithms). 1 Introduction The importance of depth-first search (DFS) for graph algorithms was established twenty years ago by Tarjan (1972) and Hopcroft and Tarjan (1973) in their seminal work. They demonstrated how depth-first search could be used to construct a variety of efficient graph algorithms. In practice, this...
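A small purely functional DFS over an adjacency-list graph gives the flavour of the approach, though it is a simplified sketch rather than the paper's linear-time, forest-building construction (which lives on in GHC's `Data.Graph`); the names `Graph`, `neighbours` and `dfs` are illustrative:

```haskell
import qualified Data.Set as Set

-- Vertices paired with their successor lists.
type Graph = [(Int, [Int])]

neighbours :: Graph -> Int -> [Int]
neighbours g v = maybe [] id (lookup v g)

-- Preorder depth-first traversal from a root, tracking visited vertices.
dfs :: Graph -> Int -> [Int]
dfs g root = go Set.empty [root]
  where
    go _ [] = []
    go seen (v:vs)
      | v `Set.member` seen = go seen vs
      | otherwise           = v : go (Set.insert v seen)
                                     (neighbours g v ++ vs)
```

Algorithms such as topological sort and strongly-connected components are then defined as separate, reusable passes over the DFS result, which is the compositional style the paper advocates.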
Correctness of Monadic State: An Imperative Call-by-Need Calculus
 In Proc. 25th ACM Symposium on Principles of Programming Languages
, 1998
Abstract

Cited by 20 (2 self)
The extension of Haskell with a built-in state monad combines mathematical elegance with operational efficiency:
- Semantically, at the source language level, constructs that act on the state are viewed as functions that pass an explicit store data structure around.
- Operationally, at the implementation level, constructs that act on the state are viewed as statements whose evaluation has the side-effect of updating the implicit global store in place.
There are several unproven conjectures that the two views are consistent. Recently, we have noted that the consistency of the two views is far from obvious: all it takes for the implementation to become unsound is one judiciously-placed beta-step in the optimization phase of the compiler. This discovery motivates the current paper in which we formalize and show the correctness of the implementation of monadic state. For the proof, we first design a typed call-by-need language that models the intermediate language of the compiler, to...
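The "semantic" view in the first bullet, state as an explicitly passed store, is exactly the textbook state monad. A minimal sketch (standard textbook names, not the paper's formalism):

```haskell
-- A stateful computation is a function from a store to a result and a
-- new store; bind threads the store through sequentially.
newtype State s a = State { runState :: s -> (a, s) }

instance Functor (State s) where
  fmap f (State g) = State (\s -> let (a, s') = g s in (f a, s'))

instance Applicative (State s) where
  pure a = State (\s -> (a, s))
  State f <*> State g = State (\s ->
    let (h, s1) = f s
        (a, s2) = g s1
    in (h a, s2))

instance Monad (State s) where
  State g >>= k = State (\s -> let (a, s') = g s in runState (k a) s')

get :: State s s
get = State (\s -> (s, s))

put :: s -> State s ()
put s = State (\_ -> ((), s))

-- Read the counter, increment it, return the old value.
tick :: State Int Int
tick = do n <- get; put (n + 1); return n
```

The "operational" view implements the same programs by updating one store in place; the paper's contribution is proving the two views agree under the compiler's call-by-need optimizations.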
Single Assignment C – Entwurf und Implementierung einer C-Variante mit spezieller Unterstützung shape-invarianter Array-Operationen (Single Assignment C: design and implementation of a C variant with special support for shape-invariant array operations)
, 1996
Value Recursion in Monadic Computations
 OGI School of Science and Engineering, OHSU
, 2002