Results 11–20 of 33
Monadic augment and generalised short cut fusion
Journal of Functional Programming, 2005
Abstract (Cited by 15, 7 self)
Monads are commonplace programming devices that are used to uniformly structure computations with effects such as state, exceptions, and I/O. This paper further develops the monadic programming paradigm by investigating the extent to which monadic computations can be optimised by using generalisations of short cut fusion to eliminate monadic structures whose sole purpose is to “glue together” monadic program components. We make several contributions. First, we show that every inductive type has an associated build combinator and an associated short cut fusion rule. Second, we introduce the notion of an inductive monad to describe those monads that give rise to inductive types, and we give examples of such monads which are widely used in functional programming. Third, we generalise the standard augment combinators and cata/augment fusion rules for algebraic data types to types induced by inductive monads. This allows us to give the first cata/augment rules for some common data types, such as rose trees. Fourth, we demonstrate the practical applicability of our generalisations by providing Haskell implementations for all concepts and examples in the paper. Finally, we offer deep theoretical insights by showing that the augment combinators are monadic in nature, and thus that our cata/build and cata/augment rules are arguably the best generally applicable fusion rules obtainable.
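The foldr/build rule for lists is the simplest instance of the short cut fusion this abstract generalises. A minimal self-contained sketch (the definitions of `build`, `upto`, and `sumUpto` below are illustrative, not the paper's generic constructions):

```haskell
{-# LANGUAGE RankNTypes #-}

-- build abstracts a list producer over the list constructors
build :: (forall b. (a -> b -> b) -> b -> b) -> [a]
build g = g (:) []

-- a producer written with build: the list [lo .. hi]
upto :: Int -> Int -> [Int]
upto lo hi = build (\cons nil ->
  let go i | i > hi    = nil
           | otherwise = cons i (go (i + 1))
  in go lo)

-- the foldr/build rule, foldr k z (build g) = g k z, lets this
-- consumer/producer pair fuse without building an intermediate list
sumUpto :: Int -> Int -> Int
sumUpto lo hi = foldr (+) 0 (upto lo hi)
```

Under the rule, `sumUpto` rewrites to a direct loop in which the list `[lo .. hi]` never materializes.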
When is a function a fold or an unfold?
Coalgebraic Methods in Computer Science, number 44.1 in Electronic Notes in Theoretical Computer Science, 2001
Abstract (Cited by 9, 3 self)
We give a necessary and sufficient condition for when a set-theoretic function can be written using the recursion operator fold, and a dual condition for the recursion operator unfold. The conditions are simple, practically useful, and generic in the underlying datatype.
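For lists, the two recursion operators the abstract refers to look as follows; `len` and `countdown` are illustrative instances (a minimal sketch, not the paper's generic, datatype-parametric formulation):

```haskell
-- fold and unfold for lists, written out explicitly
fold :: (a -> b -> b) -> b -> [a] -> b
fold _ e []       = e
fold f e (x : xs) = f x (fold f e xs)

unfold :: (b -> Maybe (a, b)) -> b -> [a]
unfold g b = case g b of
  Nothing      -> []
  Just (x, b') -> x : unfold g b'

-- len consumes a list, so it is a fold ...
len :: [a] -> Int
len = fold (\_ n -> 1 + n) 0

-- ... while countdown produces one, so it is an unfold
countdown :: Int -> [Int]
countdown = unfold (\n -> if n == 0 then Nothing else Just (n, n - 1))
```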
Pointfree Program Transformation
Fundamenta Informaticae, 2005
Abstract (Cited by 9, 5 self)
The subject of this paper is functional program transformation in the so-called point-free style. By this we mean first translating programs to a form consisting only of categorically-inspired combinators, algebraic data types defined as fixed points of functors, and implicit recursion through the use of type-parameterized recursion patterns. This form is appropriate for reasoning about programs equationally, but difficult to actually use in practice for programming. In this paper we present a collection of libraries and tools developed at Minho with the aim of supporting the automatic conversion of programs to point-free form (embedded in Haskell), their manipulation and rule-driven simplification, and the (limited) automatic application of fusion for program transformation.
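A tiny illustration of the pointwise versus point-free styles the abstract contrasts (function names are illustrative):

```haskell
-- pointwise: a named argument and nested applications
sumOfSquaresOfEvens :: [Int] -> Int
sumOfSquaresOfEvens xs = sum (map (\x -> x * x) (filter even xs))

-- point-free: only combinators and composition, no named argument;
-- this is the form on which equational rewrite rules apply directly
sumOfSquaresOfEvens' :: [Int] -> Int
sumOfSquaresOfEvens' = sum . map (^ 2) . filter even
```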
Better Consumers for Program Specializations
1996
Abstract (Cited by 8, 0 self)
It is well known that not all programs are susceptible to automatic program specialization. Traditionally, complicated analyses are performed before actual specialization, in order to uncover as many useful program properties as possible. This is particularly the case for automatic program transformers that specialize function calls with arguments containing only constructors and variables. We describe a novel approach for achieving better program specialization by preprocessing a program before subjecting it to actual specialization. The preprocessing phase involves simple syntactic analyses and program transformation, based on the well-understood fold/unfold strategy with generalization on terms. We ensure the termination of the transformation used in this phase, and outline a proof of its total correctness. Our approach greatly simplifies the task of program specialization in the later stage. Compared to other existing semantics-based approaches, our syntax-based method is considerably simpler, yet still widely applicable. Our approach is formulated for non-strict first-order programs. It can help obtain programs that are more susceptible to a variety of program specializers, including partial evaluation, deforestation, and the elimination of repeated pattern testing.
Construction of List Homomorphisms by Tupling and Fusion
1996
Abstract (Cited by 5, 4 self)
List homomorphisms are functions which can be efficiently computed in parallel, since they ideally suit the divide-and-conquer paradigm. However, some interesting functions, e.g., the solution to the maximum segment sum problem, are not list homomorphisms. In this paper, we propose a systematic way of embedding them into list homomorphisms so that parallel programs can be derived. We show, with an example, how a simple and "obviously" correct, but possibly inefficient, solution to the problem can be successfully turned into a semantically equivalent almost homomorphism by means of two transformations: tupling and fusion.
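The maximum segment sum case can be sketched in Haskell: tupling mss with the best prefix sum, best suffix sum, and total sum yields a genuine list homomorphism. The quadruple and `combine` below are an illustrative reconstruction (allowing the empty segment), not the paper's derivation:

```haskell
-- (best segment, best prefix, best suffix, total) for a list chunk
type Quad = (Int, Int, Int, Int)

single :: Int -> Quad
single x = (max x 0, max x 0, max x 0, x)

combine :: Quad -> Quad -> Quad
combine (m1, p1, s1, t1) (m2, p2, s2, t2) =
  ( maximum [m1, m2, s1 + p2]  -- best segment: left, right, or spanning the seam
  , max p1 (t1 + p2)           -- best prefix of the concatenation
  , max s2 (s1 + t2)           -- best suffix of the concatenation
  , t1 + t2 )                  -- total sum

-- sequentially this is a fold; associativity of combine is what
-- licenses the divide-and-conquer parallel schedule
mss :: [Int] -> Int
mss xs = m where (m, _, _, _) = foldr (combine . single) (0, 0, 0, 0) xs
```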
Deriving Parallel Codes via Invariants
Abstract (Cited by 5, 3 self)
Systematic parallelization of sequential programs remains a major challenge in parallel computing. Traditional approaches using program schemes tend to be narrow in scope, as the properties which enable parallelism are difficult to capture via ad-hoc schemes. In [CTH98], a systematic approach to parallelization based on the notion of preserving the context of recursive subterms was proposed. This approach can be used to derive a class of divide-and-conquer algorithms. In this paper, we enhance the methodology by using invariants to guide the parallelization process. The enhancement enables the parallelization of a class of recursive functions with conditional and tupled constructs, which was not possible previously. We further show how such invariants can be discovered and verified systematically, and demonstrate the power of our methodology by deriving a parallel code for maximum segment product. To the best of our knowledge, this is the first systematic parall...
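For maximum segment product, the invariant that makes a one-pass formulation work is to tuple the maximum and minimum products of segments ending at the current element, since a negative factor swaps their roles. A sequential sketch (illustrative only; the paper's contribution is the systematic, parallel derivation):

```haskell
-- carry (best so far, max product ending here, min product ending here);
-- non-empty segments only, so the list must be non-empty
msp :: [Int] -> Int
msp []       = error "msp: needs a non-empty list"
msp (x : xs) = go x x x xs
  where
    go best _  _  []       = best
    go best hi lo (y : ys) =
      let hi' = maximum [y, hi * y, lo * y]  -- extend or restart a segment
          lo' = minimum [y, hi * y, lo * y]
      in go (max best hi') hi' lo' ys
```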
Program Transformation in Calculational Form
1998
Abstract (Cited by 4, 4 self)
Correctness-preserving program transformation has recently received particular attention for compiler optimization in functional programming [Kelsey and Hudak 1989; Appel 1992; Peyton Jones 1996]. By implementing a compiler as many passes, each of which is a transformation for a particular optimization, one can attain a modular compiler. It is no surprise that modularity increases further if the transformations themselves are structured, i.e. constructed in a modular way. Indeed, program transformation in calculational form (or program calculation) can help us attain this goal.
An Extension Of The Acid Rain Theorem
In T. Ida, A. Ohori, and M. Takeichi, eds., Proceedings 2nd Fuji Int. Workshop on Functional and Logic Programming, Shonan Village, 1996
Abstract (Cited by 4, 2 self)
Program fusion (or deforestation) is a well-known transformation whereby compositions of several pieces of code are fused into a single one, resulting in an efficient functional program without intermediate data structures. Recent work has made it clear that fusion transformation is especially successful if recursions are expressed in terms of hylomorphisms. The point of this success is that fusion transformation proceeds merely based on a simple but effective rule called the Acid Rain Theorem [10]. However, there remains a problem. The Acid Rain Theorem can only handle hylomorphisms inducting over a single data structure. For hylomorphisms, like zip, which induct over multiple data structures, it leaves behind some of the data structures that should be removed. In this paper, we extend the Acid Rain Theorem so that it can deal with such hylomorphisms, enabling more intermediate data structures to be eliminated.
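A hylomorphism fuses an unfold with a fold so that the intermediate structure never exists. A minimal list-shaped sketch (`hylo` and `factorial` are illustrative; the paper works with the general, functor-parametric form):

```haskell
-- hylo f e g = fold f e . unfold g, but written fused: the list that
-- unfold would build is consumed as it is produced, never materialized
hylo :: (a -> b -> b) -> b -> (s -> Maybe (a, s)) -> s -> b
hylo f e g = go
  where
    go s = case g s of
      Nothing      -> e
      Just (x, s') -> f x (go s')

-- factorial as a hylomorphism: conceptually [n, n-1 .. 1] folded with (*)
factorial :: Integer -> Integer
factorial = hylo (*) 1 (\n -> if n == 0 then Nothing else Just (n, n - 1))
```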
Short cut fusion: Proved and improved
Pages 47–71 of: Semantics, Applications, and Implementation of Program Generation, 2001
Abstract (Cited by 3, 1 self)
Short cut fusion is a particular program transformation technique which uses a single, local transformation, called the foldr/build rule, to remove certain intermediate lists from modularly constructed functional programs. Arguments that short cut fusion is correct typically appeal either to intuition or to “free theorems”, even though the latter have not been known to hold for the languages supporting higher-order polymorphic functions and fixed-point recursion in which short cut fusion is usually applied. In this paper we use Pitts' recent demonstration that contextual equivalence in such languages is relationally parametric to prove that programs in them which have undergone short cut fusion are contextually equivalent to their unfused counterparts. The same techniques in fact yield a much more general result. For each algebraic data type we define a generalization augment of build which constructs substitution instances of its associated data structures. Together with the well-known generalization cata of foldr to arbitrary algebraic data types, this allows us to formulate and prove correct for each a contextual-equivalence-preserving cata/augment fusion rule. These rules optimize compositions of functions that uniformly consume algebraic data structures with functions that uniformly produce substitution instances of them.
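For lists, augment generalizes build by additionally abstracting over the tail that is spliced in at the end. A minimal sketch (`append` is illustrative; the paper defines augment for arbitrary algebraic data types):

```haskell
{-# LANGUAGE RankNTypes #-}

-- build abstracts a producer over the list constructors ...
build :: (forall b. (a -> b -> b) -> b -> b) -> [a]
build g = g (:) []

-- ... and augment additionally abstracts over the spliced-in tail,
-- which is what lets it express substitution instances of lists
augment :: (forall b. (a -> b -> b) -> b -> b) -> [a] -> [a]
augment g ys = g (:) ys

-- append written with augment
append :: [a] -> [a] -> [a]
append xs = augment (\cons nil -> foldr cons nil xs)

-- the cata/augment rule for lists then reads:
--   foldr k z (augment g ys) = g k (foldr k z ys)
```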
Using the Parametricity Theorem for Program Fusion
1996
Abstract (Cited by 3, 0 self)
Program fusion techniques have long been proposed as an effective means of improving program performance and of eliminating unnecessary intermediate data structures. This paper proposes a new approach to program fusion that is based entirely on the type signatures of programs. First, for each function, a recursive skeleton is extracted that captures its pattern of recursion. Then, the parametricity theorem of this skeleton is derived, which provides a rule for fusing this function with any other function. This method generalizes other approaches that use fixed parametricity theorems to fuse programs.