Results 1–10 of 12
Shortcut Deforestation in Calculational Form
 In Proc. Conference on Functional Programming Languages and Computer Architecture
, 1995
Abstract

Cited by 91 (3 self)
In functional programming, intermediate data structures are often used to "glue" together small programs. Deforestation is a program transformation to remove these intermediate data structures automatically. We present a simple algorithm for deforestation based on two fusion rules for hylomorphism, an expressive recursion pattern. A generic notation for hylomorphisms is introduced, where natural transformations are explicitly factored out, and it is used to represent programs. Our method successfully eliminates intermediate data structures of any algebraic type from a much larger class of compositional functional programs than previous techniques.

1 Introduction

In functional programming, programs are often constructed by "gluing" together small components, using intermediate data structures to convey information between them. Such data are constructed in one component and later consumed in another component, but never appear in the result of the whole program. The compositional styl...
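The effect the abstract describes can be illustrated with a minimal sketch (not the paper's hylomorphism-based algorithm; the function names are hypothetical):

```python
# Unfused pipeline: squares(xs) builds an intermediate list that total(xs)
# then traverses; the list never appears in the final result.
def squares(xs):
    return [x * x for x in xs]

def total(xs):
    acc = 0
    for x in xs:
        acc += x
    return acc

def sum_of_squares_unfused(xs):
    return total(squares(xs))  # intermediate list allocated here

# After deforestation: the two traversals are fused into one loop
# and the intermediate list disappears.
def sum_of_squares_fused(xs):
    acc = 0
    for x in xs:
        acc += x * x
    return acc
```

Both versions compute the same result; the fused one allocates no intermediate list.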
Formal Language, Grammar and Set-Constraint-Based Program Analysis by Abstract Interpretation
, 1995
Abstract

Cited by 71 (10 self)
Grammar-based program analysis à la Jones and Muchnick and set-constraint-based program analysis à la Aiken and Heintze are static analysis techniques that have traditionally been seen as quite different from abstract-interpretation-based analyses, in particular because of their apparent non-iterative nature. For example, on page 18 of N. Heintze's thesis, it is alleged that "The finitary nature of abstract interpretation implies that there is a fundamental limitation on the accuracy of this approach to program analysis. There are decidable kinds of analysis that cannot be computed using abstract interpretation (even with widening and narrowing). The set-based analysis considered in this thesis is one example". On the contrary, we show that grammar- and set-constraint-based program analyses are similar abstract interpretations with iterative fixpoint computation using either a widening or a finitary grammar/set-constraints transformer or even a finite domain for each particular program. The understanding of grammar-based and set-constraint-based program analysis as a particular instance of abstract interpretation of a semantics has several advantages. First, the approximation process is formalized and not only explained using examples. Second, a domain of abstract properties is exhibited which is of general scope. Third, these analyses can be easily combined with other abstract-interpretation-based analyses, in particular for the analysis of numerical values. Fourth, they can be generalized to very powerful attribute-dependent and context-dependent analyses. Finally, a few misunderstandings may be removed.
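The iterative fixpoint computation with widening that the abstract appeals to can be sketched on a simpler abstract domain, intervals, for a counting loop such as `i = 0; while ...: i += 1` (an illustrative sketch; the paper's domains are grammars and set constraints, not intervals):

```python
# Interval analysis of the loop head of "i = 0; while ...: i += 1".
INF = float('inf')

def join(a, b):
    # Least upper bound of two intervals.
    return (min(a[0], b[0]), max(a[1], b[1]))

def widen(old, new):
    # Widening: jump any unstable bound to infinity to force convergence
    # of the iteration in finitely many steps.
    lo = old[0] if new[0] >= old[0] else -INF
    hi = old[1] if new[1] <= old[1] else INF
    return (lo, hi)

def analyze():
    i = (0, 0)  # abstract value of i at the loop head
    while True:
        body = (i[0] + 1, i[1] + 1)  # abstract effect of i += 1
        new = join((0, 0), body)     # merge entry edge and back edge
        new = widen(i, new)
        if new == i:
            return i
        i = new
```

Here `analyze()` stabilizes at the invariant `i >= 0` after two iterations; a narrowing pass could then refine the upper bound.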
Partial Deduction and Driving are Equivalent
, 1994
Abstract

Cited by 46 (10 self)
Partial deduction and driving are two methods used for program specialization in logic and functional languages, respectively. We argue that both techniques achieve essentially the same transformational effect by unification-based information propagation. We show their equivalence by analyzing the definition and construction principles underlying partial deduction and driving, and by giving a translation from a functional language to a definite logic language preserving certain properties. We discuss residual program generation, termination issues, and other related techniques developed for program specialization in logic and functional languages.
Proving the Correctness of Recursion-Based Automatic Program Transformations
 Theoretical Computer Science
, 1996
Abstract

Cited by 31 (4 self)
This paper shows how the Improvement Theorem, a semantic condition ...
Towards Unifying Partial Evaluation, Deforestation, Supercompilation, and GPC
, 1994
Abstract

Cited by 22 (0 self)
We study four transformation methodologies which are automatic instances of Burstall and Darlington's fold/unfold framework: partial evaluation, deforestation, supercompilation, and generalized partial computation (GPC). One can classify these and other fold/unfold-based transformers by how much information they maintain during transformation. We introduce the positive supercompiler, a version of deforestation including more information propagation, to study such a classification in detail. Via the study of positive supercompilation we are able to show that partial evaluation and deforestation have simple information propagation, positive supercompilation has more information propagation, and supercompilation and GPC have even more information propagation. The amount of information propagation is significant: positive supercompilation, GPC, and supercompilation can specialize a general pattern matcher to a fixed pattern so as to obtain efficient output similar to that of the Knuth-Morris-Pratt algorithm. In the case of partial evaluation and deforestation, the general matcher must be rewritten to achieve this.
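The pattern-matcher claim can be illustrated by hand: below, a general quadratic-time matcher and a version specialized to the fixed pattern "aab", written the way a Knuth-Morris-Pratt-style residual program looks (a hand-written sketch of the expected output, not code produced by any of the four transformers):

```python
# General matcher: naive, restarts in the text after each mismatch (O(n*m)).
def naive_match(pattern, text):
    for i in range(len(text) - len(pattern) + 1):
        if text[i:i + len(pattern)] == pattern:
            return True
    return False

# Specialized to the fixed pattern "aab": the pattern no longer appears as
# data, and the text pointer never moves backwards, as in Knuth-Morris-Pratt.
def match_aab(text):
    state = 0  # how much of "aab" is currently matched
    for c in text:
        if state == 0:
            state = 1 if c == 'a' else 0
        elif state == 1:
            state = 2 if c == 'a' else 0
        else:  # state == 2: "aa" matched
            if c == 'b':
                return True
            state = 2 if c == 'a' else 0  # "aaa" still ends in "aa"
    return False
```

The failure information (on mismatch after "aa", stay in state 2 if the character is 'a') is exactly what positive information propagation lets a transformer derive automatically.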
Two for the Price of One: Composing Partial Evaluation and Compilation
, 1997
Abstract

Cited by 21 (3 self)
One of the flagship applications of partial evaluation is compilation and compiler generation. However, partial evaluation is usually expressed as a source-to-source transformation for high-level languages, whereas realistic compilers produce object code. We close this gap by composing a partial evaluator with a compiler by automatic means. Our work is a successful application of several metacomputation techniques to build the system, both in theory and in practice. The composition is an application of deforestation or fusion. The result is a runtime code generation system built from existing components. Its applications are numerous. For example, it allows the language designer to perform interpreter-based experiments with a source-to-source version of the partial evaluator before building a realistic compiler which generates object code automatically.
Turchin's Supercompiler Revisited: An operational theory of positive information propagation
, 1996
Abstract

Cited by 15 (0 self)
Turchin's supercompiler is a program transformer that includes both partial evaluation and deforestation. Although known in the West since 1979, the essence of its techniques, its more precise relations to other transformers, and the properties of the programs that it produces are only now becoming apparent in the Western functional programming community. This thesis gives a new formulation of the supercompiler in familiar terms; we study the essence of it, how it achieves its effects, and its relations to related transformers; and we develop results dealing with the problems of preserving semantics, assessing the efficiency of transformed programs, and ensuring termination.
Eliminating dead code on recursive data
 Science of Computer Programming
, 1999
Abstract

Cited by 14 (4 self)
This paper describes a general and powerful method for dead code analysis and elimination in the presence of recursive data constructions. We represent partially dead recursive data using liveness patterns based on general regular tree grammars extended with the notion of live and dead, and we formulate the analysis as computing liveness patterns at all program points based on program semantics. This analysis yields a most precise liveness pattern for the data at each program point, which is significantly more precise than results from previous methods. The analysis algorithm takes cubic time in terms of the size of the program in the worst case but is very efficient in practice, as shown by our prototype implementation. The analysis results are used to identify and eliminate dead code. The general framework for representing and analyzing properties of recursive data structures using general regular tree grammars applies to other analyses as well.
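The idea of partially dead recursive data can be sketched with a hypothetical example (not from the paper): a producer builds a list of pairs whose second components the consumer never reads, so that field is dead at every cons cell, matching a liveness pattern of roughly "live list of (live, dead) pairs":

```python
# Stands in for an expensive computation whose result turns out to be dead.
def expensive(x):
    return x ** 10

# Original producer: builds pairs (x, expensive(x)).
def build(xs):
    return [(x, expensive(x)) for x in xs]

# Consumer: only ever reads the first component of each pair.
def consume(pairs):
    return sum(p[0] for p in pairs)

# After dead-code elimination guided by the liveness pattern: the dead
# second component is never computed (a placeholder keeps the shape).
def build_live(xs):
    return [(x, None) for x in xs]
```

A whole-structure live/dead analysis would miss this, since the list itself and the first components are live; per-field liveness patterns over the recursive type are what make the elimination possible.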
Constraints to Stop Higher-Order Deforestation
 In 24th ACM Symposium on Principles of Programming Languages
, 1997
Abstract

Cited by 12 (1 self)
Wadler's deforestation algorithm eliminates intermediate data structures from functional programs. To be suitable for inclusion in a compiler, it must terminate on all programs. Several techniques to ensure termination of deforestation on all first-order programs are known, but a technique for higher-order programs was only recently introduced by Hamilton, and elaborated and implemented in the Glasgow Haskell compiler by Marlow. We introduce a new technique for ensuring termination of deforestation on all higher-order programs that allows useful transformation steps prohibited in Hamilton's and Marlow's techniques.

1 Introduction

Lazy, higher-order, functional programming languages lend themselves to a certain style of programming which uses intermediate data structures [28].

Example 1 Consider the following program.

letrec a = λx. λy. case x of
                     []      → y
                     (h : t) → h : a t y
in λu. λv. λw. a (a u v) w

The term λu. λv. λw. a (a u v) w appends the three lists u, v, and w. Appending u and v ...
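The append example in the excerpt can be transcribed into list-returning functions to show the redundancy deforestation removes (a rough sketch; eager Python lists stand in for the lazy lists of the source language):

```python
# Direct transcription of the recursive append "a" from the example.
def append(x, y):
    if not x:
        return y
    return [x[0]] + append(x[1:], y)

# Unfused: append(u, v) builds an intermediate list that the outer
# append then traverses a second time.
def append3_unfused(u, v, w):
    return append(append(u, v), w)

# The deforested result: u is traversed once, and no intermediate
# list is built for the inner append.
def append3_fused(u, v, w):
    if not u:
        return append(v, w)
    return [u[0]] + append3_fused(u[1:], v, w)
```

The fused definition is exactly the kind of residual program deforestation aims to produce; the termination question is whether the transformer's unfolding of such nested calls always stops.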
Integer Constraints to Stop Deforestation
, 1996
Abstract

Cited by 10 (2 self)
Deforestation is a transformation of functional programs to remove intermediate data structures. It is based on outermost unfolding of function calls, where folding occurs when unfolding takes place within the same nested function call. Since unrestricted unfolding may encounter arbitrarily many terms, a termination analysis has to determine those subterms where unfolding is possibly dangerous. We show that such an analysis can be obtained from a control flow analysis by an extension with integer constraints, essentially at no loss in efficiency.

1 Introduction

The key idea of flow analysis for functional languages is to define an abstract meaning in terms of program points, i.e., subexpressions of the program possibly evaluated during program execution [Pa95]. Such analyses have been invented for tasks like type recovery [Sh91], binding time analysis [Co93], or safety analysis [PS95]. Conceptually, these are closely related to A. Deutsch's store-based alias analysis [D...