Results 1–10 of 10
Generic Program Transformation
 Proc. 3rd International Summer School on Advanced Functional Programming, LNCS 1608
, 1998
Abstract

Cited by 30 (5 self)
Abstraction versus efficiency. For concreteness, let us first examine a number of examples of the type of optimisation that we wish to capture, and the kind of programs on which they operate. This will give us a specific aim when developing the machinery for automating the process, and a yardstick for evaluating our results. 2.1 Minimum depth of a tree. Consider the data type of leaf-labelled binary trees: data Btree a = Leaf a | Bin (Btree a) (Btree a). The minimum depth of such a tree is returned by the function mindepth :: Btree a -> Int: mindepth (Leaf a) = 0; mindepth (Bin s t) = min (mindepth s) (mindepth t) + 1. This program is clear, but rather inefficient. It traverses the whole tree, regardless of leaves that may occur at a small depth. A better program would keep track of the 'minimum depth so far', and never explore subtrees beyond that current best solution. One possible implementation of that idea is mindepth t = md t 0 ∞; md (Leaf a) d m = min d m; md (Bin s t) d m = if d ≥ m then m ...
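The code fragments quoted in this abstract lost several symbols in extraction; the following is a runnable reconstruction of both versions, under the assumption that the initial bound is ∞ (rendered here as maxBound) and that the garbled comparison is d ≥ m:

```haskell
-- Leaf-labelled binary trees, as in the abstract.
data Btree a = Leaf a | Bin (Btree a) (Btree a)

-- Clear but inefficient: traverses the whole tree.
mindepth :: Btree a -> Int
mindepth (Leaf _)  = 0
mindepth (Bin s t) = min (mindepth s) (mindepth t) + 1

-- Pruned version: d is the depth so far, m the best depth found so far.
-- Subtrees at or below the current best are never explored.
mindepth' :: Btree a -> Int
mindepth' t = md t 0 maxBound
  where
    md (Leaf _)  d m = min d m
    md (Bin s t) d m
      | d >= m    = m                              -- prune: cannot improve on m
      | otherwise = md t (d + 1) (md s (d + 1) m)  -- thread the best-so-far
```

For instance, on Bin (Leaf 'a') (Bin (Leaf 'b') (Leaf 'c')) the pruned version never descends into the right subtree's children, since the left leaf already establishes depth 1.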
Dynamic programming via static incrementalization
 In Proceedings of the 8th European Symposium on Programming
, 1999
Abstract

Cited by 26 (12 self)
Dynamic programming is an important algorithm design technique. It is used for solving problems whose solutions involve recursively solving subproblems that share subsubproblems. While a straightforward recursive program solves common subsubproblems repeatedly and often takes exponential time, a dynamic programming algorithm solves every subsubproblem just once, saves the result, reuses it when the subsubproblem is encountered again, and takes polynomial time. This paper describes a systematic method for transforming programs written as straightforward recursions into programs that use dynamic programming. The method extends the original program to cache all possibly computed values, incrementalizes the extended program with respect to an input increment to use and maintain all cached results, prunes out cached results that are not used in the incremental computation, and uses the resulting incremental program to form an optimized new program. Incrementalization statically exploits the semantics of both control structures and data structures and maintains as invariants equalities characterizing cached results. The principle underlying incrementalization is general for achieving drastic program speedups. Compared with previous methods that perform memoization or tabulation, the method based on incrementalization is more powerful and systematic. It has been implemented and applied to numerous problems and succeeded on all of them.
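The paper's incrementalization is a systematic, automated transformation; purely as a hand-written illustration of the kind of speedup it targets, compare a straightforward recursion with a tabulated version of the same function (binomial coefficients; the names and the Pascal's-triangle formulation here are ours, not the paper's):

```haskell
-- Naive structural recursion: shared subsubproblems such as choose (n-1) k
-- are reached along many paths and recomputed, giving exponential time.
-- Assumes 0 <= k <= n.
choose :: Int -> Int -> Integer
choose _ 0 = 1
choose n k
  | n == k    = 1
  | otherwise = choose (n - 1) (k - 1) + choose (n - 1) k

-- Tabulated (dynamic programming) version: each row of Pascal's triangle is
-- computed once from the previous row, so every subsubproblem is solved once.
chooseDP :: Int -> Int -> Integer
chooseDP n k = (iterate nextRow [1] !! n) !! k
  where
    nextRow row = zipWith (+) (0 : row) (row ++ [0])
```

The naive version takes time exponential in n; the tabulated one takes O(n^2) additions.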
Generic Downwards Accumulations
 Science of Computer Programming
, 2000
Abstract

Cited by 19 (3 self)
A downwards accumulation is a higher-order operation that distributes information downwards through a data structure, from the root towards the leaves. The concept was originally introduced in an ad hoc way for just a couple of kinds of tree. We generalize the concept to an arbitrary regular datatype; the resulting definition is coinductive. 1 Introduction The notion of scans or accumulations on lists is well known, and has proved very fruitful for expressing and calculating with programs involving lists [4]. Gibbons [7, 8] generalizes the notion of accumulation to various kinds of tree; that generalization too has proved fruitful, underlying the derivations of a number of tree algorithms, such as the parallel prefix algorithm for prefix sums [15, 8], Reingold and Tilford's algorithm for drawing trees tidily [21, 9], and algorithms for query evaluation in structured text [16, 23]. There are two varieties of accumulation on lists: leftwards and rightwards. Leftwards accumulation ...
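For one concrete datatype, the idea can be sketched as follows (the names Tree and daccum are ours; this is the ad hoc tree instance the paper generalizes, not its generic, coinductive definition):

```haskell
-- Internally labelled binary trees, one of the tree kinds treated ad hoc
-- before the generic definition.
data Tree a = Tip | Node a (Tree a) (Tree a)

-- Downwards accumulation: relabel every node with a left fold, via op, of
-- the labels on the path from the root down to and including that node.
daccum :: (b -> a -> b) -> b -> Tree a -> Tree b
daccum _  _ Tip          = Tip
daccum op e (Node a l r) = Node e' (daccum op e' l) (daccum op e' r)
  where e' = op e a   -- information flows root-to-leaf through e'
```

For instance, daccum (\d _ -> d + 1) 0 relabels every node with its depth, counting from 1 at the root.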
The Automated Transformation of Abstract Specifications of Numerical Algorithms into Efficient Array Processor Implementations
 Science of Computer Programming
, 1997
Abstract

Cited by 12 (5 self)
We present a set of program transformations which are applied automatically to convert abstract functional specifications of numerical algorithms into efficient implementations tailored to the AMT DAP array processor. The transformations are based upon a formal algebra of a functional array form, which provides a functional model of the array operations supported by the DAP programming language. The transformations are shown to be complete. We present specifications and derivations of two example algorithms: an algorithm for computing eigensystems and an algorithm for solving systems of linear equations. For the former, we compare the execution performance of the implementation derived by transformation with the performance of an independent, manually constructed implementation; the efficiency of the derived implementation matches that of the manually constructed implementation.
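The paper's functional array form is specific to the DAP programming language; purely as a loose illustration of the whole-array style such an algebra models, one can represent an array by its extent and its index-to-value function, and program with array-level combinators (all names here are ours, not the paper's):

```haskell
-- A toy functional array form: an extent plus an index-to-value function.
data Arr a = Arr Int (Int -> a)

-- Whole-array operations of the kind an array-processor algebra provides.
amap :: (a -> b) -> Arr a -> Arr b
amap f (Arr n g) = Arr n (f . g)

azip :: (a -> b -> c) -> Arr a -> Arr b -> Arr c
azip f (Arr n g) (Arr _ h) = Arr n (\i -> f (g i) (h i))

afold :: (a -> a -> a) -> a -> Arr a -> a
afold f e (Arr n g) = foldr f e (map g [0 .. n - 1])

-- Inner product expressed entirely in whole-array operations; each
-- combinator maps naturally onto a data-parallel array instruction.
dot :: Num a => Arr a -> Arr a -> a
dot xs ys = afold (+) 0 (azip (*) xs ys)
```

Transformations over such a form can then be stated as algebraic laws between combinator expressions, e.g. fusing amap f after amap g into a single amap (f . g).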
The Many Disguises of Accumulation
, 1991
Abstract

Cited by 7 (0 self)
Several descriptions of basically one transformation technique, viz. accumulation, are compared. Their basis, viz. the associativity and the existence of a neutral element inherent in a monoid, is identified. Keywords: transformational programming, factorial, fast reverse, accumulation, continuations, lambda abstraction, generalisation, tail recursion, implementation of lists. This research has been sponsored by the Netherlands Organisation for Scientific Research (NWO), under grant NF 63/62518 (the STOP project: Specification and Transformation Of Programs). 1 Introduction One of the first program transformations that appeared in the literature was the accumulation transformation. The transformation is now classic, although not everyone may know it under exactly this name. In this note, I try to relate several descriptions of this program transformation technique. In a purely algebraic view, it is the exploitation of the properties of a monoid. In the literature, it can be fou...
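The classic instance named in the keywords is fast reverse: list concatenation with [] forms a monoid, and introducing an accumulating parameter turns a quadratic definition into a linear, tail-recursive one. A minimal sketch (the primed and helper names are ours):

```haskell
-- Quadratic: each (++) re-traverses the already-reversed suffix.
slowReverse :: [a] -> [a]
slowReverse []       = []
slowReverse (x : xs) = slowReverse xs ++ [x]

-- Accumulation: generalize to carry the result built so far. The derivation
-- relies on (++) being associative with neutral element [] -- exactly the
-- monoid structure the abstract identifies as the basis of the technique.
fastReverse :: [a] -> [a]
fastReverse xs = rev xs []
  where
    rev []       acc = acc
    rev (y : ys) acc = rev ys (y : acc)   -- tail recursive, linear time
```

The same scheme applies to factorial, the abstract's other keyword example, with (*, 1) as the monoid.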
Swapping arguments and results of recursive functions
 In Mathematics of Program Construction, Proceedings, volume 4014 of LNCS
, 2006
Abstract

Cited by 1 (0 self)
Abstract. Many useful calculation rules, such as fusion and tupling, rely on well-structured functions, especially in terms of inputs and outputs. For instance, fusion requires that well-produced outputs should be connected to well-consumed inputs, so that unnecessary intermediate data structures can be eliminated. These calculation rules generally fail to work unless functions are well-structured. In this paper, we propose a new calculation rule called IO swapping. IO swapping exchanges call-time computations (occurring in the arguments) and return-time computations (occurring in the results) of a function, while guaranteeing that the original and resulting function compute the same value. IO swapping enables us to rearrange inputs and outputs so that the existing calculation rules can be applied. We present new systematic derivations of efficient programs for detecting palindromes, and a method of higher-order removal that can be applied to defunctionalize function arguments, as two concrete applications.
An Algebra for Deriving Efficient Implementations for an Array Processor Parallel Computer from Functional Specifications
, 1995
Abstract
We present a set of program transformations which are applied automatically to convert an abstract functional specification of numerical algorithms into efficient implementations tailored to the AMT DAP array processor. The transformations are based upon a formal algebra of a functional array form, which provides a functional model of the array operations supported by the DAP programming language. The transformations are shown to be complete. We present specifications and derivations of two example algorithms: an algorithm for computing eigensystems and an algorithm for solving systems of linear equations. For the former, we compare the execution performance of the implementation derived by transformation with the performance of an independent, manually constructed implementation; the efficiency of the derived implementation matches that of the manually constructed implementation.
Efficient Call-by-value Evaluation Strategy of Primitive Recursive Program Schemes
 Fuji International Workshop on Functional and Logic Programming, Susono Japan, Proceedings
, 1995
Abstract
We consider primitive recursive program schemes with parameters together with the call-by-value computation rule. The schemes are finite systems of functions which are defined by primitive (or: structural) recursion; simultaneous recursion and nesting of function calls is allowed. We present a transformation strategy which replaces primitive recursion by iteration. The transformation strategy, which is fully automatic, takes as input a primitive recursive program scheme M with parameters and it computes a program scheme M′ as output. We prove that, for every argument tuple, M′ is at least as (time) efficient as M. We also prove that there are infinitely many nontrivial primitive recursive program schemes M with parameters for which the transformation yields a program scheme M′ such that there are infinitely many argument tuples for which M′ is more efficient than M. Moreover, we provide an algorithm which decides for an arbitrary given primitive recursive program scheme M...
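As a small, hand-picked instance of the general scheme-level transformation described (not the paper's algorithm itself), primitive recursion over the naturals can be replaced by a tail-recursive iteration with an accumulating parameter:

```haskell
-- Primitive recursion: under call-by-value the stack grows with n, and the
-- multiplication happens on the way back from the recursive calls.
factRec :: Integer -> Integer
factRec 0 = 1
factRec n = n * factRec (n - 1)

-- Iterative (tail-recursive) version: the pending multiplications are
-- threaded through acc, so the loop runs in constant stack space.
factIter :: Integer -> Integer
factIter n = go n 1
  where
    go 0 acc = acc
    go k acc = go (k - 1) (k * acc)
```

The paper's contribution is to perform and justify this kind of replacement automatically for whole systems of primitively recursive functions, with a proven efficiency guarantee; the example above only shows the shape of the result.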
Formal Software Development using Generic Development Steps
, 1999
Abstract
This talk is concerned with a mechanized formal treatment of the transformational software development process in a unified framework. As a formal vehicle, the specification and verification system PVS [7] is utilized to integrate development steps and development methods from different existing transformational approaches (for example, PROSPECTRA [6], KIDS [9], CIP [1, 5, 8], Bird-Meertens [2]). Integration comprises the formalization (that is, the implementation in the PVS specification language), the verification, and the correct application of the generic development steps to specific problems. Transformations of different kind and complexity have been integrated into this framework, comprising the whole development process: problem-solving strategies encoding general algorithmic paradigms such as global-search and divide-and-conquer [10], transformations for the modification of functional specifications such as transformations from the Bird-Meertens formalism like fusion or Horner'...