Results 11–20 of 53
Dynamic programming via static incrementalization
In Proceedings of the 8th European Symposium on Programming, 1999
Abstract

Cited by 26 (12 self)
Dynamic programming is an important algorithm design technique. It is used for solving problems whose solutions involve recursively solving subproblems that share sub-subproblems. While a straightforward recursive program solves common sub-subproblems repeatedly and often takes exponential time, a dynamic programming algorithm solves every sub-subproblem just once, saves the result, reuses it when the sub-subproblem is encountered again, and takes polynomial time. This paper describes a systematic method for transforming programs written as straightforward recursions into programs that use dynamic programming. The method extends the original program to cache all possibly computed values, incrementalizes the extended program with respect to an input increment to use and maintain all cached results, prunes out cached results that are not used in the incremental computation, and uses the resulting incremental program to form an optimized new program. Incrementalization statically exploits semantics of both control structures and data structures and maintains as invariants equalities characterizing cached results. The principle underlying incrementalization is general for achieving drastic program speedups. Compared with previous methods that perform memoization or tabulation, the method based on incrementalization is more powerful and systematic. It has been implemented and applied to numerous problems and succeeded on all of them.
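The contrast the abstract draws, between a straightforward recursion that re-solves shared sub-subproblems and a version that saves and reuses each result, can be sketched in Python. Binomial coefficients are a hypothetical illustrative example; this shows only the memoization baseline the paper compares against, not the paper's incrementalization method itself.

```python
from functools import lru_cache

# Straightforward recursion: shared sub-subproblems such as C(n-2, k-1)
# are recomputed many times, giving exponential running time.
def binom_naive(n, k):
    if k == 0 or k == n:
        return 1
    return binom_naive(n - 1, k - 1) + binom_naive(n - 1, k)

# Memoized version: each sub-subproblem is solved once, its result is
# cached, and reused on later encounters, giving O(n * k) time.
@lru_cache(maxsize=None)
def binom_memo(n, k):
    if k == 0 or k == n:
        return 1
    return binom_memo(n - 1, k - 1) + binom_memo(n - 1, k)
```

Both compute the same values; only the second scales to large inputs.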
A Comparative Revisitation of Some Program Transformation Techniques
Partial Evaluation, Int'l Seminar, Dagstuhl, 1996
Abstract

Cited by 23 (0 self)
We revisit the main techniques of program transformation which are used in partial evaluation, mixed computation, supercompilation, generalized partial computation, rule-based program derivation, program specialization, compiling control, and the like. We present a methodology which underlines these techniques as a `common pattern of reasoning' and explains the various correspondences which can be established among them. This methodology consists of three steps: i) symbolic computation, ii) search for regularities, and iii) program extraction. We also discuss some control issues which occur when performing these steps.
1 Introduction
During the past years researchers working in various areas of program transformation, such as partial evaluation, mixed computation, supercompilation, generalized partial computation, rule-based program derivation, program specialization, and compiling control, have been using very similar techniques for the development and derivation of programs. Unfor...
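The three-step pattern named in the abstract can be illustrated with a toy partial evaluator for exponentiation with a statically known exponent. All names here are hypothetical; step (ii), searching for regularities, is trivial for this example because the unfolding plainly terminates.

```python
# Step (i), symbolic computation: unfold the recursion on the known
# exponent n, leaving the dynamic argument x symbolic.
def specialize_power(n):
    """Return a residual expression string computing x**n by repeated multiplication."""
    if n == 0:
        return "1"
    expr = "x"
    for _ in range(n - 1):
        expr = f"({expr} * x)"
    return expr

# Step (iii), program extraction: turn the residual expression into a
# one-argument specialized program.
def extract_program(n):
    return eval(f"lambda x: {specialize_power(n)}")

power3 = extract_program(3)   # behaves like lambda x: ((x * x) * x)
```

The extracted `power3` contains no recursion and no test on `n`, which is the point of specialization.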
Equal Rights for Functional Objects or, The More Things Change, The More They Are the Same
1993
Abstract

Cited by 22 (7 self)
DATA TYPES
A. Comparing Type Objects
There has been as much confusion over type identity as there has been over object identity, although the type identity problem is usually referred to as the type equivalence problem [Aho86, s.6.3] [Wegbreit74] [Welsh77]. The type identity problem is to determine when two types are equal, so that type checking can be done in a programming language. Algol68 takes the point of view of "structural" equivalence, in which non-recursive types that are built up from primitive types using the same type constructors in the same order should compare equal, while Ada takes the point of view of "name" equivalence, in which types are equivalent if and only if they have the same name. We will ignore the software engineering issues of which kind of type equivalence makes for better-engineered programs, and focus on the basic issue of type equivalence itself. We note that if a type system offers the type TYPE, i.e., it offers first-class representations of typ...
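The two notions of type identity contrasted above can be modelled with a minimal sketch in Python. The representation and helper names are hypothetical, and it covers only the non-recursive case the excerpt describes.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Type:
    name: str        # declared name, the basis of name equivalence
    ctor: str        # primitive kind or type constructor ('int', 'pair', ...)
    args: tuple = () # component types for constructed types

def name_equal(t1, t2):
    # Ada-style "name" equivalence: equal iff declared with the same name.
    return t1.name == t2.name

def structural_equal(t1, t2):
    # Algol68-style "structural" equivalence: same constructors applied
    # to structurally equal components, in the same order.
    return (t1.ctor == t2.ctor
            and len(t1.args) == len(t2.args)
            and all(structural_equal(a, b) for a, b in zip(t1.args, t2.args)))

int_t = Type("int", "int")
meters = Type("meters", "pair", (int_t, int_t))
feet = Type("feet", "pair", (int_t, int_t))
```

Here `meters` and `feet` are structurally equal (same shape) but not name equal, which is exactly the case where the two disciplines disagree.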
Caching intermediate results for program improvement
In Proceedings of the 1995 ACM SIGPLAN Symposium on Partial Evaluation and Semantics-Based Program Manipulation, PEPM ’95, 1995
Abstract

Cited by 22 (6 self)
A systematic approach is given for symbolically caching intermediate results useful for deriving incremental programs from non-incremental programs. We exploit a number of program analysis and transformation techniques, centered around effective caching based on its utilization in deriving incremental programs, in order to increase the degree of incrementality not otherwise achievable by using only the return values of programs that are of direct interest. Our method can be applied straightforwardly to provide a systematic approach to program improvement via caching.
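The idea of caching intermediate results beyond the return value of direct interest can be sketched with a small hypothetical example: a running mean. The return value alone (the mean) is not enough to update the result under the input increment "append one element", but the cached intermediates are.

```python
# Non-incremental program: returns only the value of direct interest.
def mean(xs):
    return sum(xs) / len(xs)

# Extended program: also caches the intermediate results (sum, length)
# that make incremental maintenance possible.
def mean_cached(xs):
    s, n = sum(xs), len(xs)
    return s / n, (s, n)

# Incremental program: updates the cached intermediates in O(1) under
# the increment "append element x", instead of recomputing from scratch.
def mean_inc(x, cache):
    s, n = cache
    s, n = s + x, n + 1
    return s / n, (s, n)
```

This is only a scalar illustration of the caching principle, not the paper's derivation method.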
Calculating Accumulations
1999
Abstract

Cited by 16 (6 self)
In this paper, we shall formulate accumulations as higher-order catamorphisms, and propose several general transformation rules for calculating accumulations (i.e., finding and manipulating accumulations) by calculation-based (rather than search-based) program transformation methods. Some examples are given for illustration.
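A higher-order catamorphism is a fold whose result is itself a function of the accumulating parameter. A minimal sketch in Python (hypothetical helper names; the paper works in a calculational functional setting, not Python):

```python
from functools import reduce

# A catamorphism (right fold) on lists: cata(f, e, [x1, ..., xn])
# computes f(x1, f(x2, ... f(xn, e))).
def cata(f, e, xs):
    return reduce(lambda acc, x: f(x, acc), reversed(xs), e)

# Higher-order catamorphism: the fold builds a *function* of the
# accumulating parameter acc, modelling an accumulation.
def reverse_acc(xs):
    g = cata(lambda x, k: (lambda acc: k([x] + acc)),
             lambda acc: acc,   # base case: the identity on the accumulator
             xs)
    return g([])                # supply the initial accumulator
```

Passing the accumulator through the fold is what turns the quadratic naive reverse (repeated list append) into a linear one.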
Sharing of Computations
1993
Abstract

Cited by 15 (3 self)
This report is a revised version of my thesis of the same title, which was accepted for the Ph.D. degree in Computer Science at the University of Aarhus, Denmark, in June 1993.
Memory Reuse Analysis in the Polyhedral Model
Parallel Processing Letters, 1996
Abstract

Cited by 14 (1 self)
In the context of developing a compiler for Alpha, a functional data-parallel language based on systems of affine recurrence equations (SAREs), we address the problem of transforming scheduled single-assignment code to multiple-assignment code. We show how the polyhedral model allows us to statically compute the lifetimes of program variables, and thus enables us to derive necessary and sufficient conditions for reusing memory.
1. Introduction
The methodology of automatic systolic array synthesis from Systems of Affine Recurrence Equations (SAREs) has a close bearing on parallelizing compilers and on efficient implementation of functional languages. To study this relationship, we are currently developing a compiler for Alpha [9], a functional, data-parallel language based on SAREs defined over polyhedral index domains. The language semantics directly lead to sequential code based on demand-driven evaluation. However, the resulting context switches can be avoided if the program is tra...
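The memory-reuse idea, shrinking single-assignment storage once lifetimes are known, can be illustrated with a scalar recurrence. This is only a hand-worked analogue of what the polyhedral analysis derives automatically, not the paper's machinery.

```python
# Single-assignment form: one memory cell per equation instance,
# each written exactly once (n + 1 cells for the Fibonacci recurrence).
def fib_single_assignment(n):
    f = [0] * max(n + 1, 2)
    f[0], f[1] = 0, 1
    for i in range(2, n + 1):
        f[i] = f[i - 1] + f[i - 2]
    return f[n]

# Lifetime analysis shows f[i] is last read when computing f[i + 2],
# so at most two values are live at once and two cells suffice.
def fib_reused(n):
    if n == 0:
        return 0
    a, b = 0, 1            # two reusable cells replace the whole array
    for _ in range(2, n + 1):
        a, b = b, a + b
    return b
```

The transformation preserves the values while cutting memory from O(n) to O(1), which is the necessary-and-sufficient-conditions question the paper addresses in general.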
Derivation of Data Parallel Code from a Functional Program
1994
Abstract

Cited by 12 (6 self)
In this article, we demonstrate a translation methodology which transforms a high-level algorithmic specification written in the Alpha language to an imperative data-parallel language. Alpha is a functional language which was designed to facilitate the kinds of static analyses needed for doing regular array synthesis. We show that the same methods which are used for solving regular array synthesis problems can be applied to the compilation of Alpha as a functional language. We informally introduce the Alpha language with the aid of examples and explain how it is adapted to doing static analysis and transformation. We first show how an Alpha program can be naively implemented by viewing it as a set of monolithic arrays and their filling functions, implemented using applicative caching. We then show how to improve the efficiency of this naive implementation by orders of magnitude. We present a compilation method which makes incremental transformations on the abstract syntax ...
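The naive implementation strategy the abstract mentions, a variable viewed as a monolithic array with a filling function evaluated on demand under applicative caching, can be sketched as follows. The helper names and the example recurrence are hypothetical, not Alpha code.

```python
# Applicative caching of a "filling function": each array element is
# computed at most once, on first demand, and then read from the cache.
def make_cached_array(fill):
    cache = {}
    def lookup(i):
        if i not in cache:
            cache[i] = fill(i, lookup)   # fill may demand other elements
        return cache[i]
    return lookup

# A hypothetical recurrence-equation variable in this style.
def fib_fill(i, ref):
    return i if i < 2 else ref(i - 1) + ref(i - 2)

fib = make_cached_array(fib_fill)
```

Demand-driven evaluation with a cache is correct but carries lookup and context-switch overhead, which is exactly what the article's subsequent transformations remove.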
Structure and Design of Problem Reduction Generators
Client Resources on the Internet, IEEE Multimedia Systems ’99, 1991
Abstract

Cited by 12 (5 self)
In this paper we present an axiomatic theory for a class of algorithms, called problem reduction generators, that includes dynamic programming, general branch-and-bound, and game tree search as special cases. This problem reduction theory is used as the basis for a mechanizable design tactic that transforms formal specifications into problem reduction generators. The theory and tactic are illustrated by application to the problem of enumerating optimal binary search trees.
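The paper's illustrative problem, optimal binary search trees, has a classic dynamic-programming solution that a problem reduction generator would specialize to. A sketch of that standard O(n^3) recurrence (this is the well-known textbook algorithm, not the paper's design tactic):

```python
# cost[i][j] = minimal weighted search cost for keys i..j, where freq[k]
# is the access frequency of key k. Each root choice r reduces the
# problem to the two independent subproblems (i, r-1) and (r+1, j).
def optimal_bst_cost(freq):
    n = len(freq)
    cost = [[0] * n for _ in range(n)]
    for i in range(n):
        cost[i][i] = freq[i]
    for length in range(2, n + 1):
        for i in range(n - length + 1):
            j = i + length - 1
            total = sum(freq[i:j + 1])   # every key's depth grows by one
            best = min(
                (cost[i][r - 1] if r > i else 0) +
                (cost[r + 1][j] if r < j else 0)
                for r in range(i, j + 1))
            cost[i][j] = best + total
    return cost[0][n - 1]
```

The tabulated subproblem sharing here is the "dynamic programming" special case of the problem reduction structure the paper axiomatizes.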
Convergence of Program Transformers in the Metric Space of Trees
1998
Abstract

Cited by 11 (0 self)
In recent years increasing consensus has emerged that program transformers, e.g., partial evaluation and unfold/fold transformations, should terminate; a compiler should stop even if it performs fancy optimizations! A number of techniques to ensure termination of program transformers have been invented, but their correctness proofs are sometimes long and involved. We present a framework for proving termination of program transformers, cast in the metric space of trees. We first introduce the notion of an abstract program transformer; a number of well-known program transformers can be viewed as instances of this notion. We then formalize what it means that an abstract program transformer terminates and give a general sufficient condition for an abstract program transformer to terminate. We also consider some specific techniques for satisfying the condition. As applications we show that termination of some well-known program transformers either follows directly from the specific techn...
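A concrete, if very small, instance of the termination concern: a transformer that repeatedly rewrites a program tree must provably stop. This sketch uses a fixed-point test plus an explicit step bound as its termination guarantee; it illustrates the problem the framework addresses, not the metric-space machinery itself, and all names are hypothetical.

```python
# One rewriting step of a toy transformer on expression trees
# (nested tuples): simplify ('add', x, 0) -> x, applied bottom-up.
def simplify(tree):
    if not isinstance(tree, tuple):
        return tree
    op, *args = tree
    args = [simplify(a) for a in args]
    if op == "add" and args[1] == 0:
        return args[0]
    return (op, *args)

# The transformer iterates the step. Termination holds because each
# iteration either reaches a fixed point (no change) or consumes one
# unit of the explicit bound.
def transform(tree, max_steps=100):
    for _ in range(max_steps):
        new = simplify(tree)
        if new == tree:
            return tree
        tree = new
    return tree
```

A transformer without such a condition, e.g. one that unfolds calls indefinitely, is exactly the kind of non-terminating "fancy optimization" the abstract warns against.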