Results 1 - 10 of 11
Dynamic programming via static incrementalization
 In Proceedings of the 8th European Symposium on Programming
, 1999
"... Dynamic programming is an important algorithm design technique. It is used for solving problems whose solutions involve recursively solving subproblems that share subsubproblems. While a straightforward recursive program solves common subsubproblems repeatedly and often takes exponential time, a dyn ..."
Abstract

Cited by 26 (12 self)
 Add to MetaCart
Dynamic programming is an important algorithm design technique. It is used for solving problems whose solutions involve recursively solving subproblems that share subsubproblems. While a straightforward recursive program solves common subsubproblems repeatedly and often takes exponential time, a dynamic programming algorithm solves every subsubproblem just once, saves the result, reuses it when the subsubproblem is encountered again, and takes polynomial time. This paper describes a systematic method for transforming programs written as straightforward recursions into programs that use dynamic programming. The method extends the original program to cache all possibly computed values, incrementalizes the extended program with respect to an input increment to use and maintain all cached results, prunes out cached results that are not used in the incremental computation, and uses the resulting incremental program to form an optimized new program. Incrementalization statically exploits semantics of both control structures and data structures and maintains as invariants equalities characterizing cached results. The principle underlying incrementalization is general for achieving drastic program speedups. Compared with previous methods that perform memoization or tabulation, the method based on incrementalization is more powerful and systematic. It has been implemented and applied to numerous problems and succeeded on all of them.
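The contrast this abstract draws, exponential recursion versus polynomial time once shared subsubproblem results are cached, can be sketched with a classic example (plain memoization for illustration only, not the paper's incrementalization method; the function names are hypothetical):

```python
from functools import lru_cache

def fib_naive(n):
    """Straightforward recursion: fib(n-1) and fib(n-2) share
    subsubproblems, which are recomputed, giving exponential time."""
    if n < 2:
        return n
    return fib_naive(n - 1) + fib_naive(n - 2)

@lru_cache(maxsize=None)
def fib_memo(n):
    """Each subsubproblem is solved once, its result cached and
    reused, so the same computation takes linear time."""
    if n < 2:
        return n
    return fib_memo(n - 1) + fib_memo(n - 2)

print(fib_memo(30))  # 832040; fib_naive(30) returns the same value far more slowly
```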
Aspects of applicative programming for parallel processing
 IEEE Transactions on Computers
, 1978
"... AbstractEarly results of a project on compiling stylized recursion into stackless iterative code are reviewed as they apply to a target environment with multiprocessing. Parallelism is possible in executing the compiled image of argument evaluation (collateral argument evaluation of Algol 68), of d ..."
Abstract

Cited by 14 (1 self)
 Add to MetaCart
Abstract: Early results of a project on compiling stylized recursion into stackless iterative code are reviewed as they apply to a target environment with multiprocessing. Parallelism is possible in executing the compiled image of argument evaluation (collateral argument evaluation of Algol 68), of data structure construction when suspensions are used, and of functional combinations. The last facility provides, generally, concise expression for all operations performed in Lisp by mapping functions and in APL by typed operators; there are other uses as well. Index Terms: Compiling, functional combinations, Lisp, multiprocessing, recursion, suspensions.
Derivation of Data Parallel Code from a Functional Program
, 1994
"... In this article, we demonstrate a translation methodology which transforms a high level algorithmic specification written in the Alpha language to an imperative data parallel language. Alpha is a functional language which was designed to facilitate the kinds of static analyses needed for doing re ..."
Abstract

Cited by 12 (6 self)
 Add to MetaCart
In this article, we demonstrate a translation methodology which transforms a high level algorithmic specification written in the Alpha language to an imperative data parallel language. Alpha is a functional language which was designed to facilitate the kinds of static analyses needed for doing regular array synthesis. We show that the same methods which are used for solving regular array synthesis problems can be applied to the compilation of Alpha as a functional language. We informally introduce the Alpha language with the aid of examples and explain how it is adapted to doing static analysis and transformation. We first show how an Alpha program can be naively implemented by viewing it as a set of monolithic arrays and their filling functions, implemented using applicative caching. We then show how to improve the efficiency of this naive implementation by orders of magnitude. We present a compilation method which makes incremental transformations on the abstract syntax ...
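The "monolithic arrays and their filling functions, implemented using applicative caching" view mentioned above can be sketched as follows (a toy recurrence in Python rather than Alpha; all names are illustrative):

```python
def make_cached_array(fill):
    """View an array as its filling function: each cell is computed
    on first access by `fill` and cached (applicative caching)."""
    cache = {}
    def at(i):
        if i not in cache:
            cache[i] = fill(i, at)  # fill may recursively read other cells
        return cache[i]
    return at

# Example recurrence: a[0] = 1, a[i] = a[i-1] + i
a = make_cached_array(lambda i, at: 1 if i == 0 else at(i - 1) + i)
print(a(5))  # 16
```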
The Naive Execution of Affine Recurrence Equations
 INTERNATIONAL CONFERENCE ON APPLICATION-SPECIFIC ARRAY PROCESSORS
, 1995
"... In recognition of the fundamental relation between regular arrays and systems of affine recurrence equations, the Alpha language was developed as the basis of a computer aided design methodology for regular array architectures. Alpha is used to initially specify algorithms at a very high algorith ..."
Abstract

Cited by 7 (4 self)
 Add to MetaCart
In recognition of the fundamental relation between regular arrays and systems of affine recurrence equations, the Alpha language was developed as the basis of a computer aided design methodology for regular array architectures. Alpha is used to initially specify algorithms at a very high algorithmic level. Regular array architectures can then be derived from the algorithmic specification using a transformational approach supported by the Alpha environment. This design methodology guarantees the final design to be correct by construction, assuming the initial algorithm was correct. In this paper, we address the problem of validating an initial specification. We demonstrate a translation methodology which compiles Alpha into the imperative sequential language C. The C code may then be compiled and executed to test the specification. We show how an Alpha program can be naively implemented by viewing it as a set of monolithic arrays and their filling functions, implemented usin...
Program Optimization Using Indexed and Recursive Data Structures
, 2002
"... This paper describes a systematic method for optimizing recursive functions using both indexed and recursive data structures. The method is based on two critical ideas: first, determining a minimal input increment operation so as to compute a function on repeatedly incremented input; second, determi ..."
Abstract

Cited by 6 (5 self)
 Add to MetaCart
This paper describes a systematic method for optimizing recursive functions using both indexed and recursive data structures. The method is based on two critical ideas: first, determining a minimal input increment operation so as to compute a function on repeatedly incremented input; second, determining appropriate additional values to maintain in appropriate data structures, based on what values are needed in computation on an incremented input and how these values can be established and accessed. Once these two are determined, the method extends the original program to return the additional values, derives an incremental version of the extended program, and forms an optimized program that repeatedly calls the incremental program. The method can derive all dynamic programming algorithms found in standard algorithm textbooks. There are many previous methods for deriving efficient algorithms, but none is as simple, general, and systematic as ours.
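The two ideas named above, a minimal input increment and maintained additional values, can be illustrated on a toy recurrence (this sketch is not the paper's derivation; the names are hypothetical): the increment is n to n+1, and the one additional value to maintain is the previous result, which turns the recursion into a loop.

```python
def fib_incremental(n):
    """Compute fib on repeatedly incremented input 1, 2, ..., n,
    maintaining one additional value (the previous result) so that
    each increment step costs O(1)."""
    if n == 0:
        return 0
    prev, cur = 0, 1                 # fib(0), fib(1)
    for _ in range(n - 1):           # apply the input increment n-1 times
        prev, cur = cur, prev + cur  # maintain the pair (fib(k-1), fib(k))
    return cur

print(fib_incremental(10))  # 55
```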
Subset Logic Programs And Their Implementation
, 1994
"... Machine (WAM) to implement the paradigm and that these changes blend well with the overall machinery of the WAM. A central feature in the implementation of subsetlogic programs is that of a monotonic memotable, i.e., a memotable whose entries can monotonically grow or shrink in an appropriate ..."
Abstract

Cited by 6 (0 self)
 Add to MetaCart
Machine (WAM) to implement the paradigm and that these changes blend well with the overall machinery of the WAM. A central feature in the implementation of subset-logic programs is that of a monotonic memotable, i.e., a memotable whose entries can monotonically grow or shrink in an appropriate partial order. We present in stages the paradigm of subset-logic programs, showing the effect of each feature on the implementation. The implementation has been completed, and we present performance figures to show the efficiency and costs of memoization. Our conclusion is that the monotonic memotables are a practical tool for implementing a set-oriented logic programming language. We also compare this system with other closely related systems, especially XSB and CORAL. Keywords: subset and relational program clauses, sets, set matching and unification, memo tables, monotonic aggregation, Warren Abstract Machine, runtime structures, performance analysis
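A monotonic memotable can be approximated outside the WAM by a table whose entries only change monotonically in a partial order; this toy Python sketch (all names hypothetical, covering only the growing case under the subset order) shows the idea:

```python
class MonotonicMemoTable:
    """Memo table whose entries grow monotonically under set union,
    mirroring the subset partial order described in the abstract."""
    def __init__(self):
        self._table = {}

    def add(self, key, values):
        """Merge new answers into the entry for key; entries never
        lose elements here, only gain them (monotonic growth)."""
        entry = self._table.setdefault(key, set())
        before = len(entry)
        entry |= set(values)
        return len(entry) > before  # True if the entry actually grew

    def lookup(self, key):
        return self._table.get(key, set())

t = MonotonicMemoTable()
t.add("reach(a)", {"b", "c"})
changed = t.add("reach(a)", {"c", "d"})  # entry grows to {b, c, d}
print(sorted(t.lookup("reach(a)")), changed)  # ['b', 'c', 'd'] True
```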
Implementation Of Subset Logic Programs
"... Subsetlogic programs are built up of three kinds of program clauses: subset, equational, and relational clauses. Using these clauses, we can program solutions to a broad range of problems of interest in logic programming and deductive databases. In an earlier paper [Jay92], we discussed the impleme ..."
Abstract

Cited by 4 (0 self)
 Add to MetaCart
Subset-logic programs are built up of three kinds of program clauses: subset, equational, and relational clauses. Using these clauses, we can program solutions to a broad range of problems of interest in logic programming and deductive databases. In an earlier paper [Jay92], we discussed the implementation of subset and equational program clauses. This paper substantially extends that work, and focuses on the more expressive paradigm of subset and relational clauses. This paradigm supports setof operations, transitive closures, monotonic aggregation, as well as incremental and lazy/eager enumeration of sets. Although the subset-logic paradigm differs substantially from that of Prolog, we show that few additional changes are needed to the WAM [War83] to implement the paradigm and that these changes blend well with the overall machinery of the WAM. A central feature in the implementation of subset-logic programs is that of a "monotonic memotable," i.e., a memotable whose entries can monot...
From ALPHA to Imperative Code: A Transformational Compiler for an Array Based Functional Language
, 1995
"... Practical parallel programming demands that the details of distributing data to processors and interprocessor communication be managed by the compiler. These tasks quickly become too difficult for a programmer to do by hand for all but the simplest parallel programs. Yet, many parallel languages sti ..."
Abstract
 Add to MetaCart
Practical parallel programming demands that the details of distributing data to processors and interprocessor communication be managed by the compiler. These tasks quickly become too difficult for a programmer to do by hand for all but the simplest parallel programs. Yet, many parallel languages still require the programmer to manage much of the parallelism. I discuss the synthesis of parallel imperative code from algorithms written in a functional language called Alpha. Alpha is based on systems of affine recurrence equations and was designed to specify algorithms for regular array architectures. Being a functional language, Alpha implicitly supports the expression of both concurrency and communication. Thus, the programmer is freed from having to explicitly manage the parallelism. Using the information derived from static analysis, Alpha can be transformed into a form suitable for generating imperative parallel code through a series of provably correct program transformations. Th...
Subset Assertions and Negation-As-Failure
, 1993
"... Subset assertions provide a declarative and natural means for expressing solutions to many problems involving sets. This paper is motivated by the use of subset assertions for formulating transitive closures and solving containment constraints in applications of graph traversal and program analysis. ..."
Abstract
 Add to MetaCart
Subset assertions provide a declarative and natural means for expressing solutions to many problems involving sets. This paper is motivated by the use of subset assertions for formulating transitive closures and solving containment constraints in applications of graph traversal and program analysis. In these applications, circular containment constraints may arise, for which we propose an operational strategy based upon memoization and re-execution of function calls. We provide formal declarative and operational semantics for this class of subset assertions. One of the main technical results of this paper is a succinct translation of subset assertions into normal program clauses [L87] such that the stratified semantics of the resulting normal programs coincides with the declarative semantics of subset assertions. This translation is interesting because the operational semantics of subset assertions appears to be very different from that of normal programs due to the setof-like capabil...
Using Static Analysis to Derive Imperative Code from ALPHA
, 1994
"... In this article, we demonstrate a translation methodology which transforms a high level algorithmic specification written in the Alpha language to an imperative data parallel language. Alpha is a functional language which was designed to facilitate the kinds of static analyses needed for doing reg ..."
Abstract
 Add to MetaCart
In this article, we demonstrate a translation methodology which transforms a high level algorithmic specification written in the Alpha language to an imperative data parallel language. Alpha is a functional language which was designed to facilitate the kinds of static analyses needed for doing regular array synthesis. We show that the same methods which are used for solving regular array synthesis problems can be applied to the compilation of Alpha as a functional language. We informally introduce the Alpha language with the aid of examples and explain how it is adapted to doing static analysis and transformation. We first show how an Alpha program can be naively implemented by viewing it as a set of monolithic arrays and their filling functions, implemented using applicative caching. We then show how static analysis can be used to improve the efficiency of this naive implementation by orders of magnitude. We present a compilation method which makes incremental transformations on t...