Results 1–10 of 28
Static caching for incremental computation
 ACM Trans. Program. Lang. Syst.
, 1998
Abstract

Cited by 46 (19 self)
A systematic approach is given for deriving incremental programs that exploit caching. The cache-and-prune method presented in the article consists of three stages: (I) the original program is extended to cache the results of all its intermediate subcomputations as well as the final result, (II) the extended program is incrementalized so that computation on a new input can use all intermediate results on an old input, and (III) unused results cached by the extended program and maintained by the incremental program are pruned away, leaving a pruned extended program that caches only useful intermediate results and a pruned incremental program that uses and maintains only the useful results. All three stages utilize static analyses and semantics-preserving transformations. Stages I and III are simple, clean, and fully automatable. The overall method has a kind of optimality with respect to the techniques used in Stage II. The method can be applied straightforwardly to provide a systematic approach to program improvement via caching.
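The three stages can be illustrated on a deliberately trivial example of my own (hypothetical, not from the article): summing a list, extended to cache all suffix sums and then incrementalized for a "prepend an element" input change.

```python
# Toy illustration of the cache-and-prune stages (hypothetical example,
# not the article's notation or benchmarks).

def sum_list(xs):
    """Original program: straightforward recursion."""
    return 0 if not xs else xs[0] + sum_list(xs[1:])

def sum_ext(xs):
    """Stage I: extend the program to cache every intermediate result.
    Returns all suffix sums; the head is the final result."""
    if not xs:
        return [0]
    rest = sum_ext(xs[1:])
    return [xs[0] + rest[0]] + rest

def sum_inc(x, cache):
    """Stage II: incremental version for the input change 'prepend x'.
    It reuses the old cached sums instead of recomputing them."""
    return [x + cache[0]] + cache

cache = sum_ext([2, 3, 4])    # [9, 7, 4, 0]
cache = sum_inc(1, cache)     # prepending 1 costs O(1); cache[0] == 10
```

Stage III (pruning) has nothing to remove in this toy case, since every cached suffix sum is read by `sum_inc`; in general the pruning analysis discards cached values the incremental program never uses.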
Selective Memoization
Abstract

Cited by 44 (19 self)
We present a framework for applying memoization selectively. The framework provides programmer control over equality, space usage, and identification of precise dependences so that memoization can be applied according to the needs of an application. Two key properties of the framework are that it is efficient and yields programs whose performance can be analyzed using standard techniques. We describe the framework in the context of a functional language and an implementation as an SML library. The language is based on a modal type system and allows the programmer to express programs that reveal their true data dependences when executed. The SML implementation cannot support this modal type system statically, but instead employs runtime checks to ensure correct usage of primitives.
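A loose Python analogue of this programmer control (hypothetical code; the actual framework is a modal type system with an SML library, not a decorator) lets the caller supply the key under which results are cached, fixing both the equality used and the arguments the cached result may depend on:

```python
from functools import wraps

def memoize(key=lambda *args: args):
    """Memoization with a caller-supplied key function: the programmer
    controls the notion of equality and which arguments the cached result
    may depend on. This decorator is only a loose analogue; the paper's
    framework enforces correct usage via a modal type system (with
    runtime checks in the SML library)."""
    def deco(f):
        cache = {}
        @wraps(f)
        def wrapped(*args):
            k = key(*args)
            if k not in cache:
                cache[k] = f(*args)
            return cache[k]
        return wrapped
    return deco

calls = []

@memoize(key=lambda xs: len(xs))     # declare: depends only on the length
def report_length(xs):
    calls.append(xs)                 # records actual (non-cached) evaluations
    return len(xs)

report_length((1, 2, 3))
report_length((4, 5, 6))             # same key (length 3): cache hit
```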
Dynamic programming via static incrementalization
 In Proceedings of the 8th European Symposium on Programming
, 1999
Abstract

Cited by 26 (12 self)
Dynamic programming is an important algorithm design technique. It is used for solving problems whose solutions involve recursively solving subproblems that share sub-subproblems. While a straightforward recursive program solves common sub-subproblems repeatedly and often takes exponential time, a dynamic programming algorithm solves every sub-subproblem just once, saves the result, reuses it when the sub-subproblem is encountered again, and takes polynomial time. This paper describes a systematic method for transforming programs written as straightforward recursions into programs that use dynamic programming. The method extends the original program to cache all possibly computed values, incrementalizes the extended program with respect to an input increment to use and maintain all cached results, prunes out cached results that are not used in the incremental computation, and uses the resulting incremental program to form an optimized new program. Incrementalization statically exploits semantics of both control structures and data structures and maintains as invariants equalities characterizing cached results. The principle underlying incrementalization is general for achieving drastic program speedups. Compared with previous methods that perform memoization or tabulation, the method based on incrementalization is more powerful and systematic. It has been implemented and applied to numerous problems and succeeded on all of them.
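The classic Fibonacci example shows the gap the transformation closes; the hand-written sketch below only illustrates the end result, whereas the paper derives such programs automatically via incrementalization:

```python
def fib_naive(n):
    """Straightforward recursion: exponential time, because common
    sub-subproblems are solved repeatedly."""
    return n if n < 2 else fib_naive(n - 1) + fib_naive(n - 2)

def fib_dp(n):
    """What the transformation aims for: every subproblem solved once,
    its result cached and reused, giving linear time. (Hand-written
    here; the paper derives such programs automatically.)"""
    table = {0: 0, 1: 1}
    for i in range(2, n + 1):
        table[i] = table[i - 1] + table[i - 2]
    return table[n]
```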
Memo Functions, Polytypically!
 Proceedings of the 2nd Workshop on Generic Programming, Ponte de
, 2000
Abstract

Cited by 14 (5 self)
This paper presents a polytypic implementation of memo functions that are based on digital search trees. A memo function can be seen as the composition of a tabulation function that creates a memo table and a lookup function that queries the table. We show that tabulation can be derived from lookup by inverse function construction. The type of memo tables is defined by induction on the structure of argument types and is parametric with respect to the result type of memo functions. A memo table for a fixed argument type is then a functor, and lookup and tabulation are natural isomorphisms. We provide simple polytypic proofs of these properties. 1 Introduction A memo function [11] is like an ordinary function except that it caches previously computed values. If it is applied a second time to a particular argument, it immediately returns the cached result, rather than recomputing it. For storing arguments and results a memo function internally employs an index structure, the ...
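In an untyped Python sketch (the paper's construction is typed and polytypic; this only conveys the shape), a memo table for sequence arguments is a digital search tree with one branch per element:

```python
class Trie:
    """A memo table for sequence arguments, structured as a digital
    search tree: one branch per element (untyped sketch; the paper's
    table type is defined by induction on the argument type)."""
    def __init__(self):
        self.value = None        # cached result for the key ending here
        self.children = {}       # one subtree per next element

    def lookup(self, key):
        node = self
        for x in key:
            node = node.children[x]        # KeyError if never cached
        if node.value is None:             # path exists only as a prefix
            raise KeyError(key)
        return node.value

    def insert(self, key, value):
        node = self
        for x in key:
            node = node.children.setdefault(x, Trie())
        node.value = value

def memo_trie(f):
    """Memoize f (assumed never to return None) in a Trie."""
    table = Trie()
    def wrapped(xs):
        try:
            return table.lookup(xs)
        except KeyError:
            result = f(xs)
            table.insert(xs, result)
            return result
    return wrapped

length = memo_trie(len)
```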
On Normal Forms and Equivalence for Logic Programs
 Proceedings of the Joint International Conference and Symposium on Logic Programming
, 1992
Abstract

Cited by 8 (7 self)
It is known that larger classes of formulae than Horn clauses may be used as logic programming languages. One such class of formulae is hereditary Harrop formulae, for which an operational notion of provability has been studied, and it is known that operational provability corresponds to provability in intuitionistic logic. In this paper we discuss the notion of a normal form for this class of formulae, and show how this may be given by removing disjunctions and existential quantifications from programs. Whilst the normal form of the program preserves operational provability, there are operationally equivalent programs which are not intuitionistically equivalent. As it is known that classical logic is too strong to precisely capture operational provability for larger classes of programs than Horn clauses, the appropriate logic in which to study questions of equivalence is an intermediate logic. We explore the nature of the required logic, and show that this may be obtained by the addit...
Efficient parameterizable type expansion for typed feature formalisms
 In Proceedings of the 14th International Joint Conference on Artificial Intelligence, IJCAI-95
, 1995
Abstract

Cited by 7 (6 self)
Over the last few years, constraint-based grammar formalisms have become the predominant paradigm in natural language processing and computational linguistics. From the viewpoint of computer science, typed feature structures can be seen as data structures that allow the representation of linguistic knowledge in a uniform fashion. Type expansion is an operation that makes the constraints of a typed feature structure explicit and determines its satisfiability. We describe an efficient expansion algorithm that takes care of recursive type definitions and permits the exploration of different expansion strategies through the use of control knowledge. This knowledge is specified on a separate layer, independent of grammatical information. The algorithm, as presented in the paper, has been fully implemented in Common Lisp and is an integrated part of the typed feature formalism TDL, which is employed in several large NL projects.
Binary Decision Graphs
 Static Analysis Symposium SAS’99, LNCS 1694
, 1999
Abstract

Cited by 6 (3 self)
Binary Decision Graphs are an extension of Binary Decision Diagrams that can represent some infinite boolean functions. Three refinements of BDGs corresponding to classes of infinite functions of increasing complexity are presented. The first one is closed under intersection and union, the second one under intersection, and the last one under all boolean operations. The first two classes give rise to canonical representations which, when restricted to finite functions, are the classical BDDs. The paper also gives new insights into the notion of variable names and the possibility of sharing variable names, which can be of interest in the case of finite functions.
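For the finite case, the classical-BDD side of this picture can be sketched with hash-consed nodes, which provide both the reduction and the sharing that make the representation canonical (illustrative Python, not the paper's construction; the cycles that BDGs add for infinite functions are omitted):

```python
# Classical (finite) BDD nodes with hash-consing; BDGs additionally allow
# cycles for certain infinite boolean functions, which this sketch omits.

_nodes = {}   # interning table: structurally equal subgraphs are shared

def node(var, lo, hi):
    """Build a reduced, shared decision node testing `var`."""
    if lo is hi:                       # reduction: the test is redundant
        return lo
    key = (var, id(lo), id(hi))        # children are already interned
    if key not in _nodes:
        _nodes[key] = (var, lo, hi)
    return _nodes[key]

def evaluate(bdd, env):
    """Follow the decision graph under a truth assignment `env`."""
    while isinstance(bdd, tuple):
        var, lo, hi = bdd
        bdd = hi if env[var] else lo
    return bdd

# f(x, y) = x AND y over the terminals False/True:
f = node('x', False, node('y', False, True))
g = node('x', False, node('y', False, True))
assert f is g                          # canonical: same function, same node
```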
Adaptive memoization
, 2003
Abstract

Cited by 3 (1 self)
We combine adaptivity and memoization to obtain an incremental computation technique that dramatically improves performance over adaptivity and memoization alone. The key contribution is adaptive memoization, which enables result reuse by matching any subset of the function arguments to a previous function call and updating the result to satisfy the unmatched arguments via adaptivity. We study the technique in the context of a purely functional language, called IFL, and as an ML library. The library provides an efficient implementation of our techniques with constant overhead. As examples, we consider quicksort and insertion sort. We show that quicksort handles insertions or deletions at random positions in the input list in O(log n) expected time. For insertion sort, we show that insertions and deletions anywhere in the list take O(n) time.
Counting 1324-avoiding permutations
 Electron. J. Combin. 9
Abstract

Cited by 3 (0 self)
We consider permutations that avoid the pattern 1324. By studying the generating tree for such permutations, we obtain a recurrence formula for their number. A computer program provides data for the number of 1324-avoiding permutations of length up to 20.
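The recurrence itself is specific to the paper, but the small counts are easy to reproduce by brute force (a straightforward check, exponential in n, so only feasible far below the paper's length 20):

```python
from itertools import combinations, permutations

def avoids_1324(p):
    """True iff p contains no subsequence ordered like 1324,
    i.e. no positions i < j < k < l with p[i] < p[k] < p[j] < p[l]."""
    for i, j, k, l in combinations(range(len(p)), 4):
        if p[i] < p[k] < p[j] < p[l]:
            return False
    return True

def count_avoiders(n):
    """Count the 1324-avoiding permutations of length n by enumeration."""
    return sum(avoids_1324(p) for p in permutations(range(n)))

# count_avoiders(n) for n = 1..6: 1, 2, 6, 23, 103, 513
```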
Structural Properties of Logic Programs
 Proceedings of the Fourteenth Australian Computer Science Conference
Abstract

Cited by 1 (1 self)
Miller has shown that disjunctions are not necessary in a large fragment of hereditary Harrop formulae, a class of formulae which properly includes Horn clauses. In this paper we extend this result to include existential quantifications, so that for each program D, there is a program D′ which is operationally equivalent but contains no disjunctions or existential quantifications. We may think of this process as deriving a normal form for the program. This process is carried out by pushing the connectives outwards from the body of a clause, and it leads to a normal form for goals as well. The properties of the search process used to find uniform proofs of goals (which generalises SLD-resolution), together with the normal form, allow successful goals to be converted into program clauses, and so we may add successful goals to the program. The stored form of the goal requires a larger class of formulae, i.e. full first-order hereditary Harrop formulae, and so this lea...