Results 1 – 7 of 7
An experimental analysis of self-adjusting computation
In Proceedings of the ACM SIGPLAN Conference on Programming Language Design and Implementation (PLDI), 2006
"... Selfadjusting computation uses a combination of dynamic dependence graphs and memoization to efficiently update the output of a program as the input changes incrementally or dynamically over time. Related work showed various theoretical results, indicating that the approach can be effective for a r ..."
Abstract

Cited by 35 (19 self)
 Add to MetaCart
Self-adjusting computation uses a combination of dynamic dependence graphs and memoization to efficiently update the output of a program as the input changes incrementally or dynamically over time. Related work established various theoretical results indicating that the approach can be effective for a reasonably broad range of applications. In this article, we describe algorithms and implementation techniques to realize self-adjusting computation and present an experimental evaluation of the approach on a variety of applications, ranging from simple list primitives to more sophisticated computational-geometry algorithms. The experiments show that the approach is effective in practice, often offering orders-of-magnitude speedups over recomputing the output from scratch. We believe this is the first experimental evidence that incremental computation of any type is effective in practice for a reasonably broad set of applications.
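The core mechanism this abstract describes, tracking which computations depend on which data and re-running only the affected ones when an input changes, can be illustrated with a toy sketch. The `Mod` class and its `read`/`write` methods below are hypothetical names for illustration, not the paper's actual library:

```python
# Toy sketch of self-adjusting computation: a modifiable reference
# records which computations read it, and writing a new value re-runs
# only those dependents (change propagation). Mod, read, and write are
# illustrative names, not the authors' API.

class Mod:
    """A modifiable reference: a value plus the thunks that read it."""
    def __init__(self, value):
        self.value = value
        self.readers = []          # dependent computations to re-run

    def read(self, reader):
        self.readers.append(reader)   # record the dependence
        reader(self.value)            # run the computation once

    def write(self, value):
        if value != self.value:       # only propagate real changes
            self.value = value
            for reader in self.readers:
                reader(self.value)    # change propagation: re-run readers

# Example: keep `doubled` consistent with `src` as `src` changes.
src = Mod(3)
doubled = Mod(None)
src.read(lambda v: doubled.write(2 * v))
assert doubled.value == 6
src.write(5)                          # only the dependent thunk re-runs
assert doubled.value == 10
```

A real implementation also orders re-execution by a dynamic dependence graph and reuses unchanged subcomputations via memoization; this sketch shows only the dependence-tracking skeleton.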
Imperative self-adjusting computation
In POPL ’08: Proceedings of the 35th Annual ACM SIGPLAN-SIGACT Symposium on Principles of Programming Languages, 2008
"... Recent work on selfadjusting computation showed how to systematically write programs that respond efficiently to incremental changes in their inputs. The idea is to represent changeable data using modifiable references, i.e., a special data structure that keeps track of dependencies between read an ..."
Abstract

Cited by 27 (16 self)
 Add to MetaCart
Recent work on self-adjusting computation showed how to systematically write programs that respond efficiently to incremental changes in their inputs. The idea is to represent changeable data using modifiable references, i.e., a special data structure that keeps track of dependencies between read and write operations, and to let computations construct traces that later, after changes have occurred, can drive a change-propagation algorithm. The approach has been shown to be effective for a variety of algorithmic problems, including some for which ad hoc solutions had previously remained elusive. All previous work on self-adjusting computation, however, relied on a purely functional programming model. In this paper, we show that it is possible to remove this limitation and support modifiable references that can be written multiple times. We formalize this using a language, AIL, for which we define evaluation and change-propagation semantics. AIL closely resembles a traditional higher-order imperative programming language. For AIL we state and prove consistency, i.e., the property that although the semantics is inherently nondeterministic, different evaluation paths still give observationally equivalent results. In the imperative setting, where pointer graphs in the store can form cycles, our previous proof techniques do not apply; instead, we use a novel form of step-indexed logical relation that handles modifiable references. We show that AIL can be realized efficiently by describing implementation strategies whose overhead is provably constant time per primitive. When the number of reads and writes per modifiable is bounded by a constant, change propagation becomes as efficient as in the pure case; the general case incurs a slowdown that is logarithmic in the maximum number of such operations. We use DFS and related algorithms on graphs as our running examples and prove that they respond to insertions and deletions of edges efficiently.
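One way to support modifiables that are written multiple times, in the spirit of the implementation strategies the abstract mentions, is to keep each modifiable's writes ordered by timestamp and answer a read at time t with the latest write at or before t; searching the version list is what makes each operation logarithmic in the number of writes. This is an illustrative sketch under those assumptions, and the class and method names are not the paper's API:

```python
# Sketch of a multi-write modifiable: writes are kept sorted by
# timestamp, and a read at time t sees the latest write at or before t.
# Binary search (bisect) gives the logarithmic per-operation cost the
# abstract mentions for the general case. Names are illustrative.
import bisect

class ImperativeMod:
    def __init__(self):
        self.times = []    # sorted write timestamps
        self.values = []   # value written at each timestamp

    def write(self, t, value):
        i = bisect.bisect_left(self.times, t)
        self.times.insert(i, t)
        self.values.insert(i, value)

    def read(self, t):
        i = bisect.bisect_right(self.times, t) - 1
        return self.values[i]   # latest write at or before time t

m = ImperativeMod()
m.write(1, "x")
m.write(5, "y")
assert m.read(3) == "x"    # sees the write at time 1
assert m.read(7) == "y"    # sees the later write at time 5
m.write(2, "z")            # an earlier write slots in by timestamp
assert m.read(3) == "z"    # reads between times 2 and 5 now see "z"
```

When at most a constant number of writes per modifiable occur, the version list has constant length and reads are effectively constant time, matching the pure case described in the abstract.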
A consistent semantics of self-adjusting computation, 2006
"... Abstract. This paper presents a semantics of selfadjusting computation and proves that the semantics is correct and consistent. The semantics integrates change propagation with the classic idea of memoization to enable reuse of computations under mutation to memory. During evaluation, reuse of a co ..."
Abstract

Cited by 9 (8 self)
 Add to MetaCart
This paper presents a semantics of self-adjusting computation and proves that the semantics is correct and consistent. The semantics integrates change propagation with the classic idea of memoization to enable reuse of computations under mutation to memory. During evaluation, reuse of a computation via memoization triggers a change propagation that adjusts the reused computation to reflect the mutated memory. Since the semantics combines memoization and change propagation, it involves both nondeterminism and mutation. Our consistency theorem states that the nondeterminism is not harmful: any two evaluations of the same program starting at the same state yield the same result. Our correctness theorem states that mutation is not harmful: self-adjusting programs are consistent with purely functional programming. We formalized the semantics and its metatheory in the LF logical framework and machine-checked the proofs in Twelf.
A Proposal for Parallel Self-Adjusting Computation, 2002
"... We present an overview of our ongoing work on parallelizing selfadjustingcomputation techniques. In selfadjusting computation, programs can respond to changes to their data (e.g., inputs, outcomes of comparisons) automatically by running a changepropagation algorithm. This ability is important i ..."
Abstract

Cited by 7 (3 self)
 Add to MetaCart
We present an overview of our ongoing work on parallelizing self-adjusting-computation techniques. In self-adjusting computation, programs can respond to changes to their data (e.g., inputs, outcomes of comparisons) automatically by running a change-propagation algorithm. This ability is important in applications where inputs change slowly over time. All previously proposed self-adjusting-computation techniques assume a sequential execution model. We describe techniques for writing parallel self-adjusting programs and a change-propagation algorithm that can update computations in parallel. We describe a prototype implementation and present preliminary experimental results.
Robust kinetic convex hulls in 3D
In Proceedings of the 16th Annual European Symposium on Algorithms
"... Abstract. Kinetic data structures provide a framework for computing combinatorial properties of continuously moving objects. Although kinetic data structures for many problems have been proposed, some difficulties remain in devising and implementing them, especially robustly. One set of difficulties ..."
Abstract

Cited by 5 (3 self)
 Add to MetaCart
Kinetic data structures provide a framework for computing combinatorial properties of continuously moving objects. Although kinetic data structures for many problems have been proposed, some difficulties remain in devising and implementing them, especially robustly. One set of difficulties stems from the update mechanisms used for processing certificate failures: devising efficient update mechanisms can be difficult, especially for sophisticated problems such as those in 3D. Another set of difficulties arises from the framework's strong assumption that the update mechanism is invoked with a single event. This assumption requires ordering the events precisely, which is generally expensive, and it makes it difficult to deal with simultaneous events that arise due to degeneracies or to intrinsic properties of the kinetized algorithms. In this paper, we apply advances in self-adjusting computation to provide a robust motion-simulation technique that combines kinetic event-based scheduling with the classic idea of fixed-time sampling. The idea is to divide time into a lattice of fixed-size intervals and process events at the resolution of an interval. We apply the approach to the kinetic maintenance of convex hulls in 3D, a problem that has been open since the 1990s, and evaluate its effectiveness experimentally. Using the approach, we are able to run simulations consisting of tens of thousands of points robustly and efficiently.
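The fixed-time-lattice idea is simple to sketch: rather than ordering events exactly, assign each event to the fixed-size interval containing its time and process each interval's events together. The function name, event list, and interval size below are illustrative assumptions, not the paper's implementation:

```python
# Sketch of the fixed-time-lattice idea: instead of processing
# certificate-failure events one at a time in exact temporal order,
# divide time into fixed-size intervals and group all events that fall
# in the same interval, so they can be handled together.

def bucket_events(events, interval):
    """Group (time, label) events by the fixed-size interval containing them."""
    buckets = {}
    for t, label in events:
        slot = int(t // interval)     # index of the lattice interval
        buckets.setdefault(slot, []).append(label)
    return buckets

events = [(0.12, "a"), (0.95, "b"), (1.02, "c"), (1.07, "d")]
# With interval 1.0, events c and d share slot 1 and can be processed
# as simultaneous events without ordering them exactly.
assert bucket_events(events, 1.0) == {0: ["a", "b"], 1: ["c", "d"]}
```

Processing a whole interval at once is what lets the change-propagation machinery of self-adjusting computation absorb simultaneous or near-simultaneous events robustly, instead of requiring an exact event order.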
Self-Adjusting Computation with Delta ML
"... Abstract. In selfadjusting computation, programs respond automatically and efficiently to modifications to their data by tracking the dynamic data dependences of the computation and incrementally updating the output as needed. In this tutorial, we describe the selfadjustingcomputation model and pr ..."
Abstract

Cited by 2 (0 self)
 Add to MetaCart
In self-adjusting computation, programs respond automatically and efficiently to modifications to their data by tracking the dynamic data dependences of the computation and incrementally updating the output as needed. In this tutorial, we describe the self-adjusting-computation model and present the language ∆ML (Delta ML) for writing self-adjusting programs.
Programmable Self-Adjusting Computation, 2010
"... and by donations from Intel Corporation. The views and conclusions contained in this document are those of the author and should not be interpreted as representing the official policies, either expressed or implied, of any sponsoring institution, the U.S. government or any other entity. Keywords: se ..."
Abstract

Cited by 1 (0 self)
 Add to MetaCart
Keywords: self-adjusting computation, adaptivity, memoization, change propagation, continuation-passing style, typed compilation, cost semantics, trace distance, traceable
Self-adjusting computation is a paradigm for programming incremental computations that respond efficiently to input changes by updating the output in time proportional to the changes in the structure of the computation. This dissertation defends the thesis that high-level programming abstractions improve the experience of reading, writing, and reasoning about self-adjusting programs, as well as their efficiency. We show that high-level language constructs are suitable for writing readable self-adjusting programs and can be compiled into low-level primitives. In particular, language constructs such as ML-style modifiable references and memoizing functions provide orthogonal mechanisms for identifying stale computation