## Research Retrospective

Cited by 1 (0 self)

### BibTeX

```bibtex
@misc{Paige_researchretrospective,
  author = {Bob Paige},
  title  = {Research Retrospective},
  year   = {}
}
```

### Abstract

The group was exciting in the 1970s, when we were groping for direction and divided by different orientations. I guess it was in this atmosphere, combining purpose with uncertainty, that I found my own voice. The common goal was a transformational program development methodology that would improve the productivity of designing and maintaining correct software. The emphasis was on algorithmic software. We differed as to how to achieve this goal, and my approach was out on a limb. Based on a few transformations, the most exciting of which was Jay Earley’s iterator inversion combined with high-level strength reduction, and also on an overly optimistic faith in the power of science to shed light on this subject, I believed that algorithms and algorithmic software could be designed scientifically from abstract problem specifications by application of a small number of rules, whose selection could be simplified (even automated in some cases) if it could be guided by complexity. Almost all others (including the SETL crowd at Courant) disagreed, accepting the notion that algorithm design was ‘inspired’ and that the most significant steps in a derivation were unexplainable ‘Eureka’ steps. I knew that my goals were ambitious and that I had little supporting evidence. In fact the…

### Citations

326 | Linear-time algorithms for testing the satisfiability of propositional Horn formulae
- Dowling, Gallier
- 1984

Citation Context: …an SQ2+ specification of Horn Clause Propositional Satisfiability into a new linear time pointer machine algorithm. Previously, Dowling and Gallier found a linear time algorithm [8] that relied heavily on array access. In [2] we used dominated convergence and finite differencing to derive a low level SETL executable prototype from an SQ2+ specification of ready simulation. We th…
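The "counter per clause" idea behind linear-time Horn satisfiability can be sketched briefly. The representation below (clauses as `(body, head)` pairs, with `None` as the head of a goal clause) is my own illustration, not the encoding used by Dowling and Gallier; it shows why propagation is linear: each (atom, clause) pair is processed at most once.

```python
from collections import deque

def horn_sat(clauses):
    """Each clause is (body, head): the atoms in body jointly imply head;
    head None stands for falsum (a goal clause).  Returns
    (satisfiable, atoms forced true in the minimal model)."""
    bodies = [set(b) for b, _ in clauses]
    count = [len(b) for b in bodies]          # unresolved body atoms per clause
    watch = {}                                # atom -> clauses whose body mentions it
    for i, b in enumerate(bodies):
        for a in b:
            watch.setdefault(a, []).append(i)
    true, queue = set(), deque(i for i, c in enumerate(count) if c == 0)
    while queue:
        i = queue.popleft()
        head = clauses[i][1]
        if head is None:                      # falsum derived: unsatisfiable
            return False, true
        if head in true:
            continue
        true.add(head)
        for j in watch.get(head, ()):         # each (atom, clause) pair fires once
            count[j] -= 1
            if count[j] == 0:
                queue.append(j)
    return True, true
```

For example, `horn_sat([([], 'a'), (['a'], 'b'), (['a', 'b'], None)])` derives both atoms and then falsum, so it reports unsatisfiable.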

293 | Flow Analysis of Computer Programs
- Hecht
- 1977

Citation Context: …[13] and by Paige at a workshop in Toulouse in 1983. In compiler classes at Rutgers I also developed a crude form of dominated convergence in order to derive workset algorithms found in Hecht’s book [10] for solving global program analysis problems specified by fixed points. Work on this transformation progressed as it was seen to be progressively more useful in deriving an increasingly wider range o…
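The workset scheme for global analysis problems specified by fixed points can be sketched as follows. This is a generic reaching-definitions solver of my own construction (the `succ`/`gen`/`kill` encoding is illustrative), not code from Hecht's book: nodes are re-queued only when their input facts change, until the least fixed point is reached.

```python
def reaching_definitions(succ, gen, kill):
    """Workset solution of the dataflow equations
       in[n]  = union of out[p] over predecessors p of n
       out[n] = gen[n] | (in[n] - kill[n])
    succ: node -> successor list; gen/kill: node -> set of definitions."""
    pred = {n: [] for n in succ}
    for n in succ:
        for s in succ[n]:
            pred[s].append(n)
    inn = {n: frozenset() for n in succ}
    out = {n: frozenset(gen[n]) for n in succ}
    work = set(succ)                          # workset: nodes to (re)process
    while work:
        n = work.pop()
        inn[n] = frozenset().union(*(out[p] for p in pred[n]))
        new = frozenset(gen[n]) | (inn[n] - frozenset(kill[n]))
        if new != out[n]:                     # out changed: redo the successors
            out[n] = new
            work.update(succ[n])
    return inn, out
```

On a straight-line graph 1 → 2 → 3 where node 2 kills node 1's definition, only `d2` reaches node 3.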

117 | Finite differencing of computable expressions
- Paige, Koenig
- 1982

Citation Context: …of complex codes, and provide greater assurance in the reliable modification of such codes. Perhaps the first examples of nontrivial algorithms being derived by finite differencing were presented in [14, 16]. Included among these examples is a SETL specification of Dijkstra’s naive Bankers Algorithm and its transformation into Habermann’s efficient solution. This derivation was done without knowledge of…
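Finite differencing replaces repeated evaluation of an expensive expression with incremental maintenance of its value under small updates. A minimal sketch of the idea (a toy example of my own, not one from [14, 16]): maintain the invariant `n == #{x in s : pred(x)}` with O(1) work per update instead of rescanning `s` at every query.

```python
class DifferencedCount:
    """Maintain n = #{x in s : pred(x)} incrementally under inserts and
    deletes, rather than recomputing the count over s on demand."""
    def __init__(self, pred):
        self.pred = pred
        self.s = set()
        self.n = 0                   # invariant: n == sum(pred(x) for x in s)
    def insert(self, x):
        if x not in self.s:
            self.s.add(x)
            if self.pred(x):         # +delta update, O(1)
                self.n += 1
    def delete(self, x):
        if x in self.s:
            self.s.remove(x)
            if self.pred(x):         # -delta update, O(1)
                self.n -= 1
```

The same discipline scales from counts to maintained sets and maps, which is what makes derived workset and database-view code efficient.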

63 | Program derivation by fixed point computation
- Cai, Paige
- 1989

Citation Context: …in order to derive the more difficult algorithms. However, it was not until my collaboration with Cai that a comprehensive investigation and development of this transformation was first completed in [4]. It became apparent from the beginning that I was looking for a transformational program development methodology whose final step would be data structure selection, beyond which lay conventional compi…

47 | Transformational Design and Implementation of a New Efficient Solution to the Ready Simulation Problem
- Bloom, Paige

Citation Context: …al Satisfiability into a new linear time pointer machine algorithm. Previously, Dowling and Gallier found a linear time algorithm [8] that relied heavily on array access. In [2] we used dominated convergence and finite differencing to derive a low level SETL executable prototype from an SQ2+ specification of ready simulation. We then showed informally how the low level SETL…

45 | The Art of Computer Programming, Volume II: Seminumerical Algorithms, Third Edition
- Knuth
- 1998

Citation Context: …into RAM code guaranteed to run in linear time in the size of the program dataflow relation. The use of notation has been regarded as a burden to algorithm designers ever since Knuth came out with MIX [12]. But can notation also help satisfy the needs of the algorithm community – precise algorithmic analysis and succinct exposition? I was thrilled that coauthor Bob Tarjan agreed to explain our new line…

39 | Symbolic evaluation and the global value graph
- Reif, Lewis
- 1977

Citation Context: …It was also shown how to simplify these specifications, and to transform them by dominated convergence into high level SETL prototypes [4]. In [3] we showed how the constant propagation algorithm of [19] could be expressed as set-theoretic equations in a subset of SQ2+ that could be mapped into RAM code guaranteed to run in linear time in the size of the program dataflow relation. The use of notation…

36 | Using multiset discrimination to solve language processing problems without hashing
- Cai, Paige
- 1995

Citation Context: …finite differencing, and real-time simulation [17] (a special TCS issue of best papers selected from ICALP84). In [18] a much improved explanation of the algorithmic tool called Multiset Discrimination in [6] was obtained by specifying the algorithm in low level SETL and using its type system to formally explain and analyze the low level implementation that would be obtained by real-time simulation. The e…
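Multiset discrimination groups equal keys without hashing or sorting: bucket on one symbol at a time through direct array indexing, resetting only the touched slots so total work stays linear in the input size. The sketch below (byte-sized alphabet assumed; the representation is my own illustration, not the paper's) conveys the idea.

```python
def discriminate(pairs, pos=0, table=None):
    """Group (string_key, value) pairs with equal keys, without hashing
    or key comparisons: bucket by the character at `pos` via direct
    array indexing, then recurse per bucket.  Returns value groups."""
    if table is None:
        table = [None] * 256                  # one slot per byte value
    finished = [v for k, v in pairs if len(k) == pos]
    groups = [finished] if finished else []   # keys fully consumed: one group
    touched = []                              # slots used at this level
    for k, v in pairs:
        if len(k) > pos:
            c = ord(k[pos])
            if table[c] is None:
                table[c] = []
                touched.append(c)
            table[c].append((k, v))
    buckets = []
    for c in touched:                         # reset touched slots before
        buckets.append(table[c])              # recursing: table is shared
        table[c] = None
    for bucket in buckets:
        groups.extend(discriminate(bucket, pos + 1, table))
    return groups
```

Resetting only the `touched` slots (rather than the whole 256-entry table) is what keeps the cost proportional to the data actually seen.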

34 | From regular expressions to DFAs using compressed NFAs
- Chang, Paige
- 1992

Citation Context: …types were shown to be useful in modeling complex data structures), were involved in the discovery of a new improved solution to the classical problem of DFA minimization [11]. Finite differencing was used extensively in [7] to derive a new improved solution to the classical problem of turning regular expressions into DFAs. Perhaps our most convincing paper-and-pencil result was in [9], where Goyal and I used low level…
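For context on the regular-expressions-to-DFAs problem: the standard route is a Thompson-style NFA followed by subset construction. The sketch below shows only the textbook subset construction (it is not the compressed-NFA method of [7]); the NFA encoding as a `(state, symbol) -> set` map is my own.

```python
def subset_construction(nfa, start, accept):
    """nfa: (state, symbol) -> set of successor states; symbol None marks
    epsilon moves.  Returns (dfa transition map, start state, accepting
    states), where DFA states are frozensets of NFA states."""
    def closure(states):                      # epsilon-closure via DFS
        stack, seen = list(states), set(states)
        while stack:
            s = stack.pop()
            for t in nfa.get((s, None), ()):
                if t not in seen:
                    seen.add(t)
                    stack.append(t)
        return frozenset(seen)

    alphabet = {a for (_, a) in nfa if a is not None}
    start_set = closure({start})
    dfa, seen, work = {}, {start_set}, [start_set]
    while work:                               # explore reachable subset-states
        S = work.pop()
        for a in alphabet:
            T = closure({t for s in S for t in nfa.get((s, a), ())})
            if T:
                dfa[(S, a)] = T
                if T not in seen:
                    seen.add(T)
                    work.append(T)
    return dfa, start_set, {S for S in seen if accept in S}

def dfa_accepts(dfa, start_set, accepting, word):
    """Run the constructed DFA on a word (missing edge = dead state)."""
    S = start_set
    for a in word:
        S = dfa.get((S, a))
        if S is None:
            return False
    return S in accepting
```

Worst-case subset construction is exponential in NFA size, which is exactly the blow-up the compressed-NFA approach of [7] attacks.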

32 | A linear time solution to the single function coarsest partition problem
- Paige, Tarjan, et al.
- 1985

Citation Context: …explain our new linear time solution to the Single Function Coarsest Partition Problem as being derived from an SQ2+ specification by dominated convergence, finite differencing, and real-time simulation [17] (a special TCS issue of best papers selected from ICALP84). In [18] a much improved explanation of the algorithmic tool called Multiset Discrimination in [6] was obtained by specifying the algorithm…

30 | A transformational framework for the automatic control of derived data
- Koenig, Paige

Citation Context: …integrity control was an important open problem in relational databases. Successful use of finite differencing to solve view maintenance and integrity control was reported in [13] and by Paige at a workshop in Toulouse in 1983. In compiler classes at Rutgers I also developed a crude form of dominated convergence in order to derive workset algorithms found in Hecht’s book [10]…

27 | Mechanical translation of set theoretic problem specifications into efficient RAM code - a case study
- Paige, Henglein
- 1987

Citation Context: …success with algorithm discovery may be attributed to our reliance on complexity in both specification and transformation. Our first instance of algorithm discovery by transformation was reported in [15], where we used all three transformations to turn an SQ2+ specification of Horn Clause Propositional Satisfiability into a new linear time pointer machine algorithm. Previously, Dowling and…

23 | Ready simulation, bisimulation, and the semantics of CCS-like languages
- Bloom
- 1989

Citation Context: …SQ2+ specification of ready simulation. We then showed informally how the low level SETL prototype could be turned into an algorithm that runs 5 orders of magnitude faster than the previous solution in [1]. All three transformations, but especially real-time simulation (where types were shown to be useful in modeling complex data structures), were involved in the discovery of a new improved solution to…

21 | Towards increased productivity of algorithm implementation
- Cai, Paige
- 1993

Citation Context: …e that our transformational methodology will scale up and provide a dramatic improvement in the productivity of large high performance complex systems may be found in the experiments by Cai and Paige [5]. In that paper we developed a simple but conservative model of productivity. Within that model we demonstrated a five-fold improvement in productivity of high performance algorithm implementation in…

14 | Binding performance at language design time
- Cai, Paige
- 1987

Citation Context: …augmented with least and greatest fixed point expressions. It was also shown how to simplify these specifications, and to transform them by dominated convergence into high level SETL prototypes [4]. In [3] we showed how the constant propagation algorithm of [19] could be expressed as set-theoretic equations in a subset of SQ2+ that could be mapped into RAM code guaranteed to run in linear time in the s…

13 | Program derivation with verified transformations – a case study
- Keller, Paige
- 1996

Citation Context: …real-time simulation (where types were shown to be useful in modeling complex data structures), were involved in the discovery of a new improved solution to the classical problem of DFA minimization [11]. Finite differencing was used extensively in [7] to derive a new improved solution to the classical problem of turning regular expressions into DFAs. Perhaps our most convincing paper-and-pencil res…

9 | High level reading and data structure compilation
- Paige, Yang
- 1997

Citation Context: …Partition Problem as being derived from an SQ2+ specification by dominated convergence, finite differencing, and real-time simulation [17] (a special TCS issue of best papers selected from ICALP84). In [18] a much improved explanation of the algorithmic tool called Multiset Discrimination in [6] was obtained by specifying the algorithm in low level SETL and using its type system to formally explain and…

8 | The formal reconstruction and improvement of the linear time fragment of Willard’s relational calculus subset
- Goyal, Paige
- 1997

Citation Context: …Finite differencing was used extensively in [7] to derive a new improved solution to the classical problem of turning regular expressions into DFAs. Perhaps our most convincing paper-and-pencil result was in [9], where Goyal and I used low level SETL specifications and real-time simulation to improve Willard’s time bound for query processing from linear expected to linear worst-case time without degrading sp…

4 | Formal Differentiation
- Paige
- 1981

Citation Context: …of complex codes, and provide greater assurance in the reliable modification of such codes. Perhaps the first examples of nontrivial algorithms being derived by finite differencing were presented in [14, 16]. Included among these examples is a SETL specification of Dijkstra’s naive Bankers Algorithm and its transformation into Habermann’s efficient solution. This derivation was done without knowledge of…