Results 1–10 of 14
Universal regular path queries
 Higher-Order and Symbolic Computation
, 2003
"... Given are a directed edgelabelled graph G with a distinguished node n0, and a regular expression P which may contain variables. We wish to compute all substitutions φ (of symbols for variables), together with all nodes n such that all paths n0 → n are in φ(P). We derive an algorithm for this proble ..."
Abstract

Cited by 13 (1 self)
Given are a directed edge-labelled graph G with a distinguished node n0, and a regular expression P which may contain variables. We wish to compute all substitutions φ (of symbols for variables), together with all nodes n such that all paths n0 → n are in φ(P). We derive an algorithm for this problem using relational algebra, and show how it may be implemented in Prolog. The motivation for the problem derives from a declarative framework for specifying compiler optimisations. 1 Bob Paige and IFIP WG 2.1 Bob Paige was a long-standing member of IFIP Working Group 2.1 on Algorithmic Languages and Calculi. In recent years, the main aim of this group has been to investigate the derivation of algorithms from specifications by program transformation. Already in the mid-eighties, Bob was way ahead of the pack: instead of applying transformational techniques to well-worn examples, he was applying his theories of program transformation to new problems, and discovering new algorithms [16, 48, 52]. The secret of his success lay partly in his insistence on the study of general algorithm design strategies (in particular
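The paper itself derives its algorithm via relational algebra and Prolog; as an illustration only, the ground (variable-free) case of a universal regular path query can be sketched as a product construction. The idea: run a deterministic, complete DFA for P alongside a BFS of the graph; a node n qualifies iff every reachable product state (n, q) has q accepting, since in a complete DFA each path spells exactly one run. The function name and encodings below are hypothetical, not from the paper.

```python
from collections import deque

def universal_rpq(graph, n0, dfa_delta, dfa_start, accepting):
    """Nodes n such that EVERY path n0 -> n spells a word accepted by
    the given deterministic, COMPLETE DFA.  graph maps a node to a list
    of (edge label, successor) pairs; dfa_delta maps (state, label) to
    the next DFA state; accepting is the set of accepting states."""
    reachable = {}                      # node -> DFA states reachable there
    queue = deque([(n0, dfa_start)])
    seen = {(n0, dfa_start)}
    while queue:                        # BFS over the graph x DFA product
        node, q = queue.popleft()
        reachable.setdefault(node, set()).add(q)
        for label, succ in graph.get(node, []):
            nxt = (succ, dfa_delta[(q, label)])
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    # n qualifies iff no path reaches n in a rejecting DFA state
    return {n for n, qs in reachable.items() if qs <= accepting}
```

For example, with a two-state DFA for a* (state 0 accepting, state 1 a reject sink), a node reachable both by an all-'a' path and by a path containing 'b' is correctly excluded. Handling the paper's variables and substitutions φ would require enumerating instantiations on top of this, which the sketch does not attempt.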
The formal reconstruction and speedup of the linear time fragment of Willard’s relational calculus subset
 In Proceedings of the IFIP TC 2 WG 2.1 international workshop on Algorithmic languages and calculi
, 1997
"... ..."
Goals and benchmarks for automated map reasoning
 Journal of Symbolic Computation
, 2000
"... TarskiGivant’s map calculus is briefly reviewed, and a plan of research is outlined aimed at investigating applications of this ground equational formalism in the theoremproving field. The main goal is to create synergy between firstorder predicate calculus and the map calculus. Techniques for tr ..."
Abstract

Cited by 7 (5 self)
Tarski-Givant’s map calculus is briefly reviewed, and a plan of research is outlined aimed at investigating applications of this ground equational formalism in the theorem-proving field. The main goal is to create synergy between first-order predicate calculus and the map calculus. Techniques for translating isolated sentences, as well as entire theories, from first-order logic into map calculus are designed, or in some cases simply brought nearer through the exercise of specifying properties of a few familiar structures (natural numbers, nested lists, finite sets, lattices). It is also highlighted to what extent a state-of-the-art theorem-prover for first-order logic, namely Otter, can be exploited not only to emulate, but also to reason about, map calculus. Issues regarding ‘safe’ forms of map reasoning are singled out, with an eye to possible generalizations to the database area.
Research Retrospective
"... The group was exciting in the 1970’s, when we were groping for direction and divided by different orientations. I guess it was in this atmosphere that combined purpose with uncertainty where I found my own voice. The common goal was a transformational program development methodology that would impro ..."
Abstract

Cited by 1 (0 self)
The group was exciting in the 1970’s, when we were groping for direction and divided by different orientations. I guess it was in this atmosphere that combined purpose with uncertainty that I found my own voice. The common goal was a transformational program development methodology that would improve the productivity of designing and maintaining correct software. The emphasis was on algorithmic software. We differed as to how to achieve this goal, and my approach was out on a limb. Based on a few transformations, the most exciting of which was Jay Earley’s iterator inversion combined with high-level strength reduction, and also on an overly optimistic faith in the power of science to shed light on this subject, I believed that algorithms and algorithmic software could be designed scientifically from abstract problem specifications by application of a small number of rules, whose selection could be simplified (even automated in some cases) if it could be guided by complexity. Almost all others (including the SETL crowd at Courant) disagreed, and accepted the notion that algorithm design was ‘inspired’, and that the most significant steps in a derivation were unexplainable ‘Eureka’ steps. I knew that my goals were ambitious and had little supporting evidence. In fact the
An NSF proposal
, 2005
"... The objectives of this research are to improve software productivity, reliability, and performance of complex systems. The approach combines program transformations, sometimes in reflective ways, to turn very high level perspicuous specifications into efficient implementations. These transformatio ..."
Abstract

Cited by 1 (0 self)
The objectives of this research are to improve software productivity, reliability, and performance of complex systems. The approach combines program transformations, sometimes in reflective ways, to turn very high-level perspicuous specifications into efficient implementations. These transformations will be implemented in a meta-transformational system, which itself will be transformed from an executable specification into efficient code. Experiments will be conducted to assess the research objectives in scaled-up applications targeted to systems that perform complex program analysis and translation. The transformations to be used include dominated convergence (for implementing fixed points efficiently), finite differencing (for replacing costly repeated calculations by less expensive incremental counterparts), data structure selection (for simulating associative access on a RAM in real time), and partial evaluation (for eliminating interpretive overhead and simplification). Correctness of these transformations, of user-defined transformations, and of the transformational system itself will be addressed in part. Both the partial evaluator and components of the
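Of the transformations listed, finite differencing is the easiest to show in miniature: instead of recomputing an expensive expression after each update, maintain its value incrementally alongside the data it depends on. The toy class below (name and design are illustrative, not from the proposal, and Python stands in for the proposal's own specification language) maintains the invariant that a stored running total always equals the sum of a set, turning an O(n) recomputation into an O(1) update.

```python
class DifferencedSum:
    """Finite differencing in miniature: keep _sum == sum(_items) as an
    invariant, updating it incrementally instead of recomputing it."""
    def __init__(self):
        self._items = set()
        self._sum = 0                  # invariant: _sum == sum(_items)

    def add(self, x):
        if x not in self._items:       # guard keeps the invariant exact
            self._items.add(x)
            self._sum += x             # O(1) incremental update

    def remove(self, x):
        if x in self._items:
            self._items.remove(x)
            self._sum -= x             # O(1) incremental update

    @property
    def total(self):
        return self._sum               # O(1), no recomputation of sum()
```

The same idea scales from sums to set comprehensions and fixed-point computations, which is where the combination with dominated convergence and data structure selection becomes essential.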
USING ÆTNANOVA TO FORMALLY PROVE THAT THE DAVIS-PUTNAM SATISFIABILITY TEST IS CORRECT
"... This paper reports on using theÆtnaNova/Referee proofverification system to formalize issues regarding the satisfiability of CNFformulae of propositional logic. We specify an “archetype ” version of the DavisPutnamLogemannLoveland algorithm through the THEORY of recursive functions based on a ..."
Abstract
 Add to MetaCart
This paper reports on using the ÆtnaNova/Referee proof-verification system to formalize issues regarding the satisfiability of CNF formulae of propositional logic. We specify an “archetype” version of the Davis-Putnam-Logemann-Loveland algorithm through the THEORY of recursive functions based on a well-founded relation, and prove it to be correct. Within the same framework, and by resorting to Zorn’s lemma, we develop a straightforward proof of the compactness theorem.
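The paper's formalization lives in ÆtnaNova's set-theoretic language; purely as an executable illustration of the algorithm being verified, a minimal DPLL can be sketched in Python. Clauses are frozensets of integer literals (positive = variable, negative = its negation); the sketch implements only unit propagation and the splitting rule, omitting the pure-literal rule and any of the paper's proof machinery.

```python
def dpll(clauses, assignment=frozenset()):
    """Minimal DPLL on CNF clauses given as frozensets of int literals.
    Returns a frozenset of true literals satisfying all clauses, or
    None if the formula is unsatisfiable."""
    # Unit propagation: repeatedly assign literals forced by unit clauses.
    while True:
        unit = next((next(iter(c)) for c in clauses if len(c) == 1), None)
        if unit is None:
            break
        assignment = assignment | {unit}
        simplified = []
        for c in clauses:
            if unit in c:
                continue               # clause satisfied, drop it
            c = c - {-unit}            # falsified literal removed
            if not c:
                return None            # empty clause: conflict
            simplified.append(c)
        clauses = simplified
    if not clauses:
        return assignment              # every clause satisfied
    # Splitting rule: branch on a literal of the first remaining clause.
    lit = next(iter(clauses[0]))
    for choice in (lit, -lit):
        result = dpll(clauses + [frozenset({choice})], assignment)
        if result is not None:
            return result
    return None
```

For instance, on {x1 ∨ x2, ¬x1, ¬x2 ∨ x3} propagation alone forces ¬x1, then x2, then x3, with no branching needed; on {x1, ¬x1} the conflict is detected immediately.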