Results 1 - 10 of 28
The Use of Explicit Plans to Guide Inductive Proofs
9th Conference on Automated Deduction, 1988
Abstract

Cited by 267 (38 self)
We propose the use of explicit proof plans to guide the search for a proof in automatic theorem proving. By representing proof plans as the specifications of LCF-like tactics, [Gordon et al 79], and by recording these specifications in a sorted meta-logic, we are able to reason about the conjectures to be proved and the methods available to prove them. In this way we can build proof plans of wide generality, formally account for and predict their successes and failures, apply them flexibly, recover from their failures, and learn them from example proofs. We illustrate this technique by building a proof plan based on a simple subset of the implicit proof plan embedded in the Boyer-Moore theorem prover, [Boyer & Moore 79]. Keywords: Proof plans, inductive proofs, theorem proving, automatic programming, formal methods, planning. Acknowledgements: I am grateful for many long conversations with other members of the mathematical reasoning group, from which many of the ideas in this paper e...
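The idea of a tactic paired with a declarative specification can be caricatured in a few lines of Python. This is a minimal sketch under assumed encodings: the goal representation, the method record, and every name below are illustrative, not the paper's actual meta-logic.

```python
# A minimal sketch of an LCF-style tactic: a function from a goal to the
# subgoals it produces. A proof-plan "method" pairs the tactic with a
# declarative precondition, so a planner can reason about applicability
# before the tactic is ever run. Goals are encoded as (variable, body)
# pairs, read as "for all variable, body" -- an illustrative assumption.

def induction_tactic(goal):
    """Split a universally quantified goal over naturals into base and step cases."""
    var, body = goal
    base = body.replace(var, "0")
    step = f"{body} -> {body.replace(var, f'succ({var})')}"
    return [base, step]

# The method records a precondition the planner can check symbolically.
induction_method = {
    "name": "nat_induction",
    "precondition": lambda goal: goal[0].isidentifier(),
    "tactic": induction_tactic,
}

goal = ("n", "plus(n, 0) = n")
if induction_method["precondition"](goal):
    subgoals = induction_method["tactic"](goal)
    print(subgoals)
```

Separating the precondition from the tactic is what lets a planner build a proof plan before executing any inference, and explain a failure when a precondition is violated.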
Acquiring Search-Control Knowledge via Static Analysis
Artificial Intelligence, 1993
Abstract

Cited by 87 (2 self)
Explanation-Based Learning (EBL) is a widely-used technique for acquiring search-control knowledge. Recently, Prieditis, van Harmelen, and Bundy pointed to the similarity between Partial Evaluation (PE) and EBL. However, EBL utilizes training examples whereas PE does not. It is natural to inquire, therefore, whether PE can be used to acquire search-control knowledge, and if so at what cost. This paper answers these questions by means of a case study comparing prodigy/ebl, a state-of-the-art EBL system, and static, a PE-based analyzer of problem-space definitions. When tested in prodigy/ebl's benchmark problem spaces, static generated search-control knowledge that was up to three times as effective as the knowledge learned by prodigy/ebl, and did so from twenty-six to seventy-seven times faster. The paper describes static's algorithms, compares its performance to prodigy/ebl's, and notes when static's superior performance will scale up and when it will not. The paper concludes with several le...
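Partial evaluation, as contrasted with EBL above, needs no training examples: it specializes a program with respect to statically known input. A hedged sketch of that idea, using the textbook power-function example rather than anything from STATIC or PRODIGY/EBL (neither of which worked on Python):

```python
# Sketch of partial evaluation: specialize a two-argument function with
# respect to a statically known argument, producing a residual program
# with the loop unfolded away. All names here are illustrative.

def power(x, n):
    result = 1
    for _ in range(n):
        result *= x
    return result

def specialize_power(n):
    # n is static, x is dynamic: unfold the loop at specialization time.
    body = " * ".join(["x"] * n) if n > 0 else "1"
    code = f"def power_{n}(x):\n    return {body}\n"
    namespace = {}
    exec(code, namespace)  # compile the residual program
    return namespace[f"power_{n}"]

power_3 = specialize_power(3)  # residual: def power_3(x): return x * x * x
print(power_3(5))
```

No example runs were needed: the specializer worked purely from the static argument, which is exactly the contrast with EBL the abstract draws.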
A Survey and Classification of some Program Transformation Approaches and Techniques
In TC2 IFIP Working Conference on Program Specification and Transformation, 1987
Abstract

Cited by 44 (0 self)
Program transformation is a means to formally develop efficient programs from lucid specifications. A representative sample of the diverse range of program transformation research is classified into several different approaches based upon the motivations for and styles of constructing such formal developments. Individual techniques for supporting construction of developments are also surveyed, and are related to the various approaches.
A formal approach to domain-oriented software design environments
In Proc. 9th Knowledge-Based Software Engineering Conference, 1994
Abstract

Cited by 31 (6 self)
This paper describes a formal approach to domain-oriented software design environments, based on declarative domain theories, formal specifications, and deductive program synthesis. A declarative domain theory defines the semantics of a domain-oriented specification language and its relationship to implementation-level subroutines. Formal specification development and reuse is made accessible to users through an intuitive graphical interface that guides them in creating diagrams denoting formal specifications. Deductive program synthesis ensures that specifications are correctly implemented. This approach has been implemented in AMPHION, a generic KBSE system that targets scientific subroutine libraries. AMPHION has been applied to the domain of solar system kinematics. AMPHION enables space scientists to develop, modify, and reuse specifications an order of magnitude more rapidly than manual program development. Program synthesis is efficient and completely automatic.
Program verification
Journal of Automated Reasoning, 1985
Abstract

Cited by 14 (4 self)
Computer programs may be regarded as formal mathematical objects whose properties are subject to mathematical proof. Program verification is the use of formal, mathematical techniques to debug software and software specifications.
1. Code Verification
How are the properties of computer programs proved? We discuss three approaches in this article: inductive invariants, functional semantics, and explicit semantics. Because the first approach has received by far the most attention, it has produced the most impressive results to date. However, the field is now moving away from the inductive invariant approach.
1.1. Inductive Assertions
The so-called Floyd-Hoare inductive assertion method of program verification [25, 33] has its roots in the classic Goldstine and von Neumann reports [53] and handles the usual kind of programming language, of which FORTRAN is perhaps the best example. In this style of verification, the specifier "annotates" certain points in the program with mathematical assertions that are supposed to describe relations that hold between the program variables and the initial input values each time "control" reaches the annotated point. Among these assertions are some that characterize acceptable input and the desired output. By exploring all possible paths from one assertion to the next and analyzing the effects of intervening program statements, it is possible to reduce the correctness of the program to the problem of proving certain derived formulas called verification conditions. Below we illustrate the idea with a simple program for computing the factorial of its integer input N.
[Flowchart: a factorial program with input N, an accumulator A initialized to 1, a loop test N = 0, and assertions annotating the start and stop points.]
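The factorial example above can be sketched in Python with the assertions made explicit. This is a hedged illustration only: in real Floyd-Hoare verification the annotations generate verification conditions proved once and for all, rather than being checked at run time as they are here.

```python
import math

# Inductive-assertion sketch for the abstract's factorial example.
# The loop-head invariant relates the current program variables (A, n)
# to the initial input N, exactly as the annotations in the flowchart do.

def factorial(N):
    assert N >= 0                                     # input assertion
    A, n = 1, N
    while n != 0:
        # invariant: A * n! == N!  (current state vs. initial input)
        assert A * math.factorial(n) == math.factorial(N)
        A, n = A * n, n - 1
    assert A == math.factorial(N)                     # output assertion
    return A

print(factorial(5))
```

Each path between consecutive assertions (entry to loop head, loop head around the loop back to itself, loop head to exit) yields one verification condition in the method the abstract describes.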
Running Programs Backwards: the Logical Inversion of Imperative
2003
Abstract

Cited by 13 (0 self)
Imperative programs can be inverted directly from their forward-directed program code with the use of logical inference. The relational semantics of imperative computations treats programs as logical relations over the observable state of the environment, which is taken to be the state of the variables in memory. Program relations denote both forward and backward computations, and the direction of the computation depends upon the instantiation pattern of arguments in the relation. This view of inversion has practical applications when the relational semantics is treated as a logic program. Depending on the logic programming inference scheme used, execution of this relational program can compute the inverse of the imperative program. A number of non-trivial imperative computations can be inverted with minimal logic programming tools.
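The instantiation-pattern idea can be mimicked in a few lines. This is an assumed encoding for illustration, not the paper's relational semantics: the assignment `y := x + 1` becomes a relation over (x, y), and which direction runs depends on which argument is supplied, as in a logic-programming query mode.

```python
# Sketch of the relational view of an imperative assignment y := x + 1.
# Instantiating x runs the program forward; instantiating y runs it
# backward, computing the inverse. Names are illustrative.

def succ_rel(x=None, y=None):
    """Relation y = x + 1; returns the unknown given the known argument."""
    if x is not None:
        return x + 1          # forward execution
    if y is not None:
        return y - 1          # inverse execution
    raise ValueError("at least one argument must be instantiated")

print(succ_rel(x=4))  # forward direction
print(succ_rel(y=4))  # backward direction
```

In an actual logic-programming setting the same clause serves both queries; here the mode dispatch has to be written by hand, which is the overhead the relational treatment removes.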
Using Middle-Out Reasoning to Control the Synthesis of Tail-Recursive Programs
In Proc. CADE-11, LNAI 607, 1992
Abstract

Cited by 12 (5 self)
We describe a novel technique for the automatic synthesis of tail-recursive programs. The technique is to specify the required program using the standard equations and then synthesise the tail-recursive program using the proofs-as-programs technique. This requires the specification to be proved realisable in a constructive logic. Restrictions on the form of the proof ensure that the synthesised program is tail-recursive. The ...
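The effect of such a synthesis (though not the constructive proof that produces it) can be sketched with list reversal, a standard example of the specification/tail-recursive pairing; the function names are illustrative.

```python
# The naive specification recurses and then appends, so it is not
# tail-recursive; the synthesised version threads an accumulator and
# makes its recursive call in tail position.

def rev_spec(xs):
    """Specification-style reversal: recurse, then append."""
    if not xs:
        return []
    return rev_spec(xs[1:]) + [xs[0]]

def rev_tail(xs, acc=None):
    """Tail-recursive reversal one would synthesise from the same equations."""
    if acc is None:
        acc = []
    if not xs:
        return acc
    return rev_tail(xs[1:], [xs[0]] + acc)

print(rev_spec([1, 2, 3]))
print(rev_tail([1, 2, 3]))
```

The proof obligation in the approach described above is precisely that the two agree on all inputs, with the accumulator introduced by middle-out reasoning rather than guessed up front.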
Constraints to Stop Higher-Order Deforestation
In 24th ACM Symposium on Principles of Programming Languages, 1997
Abstract

Cited by 12 (1 self)
Wadler's deforestation algorithm eliminates intermediate data structures from functional programs. To be suitable for inclusion in a compiler, it must terminate on all programs. Several techniques to ensure termination of deforestation on all first-order programs are known, but a technique for higher-order programs was only recently introduced by Hamilton, and elaborated and implemented in the Glasgow Haskell compiler by Marlow. We introduce a new technique for ensuring termination of deforestation on all higher-order programs that allows useful transformation steps prohibited in Hamilton's and Marlow's techniques.
1 Introduction
Lazy, higher-order, functional programming languages lend themselves to a certain style of programming which uses intermediate data structures [28].
Example 1. Consider the following program.
letrec a = λx. λy. case x of [] → y; (h : t) → h : a t y
in λu. λv. λw. a (a u v) w
The term λu. λv. λw. a (a u v) w appends the three lists u, v, and w. Appending u and v ...
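What deforestation buys on the append example above can be illustrated in Python (an assumed transcription; the technique itself operates on lazy functional programs, not Python): the naive composition materializes the intermediate list u ++ v, while the deforested version produces each element directly.

```python
# Appending three lists the way the abstract's term a (a u v) w does:
# the naive version allocates an intermediate list for (u + v); the
# deforested version is a single traversal with no intermediate list.

def append3_naive(u, v, w):
    return (u + v) + w        # (u + v) is the intermediate data structure

def append3_fused(u, v, w):
    # Deforested: yield elements of each list in turn, allocating nothing.
    for x in u:
        yield x
    for x in v:
        yield x
    for x in w:
        yield x

print(append3_naive([1], [2], [3]))
print(list(append3_fused([1], [2], [3])))
```

The termination problem the paper addresses arises because performing such unfold/fold transformations blindly on arbitrary higher-order programs can generate infinitely many distinct terms.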
Integer Constraints to Stop Deforestation
1996
Abstract

Cited by 10 (2 self)
Deforestation is a transformation of functional programs to remove intermediate data structures. It is based on outermost unfolding of function calls, where folding occurs when unfolding takes place within the same nested function call. Since unrestricted unfolding may encounter arbitrarily many terms, a termination analysis has to determine those subterms where unfolding is possibly dangerous. We show that such an analysis can be obtained from a control flow analysis by an extension with integer constraints, essentially at no loss in efficiency.
1 Introduction
The key idea of flow analysis for functional languages is to define an abstract meaning in terms of program points, i.e., subexpressions of the program possibly evaluated during program execution [Pa95]. Such analyses have been invented for tasks like type recovery [Sh91], binding time analysis [Co93], or safety analysis [PS95]. Conceptually, these are closely related to A. Deutsch's store-based alias analysis [D...