Results 1–10 of 70
Simplification by cooperating decision procedures
ACM Transactions on Programming Languages and Systems, 1979
Cited by 455 (2 self)
Abstract
A method for combining decision procedures for several theories into a single decision procedure for their combination is described, and a simplifier based on this method is discussed. The simplifier finds a normal form for any expression formed from individual variables, the usual Boolean connectives, the equality predicate =, the conditional function if-then-else, the integers, the arithmetic functions and predicates +, −, and ≤, the Lisp functions and predicates car, cdr, cons, and atom, the functions store and select for storing into and selecting from arrays, and uninterpreted function symbols. If the expression is a theorem it is simplified to the constant true, so the simplifier can be used as a decision procedure for the quantifier-free theory containing these functions and predicates. The simplifier is currently used in the Stanford Pascal Verifier.
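To make concrete what a decision procedure for this combination settles, here is a minimal sketch using the Python API of Z3, a modern SMT solver in the same lineage (our choice of illustration, not the paper's simplifier): it proves a quantifier-free claim mixing arithmetic, store/select arrays, and an uninterpreted function symbol.

```python
from z3 import Int, IntSort, Array, Function, Select, Store, Solver, Not, Implies

x, y = Int('x'), Int('y')
a = Array('a', IntSort(), IntSort())
f = Function('f', IntSort(), IntSort())

# Valid for any f and a: writing f(x) at index x, then reading index y,
# yields f(x) whenever x == y.
claim = Implies(x == y, Select(Store(a, x, f(x)), y) == f(x))

s = Solver()
s.add(Not(claim))   # a claim is a theorem iff its negation is unsatisfiable
print(s.check())    # unsat
```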
Automatic Verification of Pipelined Microprocessor Control
1994
Cited by 290 (7 self)
Abstract
We describe a technique for verifying the control logic of pipelined microprocessors. It handles more complicated designs, and requires less human intervention, than existing methods. The technique automatically compares a pipelined implementation to an architectural description. The CPU time needed for verification is independent of the data path width, the register file size, and the number of ALU operations.
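The comparison at the heart of the technique can be pictured as a commuting diagram (the flushing criterion): draining the pipeline after one implementation step must agree with one architectural step applied to the drained state. Below is a hedged Python sketch of that check; the two-stage machine, its single add instruction, and all names are hypothetical stand-ins, not the paper's processor model.

```python
from dataclasses import dataclass
from itertools import product
from typing import Optional, Tuple

@dataclass(frozen=True)
class Impl:
    regs: Tuple[int, ...]                     # architectural register file
    pending: Optional[Tuple[int, int, int]]   # buffered add (dst, src1, src2)

def spec_step(regs, instr):
    """Architectural semantics: execute one add instruction atomically."""
    dst, s1, s2 = instr
    out = list(regs)
    out[dst] = regs[s1] + regs[s2]
    return tuple(out)

def impl_step(state, instr):
    """Pipelined semantics: retire the buffered instruction, buffer the new one."""
    regs = spec_step(state.regs, state.pending) if state.pending else state.regs
    return Impl(regs, instr)

def flush(state):
    """Feed no new instructions until the pipeline drains."""
    regs = spec_step(state.regs, state.pending) if state.pending else state.regs
    return Impl(regs, None)

def commutes(state, instr):
    lhs = flush(impl_step(state, instr)).regs   # step, then flush
    rhs = spec_step(flush(state).regs, instr)   # flush, then spec-step
    return lhs == rhs

instrs = [tuple(t) for t in product(range(2), repeat=3)]  # all (dst, s1, s2)
ok = all(commutes(Impl(regs, pend), instr)
         for regs in product(range(3), repeat=2)
         for pend in [None] + instrs
         for instr in instrs)
print("commuting diagram holds:", ok)   # True
```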
Lazy Satisfiability Modulo Theories
JOURNAL ON SATISFIABILITY, BOOLEAN MODELING AND COMPUTATION 3 (2007) 141–224, 2007
Cited by 189 (50 self)
Abstract
Satisfiability Modulo Theories (SMT) is the problem of deciding the satisfiability of a first-order formula with respect to some decidable first-order theory T (SMT(T)). These problems are typically not handled adequately by standard automated theorem provers. SMT is being recognized as increasingly important due to its applications in many domains in different communities, in particular in formal verification. Many papers with novel and very efficient techniques for SMT have been published in recent years, and some very efficient SMT tools are now available. Typical SMT(T) problems require testing the satisfiability of formulas which are Boolean combinations of atomic propositions and atomic expressions in T, so that heavy Boolean reasoning must be efficiently combined with expressive theory-specific reasoning. The dominating approach to SMT(T), called the lazy approach, is based on the integration of a SAT solver and a decision procedure able to handle sets of atomic constraints in T (the T-solver), handling respectively the Boolean and the theory-specific components of reasoning. Unfortunately, neither the problem of building an efficient SMT solver, nor even that …
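As a hedged illustration of the lazy loop just described, the following self-contained Python sketch lets brute-force enumeration stand in for the SAT solver and uses union-find to decide the pure theory of equality as the T-solver; every name in it is ours, not from any SMT tool.

```python
from itertools import product

def t_solver(lits):
    """Decide a conjunction of equality/disequality literals by union-find.
    lits: list of (polarity, (x, y)) meaning x = y or x != y."""
    parent = {}
    def find(v):
        parent.setdefault(v, v)
        while parent[v] != v:
            parent[v] = parent[parent[v]]   # path halving
            v = parent[v]
        return v
    for pos, (x, y) in lits:
        if pos:
            parent[find(x)] = find(y)       # merge classes for x = y
    return all(find(x) != find(y) for pos, (x, y) in lits if not pos)

def lazy_smt(atoms, skeleton):
    """atoms: equality atoms (x, y); skeleton: assignment tuple -> bool."""
    for bits in product([True, False], repeat=len(atoms)):
        if not skeleton(bits):
            continue                        # not a propositional model
        if t_solver(list(zip(bits, atoms))):
            return dict(zip(atoms, bits))   # theory-consistent model found
        # a real solver would add a blocking clause here and resume the
        # SAT search; plain enumeration never revisits an assignment
    return None

# (x = y) AND (y = z) AND NOT (x = z): propositionally fine, T-inconsistent.
atoms = [("x", "y"), ("y", "z"), ("x", "z")]
print(lazy_smt(atoms, lambda b: b[0] and b[1] and not b[2]))   # None
```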
Deciding Combinations of Theories
JOURNAL OF THE ACM, 1984
Cited by 174 (0 self)
Abstract
A method is given for deciding formulas in combinations of unquantified first-order theories. Rather than coupling separate decision procedures for the contributing theories, the method makes use of a single, uniform procedure that minimizes the code needed to accommodate each additional theory. It is applicable to theories whose semantics can be encoded within a certain class of purely equational canonical form theories that is closed under combination. Examples are given from the equational theories of integer and real arithmetic, a subtheory of monadic set theory, the theory of cons, car, and cdr, and others. A discussion of the speed performance of the procedure and a proof of the theorem that underlies its completeness are also given. The procedure has been used extensively as the deductive core of a system for program specification and verification.
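The "canonical form" idea at the core of such theories can be illustrated with a toy canonizer: rewrite every term to a unique normal form, so that two terms are provably equal exactly when their canonical forms are syntactically identical. The Python sketch below does this for sums of integer-scaled variables, under our own term representation; it illustrates the idea only and is not the paper's uniform procedure.

```python
from collections import Counter

def canon(term):
    """term: nested tuples ('+', a, b), ('*', int, subterm), an int, or a
    variable name. Returns a canonical (sorted coefficients, constant) pair."""
    coeffs, const = Counter(), 0
    def walk(t, k):
        nonlocal const
        if isinstance(t, int):
            const += k * t
        elif isinstance(t, str):
            coeffs[t] += k
        elif t[0] == '+':
            walk(t[1], k); walk(t[2], k)
        elif t[0] == '*':
            walk(t[2], k * t[1])
    walk(term, 1)
    return tuple(sorted((v, c) for v, c in coeffs.items() if c)), const

# (x + 2*y) + x   versus   2*(x + y): equal in the theory, and the
# canonizer maps both to the same normal form.
t1 = ('+', ('+', 'x', ('*', 2, 'y')), 'x')
t2 = ('*', 2, ('+', 'x', 'y'))
print(canon(t1) == canon(t2))   # True
```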
Integrating decision procedures into heuristic theorem provers: A case study of linear arithmetic
Machine Intelligence, 1988
Cited by 120 (10 self)
Abstract
We discuss the problem of incorporating into a heuristic theorem prover a decision procedure for a fragment of the logic. An obvious goal when incorporating such a procedure is to reduce the search space explored by the heuristic component of the system, as would be achieved by eliminating from the system's data base some explicitly stated axioms. For example, if a decision procedure for linear inequalities is added, one would hope to eliminate the explicit consideration of the transitivity axioms. However, the decision procedure must then be used in all the ways the eliminated axioms might have been. The difficulty of achieving this degree of integration is more dependent upon the complexity of the heuristic component than upon that of the decision procedure. The view of the decision procedure as a "black box" is frequently destroyed by the need to pass large amounts of strategic search information back and forth between the two components. Finally, the efficiency of the decision procedure may be virtually irrelevant; the efficiency of the final system may depend most heavily on how easy it is to communicate between the two components. This paper is a case study of how we integrated a linear arithmetic procedure into a heuristic theorem prover. By linear arithmetic here we mean the decidable subset of number theory dealing with universally quantified formulas composed of the logical connectives, the identity relation, the Peano "less than" relation, the Peano addition and subtraction functions, Peano constants, …
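For a feel of the kind of procedure being integrated, here is a hedged Python sketch deciding one fragment of linear arithmetic, conjunctions of difference constraints x - y <= c: satisfiability reduces to the absence of a negative cycle in the constraint graph, which Bellman-Ford detects, and this single check subsumes the chains of inequalities that the transitivity axioms would otherwise generate. The choice of fragment and all names are ours for illustration.

```python
def satisfiable(constraints):
    """constraints: list of (x, y, c) meaning x - y <= c."""
    nodes = {v for x, y, _ in constraints for v in (x, y)}
    dist = {v: 0 for v in nodes}                    # virtual source, distance 0
    edges = [(y, x, c) for x, y, c in constraints]  # edge y -> x with weight c
    for _ in range(len(nodes)):
        changed = False
        for u, v, w in edges:
            if dist[u] + w < dist[v]:
                dist[v] = dist[u] + w
                changed = True
        if not changed:
            return True        # relaxation reached a fixpoint: satisfiable
    return False               # still relaxing after |V| rounds: negative cycle

# x - y <= -1, y - z <= -1, z - x <= -1 sums to 0 <= -3: unsatisfiable.
print(satisfiable([("x", "y", -1), ("y", "z", -1), ("z", "x", -1)]))  # False
print(satisfiable([("x", "y", -1), ("y", "z", -1), ("z", "x", 5)]))   # True
```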
Formal verification in hardware design: A survey
1997
Cited by 113 (0 self)
Abstract
In recent years, formal methods have emerged as an alternative approach to ensuring the quality and correctness of hardware designs, overcoming some of the limitations of traditional validation techniques such as simulation and testing. There are two main aspects to the application of formal methods in a design process: the formal framework used to specify desired properties of a design, and the verification techniques and tools used to reason about the relationship between a specification and a corresponding implementation. We survey a variety of frameworks and techniques which have been proposed in the literature and applied to actual designs. The specification frameworks we describe include temporal logics, predicate logic, abstraction and refinement, as well as containment between ω-regular languages. The verification techniques presented include model checking, automata-theoretic techniques, automated theorem proving, and approaches that integrate the above methods.
A New Correctness Proof of the Nelson-Oppen Combination Procedure
Frontiers of Combining Systems, volume 3 of Applied Logic Series, 1996
Cited by 93 (8 self)
Abstract
The Nelson-Oppen combination procedure, which combines satisfiability procedures for a class of first-order theories by propagation of equalities between variables, is one of the most general combination methods in the field of theory combination. We describe a new nondeterministic version of the procedure that has been used to extend the Constraint Logic Programming Scheme to unions of constraint theories. The correctness proof of the procedure that we give in this paper not only constitutes a novel and easier proof of Nelson and Oppen's original results, but also shows that equality sharing between the satisfiability procedures of the component theories, the main idea of the method, can be confined to a restricted set of variables.
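The nondeterministic version mentioned above has a compact description: guess an arrangement (a partition of the shared variables into equivalence classes), hand the induced equalities and disequalities to every component solver, and report satisfiable iff some arrangement satisfies all of them. Here is a hedged Python sketch, with trivial stand-ins for the component theory solvers.

```python
from itertools import combinations

def partitions(xs):
    """Enumerate all ways to split xs into nonempty equivalence classes."""
    if not xs:
        yield []
        return
    head, rest = xs[0], xs[1:]
    for p in partitions(rest):
        for i in range(len(p)):             # put head into an existing class
            yield p[:i] + [[head] + p[i]] + p[i + 1:]
        yield [[head]] + p                  # or give head its own class

def arrangement(blocks):
    """Equalities within classes, disequalities across classes."""
    eqs = [pair for b in blocks for pair in combinations(b, 2)]
    neqs = [(x, y) for b1, b2 in combinations(blocks, 2) for x in b1 for y in b2]
    return eqs, neqs

def nelson_oppen(shared, solvers):
    """solvers: callables (eqs, neqs) -> bool, one per component theory."""
    return any(all(s(*arrangement(p)) for s in solvers)
               for p in partitions(shared))

# Toy components: one insists x = y, the other insists x != y.
s1 = lambda eqs, neqs: ("x", "y") in eqs or ("y", "x") in eqs
s2 = lambda eqs, neqs: ("x", "y") in neqs or ("y", "x") in neqs
print(nelson_oppen(["x", "y"], [s1, s2]))   # False: no arrangement satisfies both
```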
Nonlinear Array Dependence Analysis
1991
Cited by 91 (7 self)
Abstract
Standard array data dependence techniques can only reason about linear constraints. There has also been work on analyzing some dependences involving polynomial constraints. Analyzing array data dependences in real-world programs requires handling many "unanalyzable" terms: subscript arrays, run-time tests, function calls. The standard approach to analyzing such programs has been to omit and ignore any constraints that cannot be reasoned about. This is unsound when reasoning about value-based dependences and whether privatization is legal. Also, this prevents us from determining the conditions that must be true to disprove the dependence. These conditions could be checked by a run-time test or verified by a programmer or by aggressive, demand-driven interprocedural analysis. We describe a solution to these problems. Our solution makes our system sound and more accurate for analyzing value-based dependences and derives conditions that can be used to disprove dependences. We also give some p...
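As a hedged toy example of keeping rather than dropping an "unanalyzable" term, consider a loop writing a[p[i]] and reading a[i]: the subscript array p defeats linear analysis, but the condition under which a dependence exists can still be recorded and checked at run time. The loop shape and names below are ours, not the paper's benchmark programs.

```python
def has_dependence(p, n):
    """Loop body: for i in range(n): a[p[i]] = ...; ... = a[i].
    A dependence between the write in iteration i and the read in
    iteration j requires p[i] == j, i.e. some p[i] lands in [0, n)."""
    return any(0 <= p[i] < n for i in range(n))

# A compiler could emit this as a cheap run-time test guarding a
# parallelized version of the loop:
for p in ([7, 8, 9, 10], [2, 0, 3, 1]):
    print(p, "->", "sequential" if has_dependence(p, 4) else "parallel")
# [7, 8, 9, 10] -> parallel ;  [2, 0, 3, 1] -> sequential
```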
BDD Based Procedures for a Theory of Equality with Uninterpreted Functions
"... . The logic of equality with uninterpreted functions has been proposed for verifying abstract hardware designs. The ability to perform fast satisfiability checking over this logic is imperative for this verification paradigm to be successful. We present symbolic methods for satisfiability checking f ..."
Abstract

Cited by 63 (4 self)
 Add to MetaCart
The logic of equality with uninterpreted functions has been proposed for verifying abstract hardware designs. The ability to perform fast satisfiability checking over this logic is imperative for this verification paradigm to be successful. We present symbolic methods for satisfiability checking for this logic. The first procedure is based on restricting analysis to finite instantiations of the design. The second procedure directly reasons about equality by introducing Boolean-valued indicator variables for equality. Theoretical and experimental evidence shows the superiority of the second approach.
1 Verifying High-level Designs Using the Theory of Equality
A common problem with automatic formal verification is that the computational resources required for verification increase rapidly with the size of the design. State-of-the-art tools for verification of gate-level designs are not capable of routinely verifying designs possessing more than a hundred to two hundred binary-valued l...
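A hedged sketch of the indicator-variable idea behind the second procedure: replace each equality between terms with a Boolean variable, constrain the guessed relation to be transitive, and decide the result propositionally. Brute-force enumeration stands in for BDDs here, and the functional-congruence (Ackermann) constraints a full procedure for this logic needs are omitted for brevity.

```python
from itertools import combinations, product

def euf_sat(n, formula):
    """n terms; formula takes a predicate eq(i, j) over indicator variables."""
    pairs = list(combinations(range(n), 2))
    for bits in product([False, True], repeat=len(pairs)):
        e = dict(zip(pairs, bits))
        eq = lambda i, j: i == j or e[tuple(sorted((i, j)))]
        # keep only transitive guesses: eq(i,j) and eq(j,k) imply eq(i,k)
        if any(eq(i, j) and eq(j, k) and not eq(i, k)
               for i, j, k in product(range(n), repeat=3)):
            continue
        if formula(eq):
            return True
    return False

# t0 = t1 AND t1 = t2 AND t0 != t2 : unsatisfiable (transitivity forbids it)
print(euf_sat(3, lambda eq: eq(0, 1) and eq(1, 2) and not eq(0, 2)))  # False
# t0 = t1 AND t1 != t2 : satisfiable
print(euf_sat(3, lambda eq: eq(0, 1) and not eq(1, 2)))               # True
```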