Results 1 - 10 of 309
Automatic Verification of Pipelined Microprocessor Control
, 1994
"... We describe a technique for verifying the control logic of pipelined microprocessors. It handles more complicated designs, and requires less human intervention, than existing methods. The technique automaticMly compares a pipelined implementation to an architectural description. The CPU time nee ..."
Abstract

Cited by 260 (6 self)
 Add to MetaCart
We describe a technique for verifying the control logic of pipelined microprocessors. It handles more complicated designs and requires less human intervention than existing methods. The technique automatically compares a pipelined implementation to an architectural description. The CPU time needed for verification is independent of the data path width, the register file size, and the number of ALU operations.
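The abstract does not say how the comparison is carried out; the sketch below illustrates one standard way to phrase such a correspondence check (flush the pipeline, then compare single architectural steps) on a toy two-stage accumulator machine. The toy ISA and all function names are invented for illustration, not taken from the paper.

from itertools import product

def step_spec(acc, instr):
    # Architectural (ISA-level) model of a toy accumulator machine:
    # each step retires exactly one instruction, "ADD k" or "SET k".
    op, k = instr
    return acc + k if op == "ADD" else k

def step_impl(state, instr):
    # Pipelined model: the instruction latched last cycle completes now,
    # and the newly fetched instruction enters the latch.
    acc, latched = state
    if latched is not None:
        acc = step_spec(acc, latched)
    return (acc, instr)

def flush(state):
    # Drain the pipeline by completing whatever is still in the latch.
    acc, latched = state
    if latched is not None:
        acc = step_spec(acc, latched)
    return (acc, None)

def project(state):
    # Architectural state visible after flushing: just the accumulator.
    return flush(state)[0]

# Correspondence check: flushing and then taking one architectural step
# gives the same visible state as taking one pipelined step and flushing.
INSTRS = [("ADD", 1), ("ADD", 2), ("SET", 0)]
for acc, latched, instr in product(range(3), INSTRS + [None], INSTRS):
    lhs = project(step_impl((acc, latched), instr))
    rhs = step_spec(project((acc, latched)), instr)
    assert lhs == rhs
print("toy correspondence check passed")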
ESP: Path-Sensitive Program Verification in Polynomial Time
, 2002
"... In this paper, we present a new algorithm for partial program verification that runs in polynomial time and space. We are interested in checking that a program satisfies a given temporal safety property. Our insight is that by accurately modeling only those branches in a program for which the proper ..."
Abstract

Cited by 250 (3 self)
 Add to MetaCart
In this paper, we present a new algorithm for partial program verification that runs in polynomial time and space. We are interested in checking that a program satisfies a given temporal safety property. Our insight is that by accurately modeling only those branches in a program for which the property-related behavior differs along the arms of the branch, we can design an algorithm that is accurate enough to verify the program with respect to the given property, without paying the potentially exponential cost of full path-sensitive analysis. We have implemented this “property simulation” algorithm as part of a partial verification tool called ESP. We present the results of applying ESP to the problem of verifying the file I/O behavior of a version of the GNU C compiler (gcc, 140,000 LOC). We are able to prove that all of the 646 calls to fprintf in the source code of gcc are guaranteed to print to valid, open files. Our results show that property simulation scales to large programs and is accurate enough to verify meaningful properties.
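A minimal sketch of the merging idea behind property simulation, assuming a toy file-I/O property and a hypothetical program encoding; ESP's real analysis additionally correlates execution state with property state, which this toy omits.

ERROR = "error"

def fsm(state, op):
    # Property automaton: a file must be open when it is printed to.
    if state == ERROR:
        return ERROR
    if op == "open":
        return "open"
    if op == "close":
        return "closed"
    if op == "print":
        return "open" if state == "open" else ERROR
    return state                    # operations irrelevant to the property

def simulate(stmts, states):
    # `states` is the set of property states reaching the current point.
    for s in stmts:
        if isinstance(s, tuple) and s[0] == "branch":
            _, then_arm, else_arm = s
            # Path-sensitivity is paid only where it matters: distinct
            # property states coming out of the two arms stay distinct,
            # identical ones collapse into a single tracked state.
            states = simulate(then_arm, states) | simulate(else_arm, states)
        else:
            states = {fsm(st, s) for st in states}
    return states

good = ["open", ("branch", ["nop"], ["nop"]), "print", "close"]
bad = ["open", ("branch", ["close"], ["nop"]), "print"]
print(simulate(good, {"closed"}))   # {'closed'}: no violation possible
print(simulate(bad, {"closed"}))    # contains 'error': print may hit a closed file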
EXE: Automatically generating inputs of death
 In Proceedings of the 13th ACM Conference on Computer and Communications Security (CCS
, 2006
"... This article presents EXE, an effective bugfinding tool that automatically generates inputs that crash real code. Instead of running code on manually or randomly constructed input, EXE runs it on symbolic input initially allowed to be anything. As checked code runs, EXE tracks the constraints on ea ..."
Abstract

Cited by 226 (14 self)
 Add to MetaCart
This article presents EXE, an effective bug-finding tool that automatically generates inputs that crash real code. Instead of running code on manually or randomly constructed input, EXE runs it on symbolic input initially allowed to be anything. As checked code runs, EXE tracks the constraints on each symbolic (i.e., input-derived) memory location. If a statement uses a symbolic value, EXE does not run it, but instead adds it as an input constraint; all other statements run as usual. If code conditionally checks a symbolic expression, EXE forks execution, constraining the expression to be true on the true branch and false on the other. Because EXE reasons about all possible values on a path, it has much more power than a traditional runtime tool: (1) it can force execution down any feasible program path and (2) at dangerous operations (e.g., a pointer dereference), it detects if the current path constraints allow any value that causes a bug. When a path terminates or hits a bug, EXE automatically generates a test case by solving the current path constraints to find concrete values using its own co-designed constraint solver, STP. Because EXE’s constraints have no approximations, feeding this concrete input to an uninstrumented version of the checked code will cause it to follow the same path and hit the same bug (assuming deterministic code).
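A minimal sketch of the execution model the abstract describes: fork at a branch on a symbolic condition, and at a dangerous operation solve the accumulated path constraints for a concrete crashing input. The toy program, the brute-force solver standing in for STP, and all names are hypothetical.

def solve(constraints):
    # Stand-in for a constraint solver: enumerate an 8-bit input domain and
    # return a value satisfying every constraint collected on this path.
    for v in range(256):
        if all(c(v) for c in constraints):
            return v
    return None

def checked_code(path):
    # Toy checked code over one symbolic 8-bit input x, written against
    # predicates instead of a concrete value. `path` fixes which arm of
    # the single symbolic branch this forked execution explores.
    constraints, crashes = [], []
    # if (x > 100) { buf[x - 98] = 1; }   with a 4-element buffer
    if path == "then":
        constraints.append(lambda x: x > 100)
        # Dangerous operation: ask whether any input on this path pushes
        # the index out of bounds; if so, solve for a concrete test case.
        witness = solve(constraints + [lambda x: not (0 <= x - 98 < 4)])
        if witness is not None:
            crashes.append(witness)
    else:
        constraints.append(lambda x: x <= 100)
    return crashes

# "Fork" at the symbolic branch by exploring both arms.
for path in ("then", "else"):
    for x in checked_code(path):
        print("crashing input: x =", x)     # prints: crashing input: x = 102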
Validity Checking for Combinations of Theories with Equality
, 1996
"... . An essential component in many verification methods is a fast decision procedure for validating logical expressions. This paper presents the algorithm used in the Stanford Validity Checker (SVC) which has been used to aid several realistic hardware verification efforts. The logic for this decision ..."
Abstract

Cited by 151 (23 self)
 Add to MetaCart
An essential component in many verification methods is a fast decision procedure for validating logical expressions. This paper presents the algorithm used in the Stanford Validity Checker (SVC), which has been used to aid several realistic hardware verification efforts. The logic for this decision procedure includes Boolean and uninterpreted functions and linear arithmetic. We have also successfully incorporated other interpreted functions, such as array operations and linear inequalities. The primary techniques which allow a complete and efficient implementation are expression sharing, heuristic rewriting, and congruence closure with interpreted functions. We discuss these techniques and present the results of initial experiments in which SVC is used as a decision procedure in PVS, resulting in dramatic speedups. Decision procedures are emerging as a central component of formal verification systems. Such a procedure can be included as a component of a general-purpos...
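A minimal sketch of the congruence-closure ingredient mentioned above, restricted to ground terms over uninterpreted functions; the term encoding is hypothetical and far simpler than SVC's implementation.

class CongruenceClosure:
    # Ground terms: strings are constants, tuples ("f", arg1, ...) are
    # applications of uninterpreted functions.
    def __init__(self):
        self.parent = {}

    def find(self, t):
        self.parent.setdefault(t, t)
        while self.parent[t] != t:
            self.parent[t] = self.parent[self.parent[t]]    # path halving
            t = self.parent[t]
        return t

    def canon(self, t):
        # Canonical form: canonicalize arguments first, then look the
        # resulting application up in the union-find structure.
        if isinstance(t, tuple):
            t = (t[0],) + tuple(self.canon(a) for a in t[1:])
        return self.find(t)

    def merge(self, s, t):
        # Assert s = t. (A full congruence closure would also re-propagate
        # congruences when application terms are merged; canonicalizing at
        # query time is enough for the example below.)
        rs, rt = self.canon(s), self.canon(t)
        if rs != rt:
            self.parent[rs] = rt

    def equal(self, s, t):
        return self.canon(s) == self.canon(t)

cc = CongruenceClosure()
cc.merge("a", "b")                            # assert a = b
print(cc.equal(("f", "a"), ("f", "b")))       # True: f(a) = f(b) by congruence
print(cc.equal(("f", "a"), ("g", "a")))       # False: f and g are unrelated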
A decision procedure for bit-vectors and arrays
 In Computer Aided Verification, number 4590 in LNCS
, 2007
"... Abstract. STP is a decision procedure for the satisfiability of quantifierfree formulas in the theory of bitvectors and arrays that has been optimized for large problems encountered in software analysis applications. The basic architecture of the procedure consists of wordlevel preprocessing alg ..."
Abstract

Cited by 128 (7 self)
 Add to MetaCart
STP is a decision procedure for the satisfiability of quantifier-free formulas in the theory of bit-vectors and arrays that has been optimized for large problems encountered in software analysis applications. The basic architecture of the procedure consists of word-level preprocessing algorithms followed by translation to SAT. The primary bottlenecks in software verification and bug finding applications are large arrays and linear bit-vector arithmetic. New algorithms based on the abstraction-refinement paradigm are presented for reasoning about large arrays. A solver for bit-vector linear arithmetic is presented that eliminates variables and parts of variables to enable other transformations, and reduce the size of the problem that is eventually received by the SAT solver. These and other algorithms have been implemented in STP, which has been heavily tested over thousands of examples obtained from several real-world applications. Experimental results indicate that the above mix of algorithms, along with the overall architecture, is far more effective, for a variety of applications, than a direct translation of the original formula to SAT or other comparable decision procedures.
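A minimal sketch of the abstraction-refinement treatment of arrays described above: array reads start out as unconstrained fresh variables, and the read-congruence axiom is added only when a candidate model violates it. The tiny query and the brute-force model finder standing in for the SAT back end are hypothetical.

from itertools import product

# Query over 2-bit values: i = j AND read(a, i) != read(a, j).
# Abstraction: the reads become fresh variables r1 and r2, and the array
# axioms relating them are left out of the formula entirely at first.
VARS = ("i", "j", "r1", "r2")

def find_model(constraints):
    # Stand-in for the SAT back end: brute-force search over 2-bit values.
    for vals in product(range(4), repeat=len(VARS)):
        model = dict(zip(VARS, vals))
        if all(c(model) for c in constraints):
            return model
    return None

constraints = [lambda m: m["i"] == m["j"], lambda m: m["r1"] != m["r2"]]
while True:
    model = find_model(constraints)
    if model is None:
        print("unsatisfiable")                # reached after one refinement
        break
    if model["i"] == model["j"] and model["r1"] != model["r2"]:
        # The candidate model violates the read-congruence axiom that the
        # abstraction dropped; add just that axiom and solve again.
        constraints.append(lambda m: m["i"] != m["j"] or m["r1"] == m["r2"])
        continue
    print("satisfiable:", model)
    break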
Automated Deduction by Theory Resolution
 Journal of Automated Reasoning
, 1985
"... Theory resolution constitutes a set of complete procedures for incorporating theories into a resolution theoremproving program, thereby making it unnecessary to resolve directly upon axioms of the theory. This can greatly reduce the length of proofs and the size of the search space. Theory resoluti ..."
Abstract

Cited by 121 (1 self)
 Add to MetaCart
Theory resolution constitutes a set of complete procedures for incorporating theories into a resolution theorem-proving program, thereby making it unnecessary to resolve directly upon axioms of the theory. This can greatly reduce the length of proofs and the size of the search space. Theory resolution effects a beneficial division of labor, improving the performance of the theorem prover and increasing the applicability of the specialized reasoning procedures. Total theory resolution utilizes a decision procedure that is capable of determining unsatisfiability of any set of clauses using predicates in the theory. Partial theory resolution employs a weaker decision procedure that can determine potential unsatisfiability of sets of literals. Applications include the building in of both mathematical and special decision procedures, e.g., for the taxonomic information furnished by a knowledge representation system. Theory resolution is a generalization of numerous previously known resolution refinements. Its power is demonstrated by comparing solutions of "Schubert's Steamroller" challenge problem with and without building in axioms through theory resolution.
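A minimal sketch of the division of labor described above, in the ground case: the taxonomy lives inside a decision procedure rather than in clauses. Instead of building resolvents, this toy checks unsatisfiability directly via the criterion that every choice of one literal per clause must be theory-refutable, so it illustrates the idea rather than the resolution calculus itself; the taxonomy, encoding, and names are hypothetical.

from itertools import product

# Theory: a fixed taxonomy, consulted through a decision procedure rather
# than written out as clauses (dog(x) -> mammal(x) -> animal(x)).
SUBSUMES = {"dog": {"dog", "mammal", "animal"},
            "mammal": {"mammal", "animal"},
            "animal": {"animal"}}

def theory_unsat(literals):
    # Literals are (positive?, predicate, constant). The set is refuted if
    # it contains P(c) and not-Q(c) where the taxonomy entails P(x) -> Q(x).
    return any(pos1 and not pos2 and c1 == c2 and p2 in SUBSUMES.get(p1, ())
               for (pos1, p1, c1) in literals
               for (pos2, p2, c2) in literals)

def refute(clauses):
    # Ground case: the clause set is unsatisfiable modulo the theory exactly
    # when every way of choosing one literal per clause is theory-refutable.
    return all(theory_unsat(picks) for picks in product(*clauses))

# dog(fido) together with not animal(fido): no ordinary resolution step
# applies (different predicates), but the decision procedure sees the clash
# without any taxonomy axiom appearing as a clause.
clauses = [[(True, "dog", "fido")], [(False, "animal", "fido")]]
print(refute(clauses))                        # True: refutation found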
CVC: a Cooperating Validity Checker
 In 14th International Conference on Computer-Aided Verification
, 2002
"... Decision procedures for decidable logics and logical theories have proven to be useful tools in verification. This paper describes the CVC ("Cooperating Validity Checker") decision procedure. CVC implements a framework for combining subsidiary decision procedures for certain logical theories into a ..."
Abstract

Cited by 113 (18 self)
 Add to MetaCart
Decision procedures for decidable logics and logical theories have proven to be useful tools in verification. This paper describes the CVC ("Cooperating Validity Checker") decision procedure. CVC implements a framework for combining subsidiary decision procedures for certain logical theories into a decision procedure for the theories' union. Subsidiary decision procedures for theories of arrays, inductive datatypes, and linear real arithmetic are currently implemented. Other notable features of CVC are the incorporation of the high-performance Chaff solver for propositional reasoning, and the ability to produce independently checkable proofs for valid formulas.
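A minimal sketch of the cooperation described above, with two toy procedures (ordering constraints, and equality with one uninterpreted function) exchanging entailed equalities over shared variables until one of them detects a contradiction. The solver interfaces and the example formula are hypothetical and far simpler than CVC's framework (arrays, datatypes, linear real arithmetic, SAT, proof production).

# Shared formula (unsatisfiable): x <= y, y <= x, f(x) != f(y)

class OrderSolver:
    def __init__(self, leqs):
        self.leqs = set(leqs)               # pairs (a, b) meaning a <= b
    def entailed_equalities(self):
        # a <= b together with b <= a entails a = b.
        return {(a, b) for (a, b) in self.leqs if (b, a) in self.leqs and a < b}
    def add_equality(self, a, b):
        self.leqs |= {(a, b), (b, a)}
    def inconsistent(self):
        return False                        # non-strict orderings alone never clash

class EqualitySolver:
    def __init__(self, diseqs):
        self.eqs = set()                    # asserted variable equalities
        self.diseqs = set(diseqs)           # pairs of terms asserted unequal
    def entailed_equalities(self):
        return set()                        # nothing to export in this toy
    def add_equality(self, a, b):
        self.eqs |= {(a, b), (b, a)}
    def inconsistent(self):
        # One-step congruence: x = y forces f(x) = f(y), which clashes with
        # a recorded disequality f(x) != f(y).
        return any(s[0] == t[0] and (s[1], t[1]) in self.eqs
                   for (s, t) in self.diseqs)

def combine(solvers):
    # Propagation loop: broadcast each solver's entailed equalities to the
    # others until a fixpoint is reached or someone reports a contradiction.
    seen = set()
    while True:
        if any(s.inconsistent() for s in solvers):
            return "unsat"
        new = set().union(*(s.entailed_equalities() for s in solvers)) - seen
        if not new:
            return "no contradiction found"
        for (a, b) in new:
            for s in solvers:
                s.add_equality(a, b)
        seen |= new

arith = OrderSolver([("x", "y"), ("y", "x")])
euf = EqualitySolver([(("f", "x"), ("f", "y"))])
print(combine([arith, euf]))                # unsat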
Temporal Planning with Continuous Change
, 1994
"... We present zeno, a least commitment planner that handles actions occurring over extended intervals of time. Deadline goals, metric preconditions, metric effects, and continuous change are supported. Simultaneous actions are allowed when their effects do not interfere. Unlike most planners that deal ..."
Abstract

Cited by 109 (10 self)
 Add to MetaCart
We present zeno, a least commitment planner that handles actions occurring over extended intervals of time. Deadline goals, metric preconditions, metric effects, and continuous change are supported. Simultaneous actions are allowed when their effects do not interfere. Unlike most planners that deal with complex languages, the zeno planning algorithm is sound and complete. The running code is a complete implementation of the formal algorithm, capable of solving simple problems (i.e., those involving less than a dozen steps). We have built a least commitment planner, zeno, that handles actions occurring over extended intervals of time and whose preconditions and effects can be temporally quantified. These capabilities enable zeno to reason about deadline goals, piecewise-linear continuous change, external events and, to a limited extent, simultaneous actions. While other planners exist with some of these features, zeno is different because it is both sound and complete. As a...
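A small, hypothetical illustration of the non-interference condition mentioned above, using a plain read/write check over named state variables; zeno's actual condition is stated over temporally quantified, metric, and continuous effects, which this toy omits.

def interfere(a, b):
    # Each action is (reads, writes) as sets of state-variable names.
    # Two actions may overlap only if neither writes a variable the
    # other reads or writes.
    reads_a, writes_a = a
    reads_b, writes_b = b
    return bool(writes_a & (reads_b | writes_b) or writes_b & reads_a)

fly = ({"fuel", "at_plane"}, {"at_plane", "fuel"})
load = ({"at_cargo"}, {"in_plane"})
refuel = ({"fuel"}, {"fuel"})

print(interfere(fly, load))     # False: may be scheduled simultaneously
print(interfere(fly, refuel))   # True: both write (and read) "fuel"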
Integrating decision procedures into heuristic theorem provers: A case study of linear arithmetic
 Machine Intelligence
, 1988
"... We discuss the problem of incorporating into a heuristic theorem prover a decision procedure for a fragment of the logic. An obvious goal when incorporating such a procedure is to reduce the search space explored by the heuristic component of the system, as would be achieved by eliminating from the ..."
Abstract

Cited by 107 (9 self)
 Add to MetaCart
We discuss the problem of incorporating into a heuristic theorem prover a decision procedure for a fragment of the logic. An obvious goal when incorporating such a procedure is to reduce the search space explored by the heuristic component of the system, as would be achieved by eliminating from the system’s data base some explicitly stated axioms. For example, if a decision procedure for linear inequalities is added, one would hope to eliminate the explicit consideration of the transitivity axioms. However, the decision procedure must then be used in all the ways the eliminated axioms might have been. The difficulty of achieving this degree of integration is more dependent upon the complexity of the heuristic component than upon that of the decision procedure. The view of the decision procedure as a "black box" is frequently destroyed by the need to pass large amounts of search strategic information back and forth between the two components. Finally, the efficiency of the decision procedure may be virtually irrelevant; the efficiency of the final system may depend most heavily on how easy it is to communicate between the two components. This paper is a case study of how we integrated a linear arithmetic procedure into a heuristic theorem prover. By linear arithmetic here we mean the decidable subset of number theory dealing with universally quantified formulas composed of the logical connectives, the identity relation, the Peano "less than" relation, the Peano addition and subtraction functions, Peano constants, ...
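A minimal sketch of the kind of reasoning the eliminated transitivity axioms would otherwise perform, restricted to the difference-bound fragment (x + k <= y) and using a negative-cycle check as a hypothetical stand-in for the prover's actual linear arithmetic procedure.

def unsat(constraints):
    # constraints: list of (x, y, k) meaning x + k <= y over the integers.
    # Build the constraint graph (edge x -> y with weight -k); the
    # conjunction is unsatisfiable exactly when it has a negative cycle,
    # since that would imply v + c <= v for some c > 0.
    nodes = {v for (x, y, _) in constraints for v in (x, y)}
    dist = {v: 0 for v in nodes}
    for _ in range(len(nodes)):                      # Bellman-Ford rounds
        for (x, y, k) in constraints:
            dist[y] = min(dist[y], dist[x] - k)
    # If any edge can still be relaxed, a negative cycle exists.
    return any(dist[x] - k < dist[y] for (x, y, k) in constraints)

# a < b and b < c entail a < c: adding the negation c <= a makes the set
# unsatisfiable, so no explicit transitivity axiom is ever consulted.
print(unsat([("a", "b", 1), ("b", "c", 1)]))                  # False
print(unsat([("a", "b", 1), ("b", "c", 1), ("c", "a", 0)]))   # True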