Results 1–10 of 31
Module Checking, 1996
Abstract

Cited by 109 (12 self)
In computer system design, we distinguish between closed and open systems. A closed system is a system whose behavior is completely determined by the state of the system. An open system is a system that interacts with its environment and whose behavior depends on this interaction. The ability of temporal logics to describe an ongoing interaction of a reactive program with its environment makes them particularly appropriate for the specification of open systems. Nevertheless, model-checking algorithms used for the verification of closed systems are not appropriate for the verification of open systems. Correct model checking of open systems should check the system with respect to arbitrary environments and should take into account uncertainty regarding the environment. This is not the case with current model-checking algorithms and tools. In this paper we introduce and examine the problem of model checking of open systems (module checking, for short). We show that while module che...
Bisimulation and Model Checking. In Proc. Compositionality Workshop, LNCS 1536, 1999
Abstract

Cited by 75 (11 self)
State space minimization techniques are crucial for combating state explosion. A variety of verification tools use bisimulation minimization to check equivalence between systems, to minimize components before composition, or to reduce a state space prior to model checking. This paper explores the third use in the context of verifying invariant properties. We consider three bisimulation minimization algorithms. From each, we produce an on-the-fly model checker for invariant properties and compare this model checker to a conventional one based on backwards reachability. Our comparisons, both theoretical and experimental, lead us to conclude that bisimulation minimization does not appear to be viable in the context of invariance verification, because performing the minimization requires as many, if not more, computational resources as model checking the unminimized system through backwards reachability. Keywords: Bisimulation minimization, model checking, invariant properties, on-the-fly...
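The backwards-reachability baseline that the abstract compares against can be sketched directly: iterate the pre-image of the bad states to a fixpoint and test for intersection with the initial states. This is an illustrative sketch over explicit state sets, not the paper's implementation; `init`, `bad`, and `pre_image` are hypothetical names.

```python
def violates_invariant(init, bad, pre_image):
    """Invariant checking by backwards reachability: the invariant
    fails iff some initial state can reach the bad set, i.e. iff an
    initial state lies in the backwards-reachable closure of `bad`.
    `pre_image(S)` must return the one-step predecessors of S."""
    reached = set(bad)
    frontier = set(bad)
    while frontier:
        frontier = pre_image(frontier) - reached
        reached |= frontier
    return bool(reached & set(init))
```

On a transition relation given as a set of pairs, `pre_image` is just `lambda S: {u for (u, v) in trans if v in S}`.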
Symbolic Trajectory Evaluation. Formal Hardware Verification, 1996
Abstract

Cited by 26 (6 self)
The main problem with model checking is the state explosion problem: the state space grows exponentially with system size. Two methods have some popularity in attacking this problem: compositional methods and abstraction. While they cannot solve the problem in general, they do offer significant improvements in performance. The direct method of verifying that a circuit has a property f is to show the model M satisfies f. The idea behind abstraction is that instead of verifying property f of model M, we verify property f^A of model M^A and the answer we get helps us answer the original problem. The system M^A is an abstraction of the system M. One possibility is to build an abstraction M^A that is equivalent (e.g. bisimilar [48]) to M. This sometimes leads to performance advantages if the state space of M^A is smaller than that of M. This type of abstraction would more likely be used in model comparison (e.g. as in [38]). Typically, the behaviour of an abstraction is not equivalent...
An Automata-Theoretic Approach to Modular Model Checking, 1998
Abstract

Cited by 22 (0 self)
In this paper we consider assume-guarantee specifications in which the guarantee is specified by branching temporal formulas. We distinguish between two approaches. In the first approach, the assumption is specified by branching temporal formulas too. In the second approach, the assumption is specified in linear temporal logic. We consider guarantees in ∀CTL and ∀CTL*...
Simulation Based Minimization. Proc. 17th Int'l Conference on Automated Deduction (CADE'00), volume 1831 of LNCS, 2000
Abstract

Cited by 16 (2 self)
This work presents a minimization algorithm. The algorithm receives a Kripke structure M and returns the smallest structure that is simulation equivalent to M. The simulation equivalence relation is weaker than bisimulation but stronger than the simulation preorder. It strongly preserves ACTL and LTL (as sublogics of ACTL*). We show that every structure M has a unique (up to isomorphism) reduced structure that is simulation equivalent to M and smallest in size. We give a Minimizing Algorithm that constructs the reduced structure. It first constructs the quotient structure for M, then eliminates transitions to little brothers, and finally deletes unreachable states. The first step has maximal space requirements since it is based on the simulation preorder over M. To reduce these requirements we suggest the Partitioning Algorithm, which constructs the quotient structure for M without ever building the simulation preorder. The Partitioning Algorithm has a better space com...
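The three steps of the Minimizing Algorithm can be sketched on an explicit structure. This is a sketch under simplifying assumptions (the simulation preorder `sim` is given as a set of pairs `(s, t)` meaning "t simulates s" and is reflexive; state labels are omitted); the names are illustrative, not the paper's code.

```python
def minimize(states, trans, init, sim):
    """Minimization sketch: (1) quotient by simulation equivalence,
    (2) eliminate transitions to little brothers, (3) delete
    unreachable states. `sim` is a (reflexive) simulation preorder."""
    # Step 1: collapse simulation-equivalent states (states that
    # simulate each other) to a canonical representative.
    def rep(s):
        return min(t for t in states if (s, t) in sim and (t, s) in sim)
    q_trans = {(rep(a), rep(b)) for (a, b) in trans}
    q_init = {rep(s) for s in init}
    # Step 2: a successor u of s is a "little brother" if another
    # successor v of s strictly simulates it; such edges are redundant.
    def little(u, v):
        return u != v and (u, v) in sim and (v, u) not in sim
    q_trans = {(s, u) for (s, u) in q_trans
               if not any(little(u, v) for (s2, v) in q_trans if s2 == s)}
    # Step 3: keep only states reachable from the initial states.
    reach = set(q_init)
    frontier = set(q_init)
    while frontier:
        frontier = {v for (u, v) in q_trans if u in frontier} - reach
        reach |= frontier
    return reach, {(u, v) for (u, v) in q_trans if u in reach}, q_init
```

The space issue the abstract raises is visible here: step 1 assumes the full preorder `sim` is materialized, which is exactly what the Partitioning Algorithm avoids.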
Exploiting Power-up Delay for Sequential Optimization. In Proc. European Design Automation Conf. (Brighton, Great Britain), 1995
Abstract

Cited by 10 (4 self)
Recent work has identified the notion of safe replacement for sequential synchronous designs that may not have reset hardware or even explicitly known initial states. Safe replacement requires that a replacement design be indistinguishable from the original from the very first clock cycle after power-up. However, in almost any realistic application, the design is allowed to stabilize for many clock cycles before it is used. In this paper, we investigate the safety of a replacement if the replacement design is allowed to be clocked some cycles (that is, delayed) with arbitrary inputs before the design is reset. Having argued the safety of “delay” replacements, we investigate a new method of sequential optimization based upon the notion. We present experimental results to demonstrate that significant area optimizations can be gained by using this new notion of delay replaceability, and that there is a tradeoff between the allowed number of clock cycles after power-up and the amount of optimization that can be obtained.
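The effect of power-up delay can be illustrated on an explicit FSM: starting from the full state set, the set of states the design may occupy after k cycles of arbitrary inputs is non-increasing in k, and delay replaceability only requires matching behavior from that stabilized set. A minimal sketch with hypothetical names (`step` is the next-state function):

```python
def states_after_delay(states, inputs, step, k):
    """All states a design can be in after k clock cycles of
    arbitrary inputs, starting from any power-up state (no reset
    hardware assumed). Starting from the full state set, the
    result shrinks (or stays the same) as k grows."""
    cur = set(states)
    for _ in range(k):
        cur = {step(s, i) for s in cur for i in inputs}
    return cur
```

A delay replacement then only has to be indistinguishable from the original when started in these stabilized states, which leaves more freedom for optimization than strict safe replacement.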
Efficient Formal Design Verification: Data Structure + Algorithms, 1994
Abstract

Cited by 10 (8 self)
We describe a data structure and a set of BDD-based algorithms for efficient formal design verification. We argue that hardware designs should be translated into an intermediate hierarchical netlist of combinational tables and sequential elements, and internally represented by a flattened network of gates and latches, akin to that in SIS [32]. We establish that the core computation in BDD-based formal design verification is forming the image and pre-image of sets of states under the transition relation characterizing the design. To make this step efficient, we address BDD variable ordering, use of partitioned transition relations, use of clustering, use of don't cares, and redundant latch removal. Many of these techniques have been studied in the past. We provide a complete integrated set of modified algorithms and give references and comparisons with previous work. We report experimental results on a series of seven industrial examples containing from 28 to 172 binary-valued latches. ...
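The image and pre-image computations named as the core step are easy to state over an explicit transition relation; the paper's contribution is performing them efficiently with BDDs. A semantics-only sketch (names are illustrative):

```python
def image(states, trans_rel):
    """Successor states of `states` under a transition relation
    given as a set of (source, target) pairs."""
    return {t for (s, t) in trans_rel if s in states}

def pre_image(states, trans_rel):
    """Predecessor states of `states` under the relation."""
    return {s for (s, t) in trans_rel if t in states}

def reachable(init, trans_rel):
    """Forward reachability: least fixpoint of repeated images."""
    reach, frontier = set(init), set(init)
    while frontier:
        frontier = image(frontier, trans_rel) - reach
        reach |= frontier
    return reach
```

With BDDs, the same operations become relational products over boolean encodings of the state space, which is where variable ordering, partitioned relations, and clustering matter.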
Selective mu-calculus: New Modal Operators for Proving Properties on Reduced Transition Systems. In Proceedings of FORTE X/PSTV XVII '97. Chapman, 1997
Abstract

Cited by 9 (8 self)
In model checking for temporal logic, the correctness of a (concurrent) system with respect to a desired behavior is verified by checking whether a structure that models the system satisfies a formula describing the behaviour. Most existing verification techniques, and in particular those defined for concurrent calculi such as CCS, are based on a representation of the concurrent system by means of a labelled transition system. In this approach to verification, state explosion is one of the most serious problems. In this paper we present a new temporal logic, the selective mu-calculus, with the property that only the actions occurring in a formula are relevant to check the formula itself. We prove that the selective mu-calculus is as powerful as the mu-calculus. We define the notion of ρ-bisimulation between transition systems: given a set of actions ρ, a transition system ρ-bisimulates another one if they have the same behaviour with respect to the actions in ρ. We prove that, if t...
Model Checking with Formula-Dependent Abstract Models. In Computer-Aided Verification (CAV), volume 2102 of LNCS, 2001
Abstract

Cited by 6 (0 self)
We present a model checking algorithm for ∀CTL (and full CTL) which uses an iterative abstraction refinement strategy. In each iteration we call a standard model checker for the abstract model A_i. If A_i does not satisfy Φ, we refine the abstract model A_i, yielding another abstract model A_{i+1}, and (re)call the model checker on A_{i+1}. Otherwise the formula holds for the original system M. Our algorithm terminates at least for all transition systems M that have a finite simulation or bisimulation quotient. In contrast to other abstraction refinement algorithms, we always work with abstract models whose size depends only on the length of the formula Φ (but not on the size of the system, which might be infinite).
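The iteration described in the abstract can be sketched as a loop skeleton. Everything here is a placeholder interface, not an API from the paper: `abstract` builds A_0 from the system and the formula, `model_check` is the standard checker, and `refine` produces A_{i+1}; termination is only guaranteed under conditions like those the abstract states (e.g. a finite (bi)simulation quotient).

```python
def check_with_refinement(system, phi, abstract, model_check, refine):
    """Iterative abstraction refinement skeleton: check the abstract
    model; on success conclude phi holds of the concrete system,
    otherwise refine and retry until refinement makes no progress."""
    A = abstract(system, phi)
    while True:
        if model_check(A, phi):
            return True            # phi holds on the original system
        A_next = refine(A)
        if A_next == A:            # no further refinement possible
            return False
        A = A_next
```

The key property carried by the concrete algorithm, per the abstract, is that every A_i has size bounded by the length of Φ rather than by the (possibly infinite) system.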
Modular minimization of deterministic finite-state machines. In 6th International Workshop on Formal Methods for Industrial Critical Systems, 2001
Abstract

Cited by 5 (1 self)
This work presents a modular technique for minimizing a deterministic finite-state machine (FSM) while preserving its equivalence to the original system. Being modular, the minimization technique should consume less time and space. Preserving equivalence, the resulting minimized model can be employed in both temporal logic model checking and sequential equivalence checking, thus reducing their time and space consumption. As deterministic FSMs are commonly used for modeling hardware designs, our approach is suitable for formal verification of such designs. We develop a new BDD framework for the representation and manipulation of functions and show how our minimization technique can be effectively implemented within this framework.

1 Introduction

Due to the fast development of the hardware and software industry, there is a growing need for formal verification tools and techniques. Two widely used formal verification methods are temporal logic model checking and sequential equivalence checking. Temporal logic model checking is a method for verifying finite-state systems with respect to propositional temporal logic specifications. In sequential equivalence checking, two sequential hardware designs are compared for language equivalence, meaning that for every sequence of inputs, the two designs produce the same sequence of outputs. Both model checking and equivalence checking are fully automatic. However, they both suffer from the state explosion problem; that is to say, their space requirements are high and limit their applicability to large systems. Many approaches for overcoming the state explosion problem have been suggested, including abstraction, partial order reduction, modular verification methods, and symmetry [5]. All are aimed at reducing the size of the model to which formal verification methods are applied, thus extending their applicability to larger systems.
When reduction methods are applied, the verification technique has to be able to deduce properties of the system by verifying the reduced model. We therefore require the result of the reduction to be equivalent to the original model.