Results 1 - 10 of 777
Compositional Model Checking
, 1999
"... We describe a method for reducing the complexity of temporal logic model checking in systems composed of many parallel processes. The goal is to check properties of the components of a system and then deduce global properties from these local properties. The main difficulty with this type of approac ..."
Abstract - Cited by 3252 (70 self)
We describe a method for reducing the complexity of temporal logic model checking in systems composed of many parallel processes. The goal is to check properties of the components of a system and then deduce global properties from these local properties. The main difficulty with this type of approach is that local properties are often not preserved at the global level. We present a general framework for using additional interface processes to model the environment for a component. These interface processes are typically much simpler than the full environment of the component. By composing a component with its interface processes and then checking properties of this composition, we can guarantee that these properties will be preserved at the global level. We give two example compositional systems based on the logic CTL*.
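For orientation only, the shape of the guarantee can be pictured as an inference rule of the following schematic form (a paraphrase, not the paper's formal statement; $M_1, M_2$ stand for component processes, $I_2$ for an interface process standing in for $M_2$'s behaviour towards $M_1$, and $\varphi$ for a temporal-logic property):

\[
\frac{M_1 \parallel I_2 \models \varphi \qquad I_2 \text{ is a sound interface for } M_2}{M_1 \parallel M_2 \models \varphi}
\]

The point is that checking $M_1 \parallel I_2$ is typically far cheaper than checking the full composition $M_1 \parallel M_2$.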
A calculus of mobile processes, I
, 1992
"... We present the a-calculus, a calculus of communicating systems in which one can naturally express processes which have changing structure. Not only may the component agents of a system be arbitrarily linked, but a communication between neighbours may carry information which changes that linkage. The ..."
Abstract - Cited by 1184 (31 self)
We present the π-calculus, a calculus of communicating systems in which one can naturally express processes which have changing structure. Not only may the component agents of a system be arbitrarily linked, but a communication between neighbours may carry information which changes that linkage. The calculus is an extension of the process algebra CCS, following work by Engberg and Nielsen, who added mobility to CCS while preserving its algebraic properties. The π-calculus gains simplicity by removing all distinction between variables and constants; communication links are identified by names, and computation is represented purely as the communication of names across links. After an illustrated description of how the π-calculus generalises conventional process algebras in treating mobility, several examples exploiting mobility are given in some detail. The important examples are the encoding into the π-calculus of higher-order functions (the λ-calculus and combinatory algebra), the transmission of processes as values, and the representation of data structures as processes. The paper continues by presenting the algebraic theory of strong bisimilarity and strong equivalence, including a new notion of equivalence indexed by distinctions, i.e., assumptions of inequality among names. These theories are based upon a semantics in terms of a labeled transition system and a notion of strong bisimulation, both of which are expounded in detail in a companion paper. We also report briefly on work in progress based upon the corresponding notion of weak bisimulation, in which internal actions cannot be observed.
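The mobility mechanism can be illustrated with the standard communication reduction (written here in now-standard π-calculus notation, which may differ in detail from the paper's original syntax): a sender emitting the name $z$ on link $x$ synchronises with a receiver on $x$, which continues with $z$ substituted for its bound name:

\[
\bar{x}\langle z\rangle.P \;\mid\; x(y).Q \;\longrightarrow\; P \;\mid\; Q\{z/y\}
\]

Since the received name $z$ may itself be used by $Q$ as a communication link, a single communication can change the linkage between agents.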
Bisimulation through probabilistic testing
- in “Conference Record of the 16th ACM Symposium on Principles of Programming Languages (POPL)”
, 1989
"... We propose a language for testing concurrent processes and examine its strength in terms of the processes that are distinguished by a test. By using probabilistic transition systems as the underlying semantic model, we show how a testing algorithm can distinguish, with a probability arbitrarily clos ..."
Abstract - Cited by 529 (14 self)
We propose a language for testing concurrent processes and examine its strength in terms of the processes that are distinguished by a test. By using probabilistic transition systems as the underlying semantic model, we show how a testing algorithm can distinguish, with a probability arbitrarily close to one, between processes that are not bisimulation equivalent. We also show a similar result (in a slightly stronger form) for a new process relation called 2/3-bisimulation, which lies strictly between that of simulation and bisimulation. Finally, the ultimate strength of the testing language is shown to identify a new process relation called probabilistic bisimulation, which is strictly stronger than bisimulation. © 1991 Academic Press, Inc.
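As a toy illustration of the testing idea (a Monte-Carlo sketch only, not the paper's formalism or algorithm; the encoding of processes, the example processes P = a.(b + c) and Q = a.b + a.c with Q's choice resolved uniformly, and all function names are assumptions made for this example), the following Python snippet shows how repeating a simple button-pressing test separates two processes that ordinary traces do not:

    import random

    # Toy reactive probabilistic processes: for each state and action ("button"),
    # either a distribution over successor states or no entry if the button is
    # refused.  This encoding is an assumption made for the sketch.

    # P = a.(b + c): after 'a', both 'b' and 'c' are enabled.
    P = {
        "p0": {"a": [("p1", 1.0)]},
        "p1": {"b": [("done", 1.0)], "c": [("done", 1.0)]},
        "done": {},
    }

    # Q = a.b + a.c, with the choice resolved internally with probability 1/2 each:
    # after 'a', only one of 'b' and 'c' is enabled.
    Q = {
        "q0": {"a": [("qb", 0.5), ("qc", 0.5)]},
        "qb": {"b": [("done", 1.0)]},
        "qc": {"c": [("done", 1.0)]},
        "done": {},
    }

    def press(proc, state, action):
        """Press a button: return a successor state, or None if the button is refused."""
        dist = proc[state].get(action)
        if dist is None:
            return None
        states, weights = zip(*dist)
        return random.choices(states, weights=weights)[0]

    def run_test(proc, start, buttons):
        """Run a sequence of button presses; the test succeeds iff none is refused."""
        state = start
        for action in buttons:
            state = press(proc, state, action)
            if state is None:
                return False
        return True

    def success_rate(proc, start, buttons, n=10_000):
        return sum(run_test(proc, start, buttons) for _ in range(n)) / n

    # The test "press a, then press b" succeeds with probability 1 on P but only
    # about 1/2 on Q, so repeated runs tell the two apart with probability
    # arbitrarily close to one.
    print("P:", success_rate(P, "p0", ["a", "b"]))
    print("Q:", success_rate(Q, "q0", ["a", "b"]))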
Introduction to the ISO specification language Lotos
- Computer Networks
, 1988
"... ..."
(Show Context)
Process algebra for synchronous communication
- Inform. and Control
, 1984
"... Within the context of an algebraic theory of processes, an equational specification of process cooperation is provided. Four cases are considered: free merge or interleaving, merging with communication, merging with mutual exclusion of tight regions, and synchronous process cooperation. The rewrite ..."
Abstract - Cited by 426 (68 self)
Within the context of an algebraic theory of processes, an equational specification of process cooperation is provided. Four cases are considered: free merge or interleaving, merging with communication, merging with mutual exclusion of tight regions, and synchronous process cooperation. The rewrite system behind the communication algebra is shown to be confluent and terminating (modulo its permutative reductions). Further, some relationships are shown to hold between the four concepts of merging. © 1984 Academic Press, Inc.
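In later, now-standard ACP-style presentations (the notation below follows those later presentations and is not necessarily the paper's own), merging with communication is specified equationally via an auxiliary left merge and communication merge:

\[
x \parallel y \;=\; (x \mathbin{\lfloor\!\lfloor} y) \;+\; (y \mathbin{\lfloor\!\lfloor} x) \;+\; (x \mid y)
\]

so that an interleaving step comes either from the left argument, from the right argument, or from a synchronised communication of both.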
Universal coalgebra: a theory of systems
, 2000
"... In the semantics of programming, nite data types such as finite lists, have traditionally been modelled by initial algebras. Later final coalgebras were used in order to deal with in finite data types. Coalgebras, which are the dual of algebras, turned out to be suited, moreover, as models for certa ..."
Abstract - Cited by 408 (42 self)
In the semantics of programming, finite data types, such as finite lists, have traditionally been modelled by initial algebras. Later, final coalgebras were used in order to deal with infinite data types. Coalgebras, which are the dual of algebras, turned out to be suited, moreover, as models for certain types of automata and, more generally, for (transition and dynamical) systems. An important property of initial algebras is that they satisfy the familiar principle of induction. Such a principle was missing for coalgebras until the work of Aczel (Non-Well-Founded Sets, CSLI Lecture Notes, Vol. 14, Center for the Study of Language and Information, Stanford, 1988) on a theory of non-wellfounded sets, in which he introduced a proof principle nowadays called coinduction. It was formulated in terms of bisimulation, a notion originally stemming from the world of concurrent programming languages. Using the notion of coalgebra homomorphism, the definition of bisimulation on coalgebras can be shown to be formally dual to that of congruence on algebras. Thus, the three basic notions of universal algebra: algebra, homomorphism of algebras, and congruence, turn out to correspond to coalgebra, homomorphism of coalgebras, and bisimulation, respectively. In this paper, the latter are taken ...
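In the standard formulation used in this line of work (summarised here for orientation; $F$ is a functor on sets), the duality can be stated compactly: an $F$-algebra is a set $A$ with a map $\alpha : F(A) \to A$, while an $F$-coalgebra is a set $S$ with a map $\alpha : S \to F(S)$. A function $f : S \to T$ is a homomorphism of coalgebras $(S, \alpha_S) \to (T, \alpha_T)$ when

\[
\alpha_T \circ f \;=\; F(f) \circ \alpha_S ,
\]

and a bisimulation between two coalgebras is a relation $R \subseteq S \times T$ that itself carries a coalgebra structure making both projections homomorphisms.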
The Linear Time-Branching Time Spectrum II - The semantics of sequential systems with silent moves
, 1993
"... ion Rule (KFAR) (Baeten, Bergstra & Klop [3]), expresses a global fairness assumption. It says that when possible a system will escape from any cycle of internal actions. Some form of KFAR is crucial for many protocal verifications with unreliable channels, and for that reason preorders and equi ..."
Abstract - Cited by 375 (21 self)
Koomen's Fair Abstraction Rule (KFAR) (Baeten, Bergstra & Klop [3]) expresses a global fairness assumption. It says that when possible a system will escape from any cycle of internal actions. Some form of KFAR is crucial for many protocol verifications with unreliable channels, and for that reason preorders and equivalences that satisfy KFAR are of special interest. Must preorders and divergence-sensitive ones cannot satisfy KFAR. In Bergstra, Klop & Olderog [7] it is shown that the combination of KFAR with failure semantics is inconsistent, but they formulate a weaker version of KFAR that is satisfied in failure may-semantics. Still the combination of KFAR^- and the liveness requirement appears to require global testing, and is only satisfied in the semantics between contrasimulation (C) and stability-respecting branching bisimulation (BB_s). These requirements would reduce the number of suitable preorders to 18. It is in general a good strategy to do your verifications using the finest preorde...
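For orientation, one common ACP-style formulation of the rule (KFAR$_1$; the precise variants compared in the paper may differ in detail) reads: if $i$ is an internal action from the abstracted set $I$, then

\[
x = i \cdot x + y \;\;\Longrightarrow\;\; \tau \cdot \tau_I(x) = \tau \cdot \tau_I(y),
\]

i.e. a process that could forever repeat the internal step $i$ is, under the fairness assumption, guaranteed to exit eventually via $y$.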
Reactive, Generative and Stratified Models of Probabilistic Processes
- Information and Computation
, 1990
"... ion Let E; E 0 be PCCS expressions. The inter-model abstraction rule IMARGR is defined by E ff[p] \Gamma\Gamma! i E 0 =) E ff[p= G (E;fffg)] ae \Gamma\Gamma\Gamma\Gamma\Gamma\Gamma! i E 0 This rule uses the generative normalization function to convert generative probabilities to reactive ..."
Abstract - Cited by 195 (8 self)
Let $E, E'$ be PCCS expressions. The inter-model abstraction rule $\mathrm{IMAR}_{GR}$ is defined by
\[
E \xrightarrow{\;\alpha[p]\;} E' \;\Longrightarrow\; E \xrightarrow{\;\alpha[p/\nu_G(E,\{\alpha\})]\;} E'
\]
This rule uses the generative normalization function to convert generative probabilities to reactive ones, thereby abstracting away from the relative probabilities between different actions. We can now define $\varphi_{GR}(\varphi_G(P))$ as the reactive transition system that can be inferred from $P$'s generative transition system via $\mathrm{IMAR}_{GR}$. By the same procedure as described at the end of Section 3.1, $\varphi_{GR}$ can be extended to a mapping from the class of generative transition systems to the class of reactive ones. Write $P \sim_{GR} Q$ if $P, Q \in \mathit{Pr}$ are reactive bisimulation equivalent with respect to the transitions derivable from $G + \mathrm{IMAR}_{GR}$, i.e. the theory obtained by adding $\mathrm{IMAR}_{GR}$ to the rules of Figure 7. The equivalence $\sim_{GR}$ is defined just like $\sim_R$ but using the cPDF $\mu_{GR}$ instead of $\mu_R$. $\mu_{GR}$ is defined by $\mu_{GR}(P, \alpha, S) = \sum_{i \in I_R (= I_G)} \{\!|\, p_i \mid \dots$
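As a small illustration of the normalization idea described above (a sketch under assumed encodings: representing a generative state as (action, probability, successor) triples and the function name to_reactive are choices made for this example, not the paper's definitions), converting generative probabilities to reactive ones amounts to conditioning on each action:

    from collections import defaultdict

    def to_reactive(generative_transitions):
        """Renormalise generative probabilities per action: the reactive probability
        of a transition is its generative probability divided by the total
        generative probability of its action."""
        totals = defaultdict(float)
        for action, p, _ in generative_transitions:
            totals[action] += p
        return [(action, p / totals[action], succ)
                for action, p, succ in generative_transitions]

    # Generatively, 'a' carries 3/4 of the probability mass and 'b' only 1/4;
    # reactively, each action's successors are renormalised to sum to 1, so the
    # relative weighting between distinct actions is forgotten.
    gen = [("a", 0.375, "s1"), ("a", 0.375, "s2"), ("b", 0.25, "s3")]
    print(to_reactive(gen))   # [('a', 0.5, 's1'), ('a', 0.5, 's2'), ('b', 1.0, 's3')]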
On reduction-based process semantics
- in Proceedings of FSTTCS ’93, LNCS 761
, 1995
"... Abstract. A formulation of semantic theories for processes which is based on reduction relation and equational reasoning is studied. The new construction can induce meaningful theories for processes, both in strong and weak settings. The resulting theories in many cases coincide with, and sometimes ..."
Abstract - Cited by 160 (27 self)
Abstract. A formulation of semantic theories for processes which is based on a reduction relation and equational reasoning is studied. The new construction can induce meaningful theories for processes, both in strong and weak settings. The resulting theories in many cases coincide with, and sometimes generalise, observation-based formulations of behavioural equivalence. The basic construction of reduction-based theories is studied, taking a simple name-passing calculus called the $\nu$-calculus as an example. Results on other calculi are also briefly discussed.
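One generic way to picture the reduction-based idea (a sketch of the general shape only, not the paper's construction) is to ask for the largest equivalence $\cong$ on processes that is a congruence for the process constructors and is closed under reduction:

\[
P \cong Q \;\wedge\; P \to P' \;\;\Longrightarrow\;\; \exists Q'.\; Q \to^{*} Q' \;\wedge\; P' \cong Q' .
\]

Equational reasoning then proceeds directly over $\to$ and $\cong$, without fixing a labelled observation relation in advance.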