Results 1–10 of 79
Domain Theory in Logical Form
Annals of Pure and Applied Logic, 1991
Cited by 228 (10 self)

Abstract:
The mathematical framework of Stone duality is used to synthesize a number of hitherto separate developments in Theoretical Computer Science:
• Domain Theory, the mathematical theory of computation introduced by Scott as a foundation for denotational semantics.
• The theory of concurrency and systems behaviour developed by Milner, Hennessy et al., based on operational semantics.
• Logics of programs.
Stone duality provides a junction between semantics (spaces of points = denotations of computational processes) and logics (lattices of properties of processes). Moreover, the underlying logic is geometric, which can be computationally interpreted as the logic of observable properties, i.e. properties which can be determined to hold of a process on the basis of a finite amount of information about its execution. These ideas lead to the following programme:
A Logic for Reasoning about Probabilities
Information and Computation, 1990
Cited by 214 (21 self)

Abstract:
We consider a language for reasoning about probability which allows us to make statements such as "the probability of E1 is less than 1/3" and "the probability of E1 is at least twice the probability of E2," where E1 and E2 are arbitrary events. We consider the case where all events are measurable (i.e., represent measurable sets) and the more general case, which is also of interest in practice, where they may not be measurable. The measurable case is essentially a formalization of (the propositional fragment of) Nilsson's probabilistic logic. As we show elsewhere, the general (nonmeasurable) case corresponds precisely to replacing probability measures by Dempster-Shafer belief functions. In both cases, we provide a complete axiomatization and show that the problem of deciding satisfiability is NP-complete, no worse than that of propositional logic. As a tool for proving our complete axiomatizations, we give a complete axiomatization for reasoning about Boolean combinations of linear inequalities, which is of independent interest. This proof and others make crucial use of results from the theory of linear programming. We then extend the language to allow reasoning about conditional probability and show that the resulting logic is decidable and completely axiomatizable, by making use of the theory of real closed fields.
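In the measurable case, a satisfying probability measure over a finite sample space is itself a compact certificate of satisfiability: each formula becomes a linear constraint on the atom weights, so checking a candidate measure is immediate. A minimal sketch in Python, with a hypothetical four-atom space and events (the weights are an assumed witness, not taken from the paper):

```python
from fractions import Fraction

# Hypothetical four-atom sample space; the weights are an assumed witness.
measure = {"a": Fraction(1, 6), "b": Fraction(1, 12),
           "c": Fraction(1, 4), "d": Fraction(1, 2)}
E1 = {"a", "b"}   # P(E1) = 1/4
E2 = {"b"}        # P(E2) = 1/12

def prob(event):
    """Probability of a measurable event: the sum of its atoms' weights."""
    return sum(measure[w] for w in event)

# The abstract's two example statements, read as linear constraints.
assert prob(E1) < Fraction(1, 3)     # "the probability of E1 is less than 1/3"
assert prob(E1) >= 2 * prob(E2)      # "P(E1) is at least twice P(E2)"
print(prob(E1), prob(E2))            # 1/4 1/12
```

Deciding satisfiability when no witness is given is the NP-complete problem the paper axiomatizes; as the abstract notes, it reduces to questions about linear inequalities.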
Reasoning about Knowledge and Probability
Journal of the ACM, 1994
Cited by 155 (16 self)

Abstract:
We provide a model for reasoning about knowledge and probability together. We allow explicit mention of probabilities in formulas, so that our language has formulas that essentially say "according to agent i, formula φ holds with probability at least b." The language is powerful enough to allow reasoning about higher-order probabilities, as well as allowing explicit comparisons of the probabilities an agent places on distinct events. We present a general framework for interpreting such formulas, and consider various properties that might hold of the interrelationship between agents' probability assignments at different states. We provide a complete axiomatization for reasoning about knowledge and probability, prove a small model property, and obtain decision procedures. We then consider the effects of adding common knowledge and a probabilistic variant of common knowledge to the language. A preliminary version of this paper appeared in the Proceedings of the Second Conference on T...
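The key formula shape, "according to agent i, φ holds with probability at least b", can be given a toy interpretation: at each state, agent i has a probability distribution over states, and the formula holds when enough mass lands on the φ-states. A minimal sketch, in which the states, the φ-states and agent i's assignments are all hypothetical:

```python
# Hypothetical model: three states, a formula phi true at two of them, and
# agent i's probability assignment at each state.
states = ["s1", "s2", "s3"]
phi = {"s1", "s2"}                        # states where phi holds
prob_i = {                                # agent i's distribution at each state
    "s1": {"s1": 0.5, "s2": 0.3, "s3": 0.2},
    "s2": {"s2": 1.0},
    "s3": {"s1": 0.1, "s3": 0.9},
}

def holds_with_prob_at_least(state, formula_states, b):
    """True iff, at `state`, agent i's probability of the formula's states is >= b."""
    mass = sum(p for target, p in prob_i[state].items() if target in formula_states)
    return mass >= b

print(holds_with_prob_at_least("s1", phi, 0.7))   # True: 0.5 + 0.3 clears 0.7
print(holds_with_prob_at_least("s3", phi, 0.5))   # False: only 0.1 lands in phi
```

Higher-order probability then arises by letting φ itself be such a probability formula, evaluated state by state.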
Model Checking for a Probabilistic Branching Time Logic with Fairness
Distributed Computing, 1998
Cited by 115 (36 self)

Abstract:
We consider concurrent probabilistic systems, based on probabilistic automata of Segala & Lynch [55], which allow nondeterministic choice between probability distributions. These systems can be decomposed into a collection of "computation trees" which arise by resolving the nondeterministic, but not probabilistic, choices. The presence of nondeterminism means that certain liveness properties cannot be established unless fairness is assumed. We introduce a probabilistic branching time logic PBTL, based on the logic TPCTL of Hansson [30] and the logic PCTL of [55], resp. pCTL of [14]. The formulas of the logic express properties such as "every request is eventually granted with probability at least p". We give three interpretations for PBTL on concurrent probabilistic processes: the first is standard, while in the remaining two interpretations the branching time quantifiers are taken to range over a certain kind of fair computation trees. We then present a model checking algorithm for...
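The quoted liveness property can be made concrete on a fully probabilistic chain, i.e. one computation tree with the nondeterminism already resolved: the probability of eventually reaching a granted state is the least fixed point of a simple equation system, approximable by iteration. A sketch over a hypothetical two-state chain (PBTL's fairness machinery is not modelled here):

```python
# Hypothetical chain: a request is granted with probability 0.9, else retried.
chain = {
    "request": {"granted": 0.9, "request": 0.1},   # retry with probability 0.1
    "granted": {"granted": 1.0},                   # absorbing
}

def reach_prob(chain, target, iters=200):
    """Iterate p(s) = sum_t P(s, t) * p(t), with p(target) pinned to 1."""
    p = {s: (1.0 if s == target else 0.0) for s in chain}
    for _ in range(iters):
        p = {s: (1.0 if s == target else
                 sum(pr * p[t] for t, pr in chain[s].items()))
             for s in chain}
    return p

print(reach_prob(chain, "granted")["request"])   # converges to 1.0
```

Under nondeterminism, the model checker must compute this quantity over a range of schedulers, which is where the fair computation trees of the paper come in.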
Algebraic Reasoning for Probabilistic Concurrent Systems
Proc. IFIP TC2 Working Conference on Programming Concepts and Methods, 1990
Cited by 94 (5 self)

Abstract:
We extend Milner's SCCS to obtain a calculus, PCCS, for reasoning about communicating probabilistic processes. In particular, the nondeterministic process summation operator of SCCS is replaced with a probabilistic one, in which the probability of behaving like a particular summand is given explicitly. The operational semantics for PCCS is based on the notion of probabilistic derivation, and is given structurally as a set of inference rules. We then present an equational theory for PCCS based on probabilistic bisimulation, an extension of Milner's bisimulation proposed by Larsen and Skou. We provide the first axiomatization of probabilistic bisimulation, a subset of which is relatively complete for finite-state probabilistic processes. In the probabilistic case, a notion of processes with almost identical behavior (i.e., with probability 1 − ε, for ε sufficiently small) appears to be more useful in practice than a notion of equivalence, since the latter is often too restricti...
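Probabilistic bisimulation in the Larsen-Skou style can be sketched by partition refinement: two states are equivalent iff they carry the same label and send the same probability mass into every equivalence class. A small Python sketch over a hypothetical labelled chain (this checks the relation on finite chains; it is not the paper's axiomatization):

```python
# Hypothetical labelled chain: u and v have different branching but send the
# same mass into each class, so they should come out bisimilar.
trans = {          # state -> distribution over successor states
    "u": {"x": 0.5, "y": 0.5},
    "v": {"x": 0.25, "y": 0.25, "z": 0.5},
    "x": {}, "y": {}, "z": {},
}
label = {"u": "a", "v": "a", "x": "b", "y": "b", "z": "b"}

def bisim_classes(trans, label):
    """Partition refinement: start from the labelling, split until stable."""
    blocks = {}
    for s in trans:
        blocks.setdefault(label[s], set()).add(s)
    partition = list(blocks.values())
    changed = True
    while changed:
        changed = False
        def signature(s):
            # probability mass s sends into each current block
            return tuple(sum(p for t, p in trans[s].items() if t in B)
                         for B in partition)
        new_partition = []
        for B in partition:
            groups = {}
            for s in B:
                groups.setdefault(signature(s), set()).add(s)
            new_partition.extend(groups.values())
            if len(groups) > 1:
                changed = True
        partition = new_partition
    return partition

classes = bisim_classes(trans, label)
same = any({"u", "v"} <= C for C in classes)
print(same)   # True: u and v both send mass 1 into the b-block
```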
A Per Model of Secure Information Flow in Sequential Programs
Higher-Order and Symbolic Computation, 1998
Cited by 93 (19 self)

Abstract:
This paper proposes an extensional semantics-based formal specification of secure information-flow properties in sequential programs based on representing degrees of security by partial equivalence relations (pers). The specification clarifies and unifies a number of specific correctness arguments in the literature and connections to other forms of program analysis. The approach is inspired by (and in the deterministic case equivalent to) the use of partial equivalence relations in specifying binding-time analysis, and is thus able to specify security properties of higher-order functions and "partially confidential data". We also show how the per approach can handle nondeterminism for a first-order language, by using powerdomain semantics and show how probabilistic security properties can be formalised by using probabilistic powerdomain semantics. We illustrate the usefulness of the compositional nature of the security specifications by presenting a straightforward correctness proof for a simple type-based security analysis.
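For the deterministic, first-order case the per idea can be sketched directly: model a state as a (high, low) pair, take the per relating states that agree on the low component, and call a program secure when it maps related inputs to related outputs. Both example programs below are hypothetical:

```python
import itertools

def low_eq(a, b):
    """The security per: states are related iff their low components agree."""
    return a[1] == b[1]

def secure(program, inputs):
    """Secure iff low_eq-related inputs are mapped to low_eq-related outputs."""
    return all(low_eq(program(a), program(b))
               for a, b in itertools.product(inputs, repeat=2)
               if low_eq(a, b))

inputs = [(h, l) for h in (0, 1) for l in (0, 1)]   # all (high, low) states

def leaky(s):
    return (s[0], s[0])          # copies high into low: insecure

def safe(s):
    return (s[0] + s[1], s[1])   # low output depends only on the low input

print(secure(leaky, inputs))   # False
print(secure(safe, inputs))    # True
```

The nondeterministic and probabilistic cases replace this set-of-states picture with powerdomain and probabilistic-powerdomain semantics, as the abstract describes.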
Symbolic model checking for probabilistic processes
Proceedings of ICALP '97, 1997
Cited by 81 (28 self)

Abstract:
We introduce a symbolic model checking procedure for Probabilistic Computation Tree Logic PCTL over labelled Markov chains as models. Model checking for probabilistic logics typically involves solving linear equation systems in order to ascertain the probability of a given formula holding in a state. Our algorithm is based on the idea of representing the matrices used in the linear equation systems by Multi-Terminal Binary Decision Diagrams (MTBDDs) introduced in Clarke et al. [14]. Our procedure, based on the algorithm used by Hansson and Jonsson [24], uses BDDs to represent formulas and MTBDDs to represent Markov chains, and is efficient because it avoids explicit state space construction. A PCTL model checker is being implemented in Verus [9].
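The space saving behind MTBDDs comes from hash-consing: identical sub-diagrams are stored once and shared, so matrices with repeated structure collapse dramatically. A miniature sketch (nothing to do with the Verus implementation; the variable names and example matrices are hypothetical):

```python
# Miniature MTBDD: terminals are numeric values (probabilities), internal
# nodes are (variable, low-child, high-child) triples, hash-consed for sharing.
_table = {}

def node(var, low, high):
    if low == high:                 # redundant test: skip the node entirely
        return low
    key = (var, low, high)
    if key not in _table:           # hash-consing: share identical subgraphs
        _table[key] = key
    return _table[key]

def build(bits, values):
    """Build an MTBDD over boolean vars bits[0..] from 2**n row-major values."""
    if len(values) == 1:
        return values[0]            # terminal: a probability, not just 0/1
    half = len(values) // 2
    return node(bits[0],
                build(bits[1:], values[:half]),
                build(bits[1:], values[half:]))

# A 2x2 matrix with every entry equal collapses to a single terminal...
print(build(["row", "col"], [0.5, 0.5, 0.5, 0.5]))
# ...and identical rows make the row variable vanish.
print(build(["row", "col"], [1.0, 0.0, 1.0, 0.0]))
```

In a real checker the linear-equation solve is then performed directly on these shared structures, which is what avoids building the state space explicitly.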
On probabilistic model checking
1996
Cited by 56 (7 self)

Abstract:
This tutorial presents an overview of model checking for both discrete- and continuous-time Markov chains (DTMCs and CTMCs). Model checking algorithms are given for verifying DTMCs and CTMCs against specifications written in probabilistic extensions of temporal logic, including quantitative properties with rewards. Example properties include the probability that a fault occurs and the expected number of faults in a given time period. We also describe the practical application of stochastic model checking with the probabilistic model checker PRISM by outlining the main features supported by PRISM and three real-world case studies: a probabilistic security protocol, dynamic power management and a biological pathway.
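A PRISM-style quantitative query such as "the probability that a fault occurs within k steps" reduces, for a DTMC, to iterating the transition equations k times. A sketch over a hypothetical two-state chain:

```python
# Hypothetical DTMC: the system faults with probability 0.05 per step.
chain = {
    "ok":    {"ok": 0.95, "fault": 0.05},
    "fault": {"fault": 1.0},           # absorbing fault state
}

def fault_within(k):
    """P(reach 'fault' within k steps, starting from 'ok')."""
    p = {"ok": 0.0, "fault": 1.0}
    for _ in range(k):
        p = {"fault": 1.0,
             "ok": sum(pr * p[t] for t, pr in chain["ok"].items())}
    return p["ok"]

print(round(fault_within(10), 4))   # equals 1 - 0.95**10, roughly 0.40
```

CTMC analogues of such bounded properties replace step counting with real-time bounds, handled via uniformization in tools like PRISM.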
The Metric Analogue of Weak Bisimulation for Probabilistic Processes
2002
Cited by 51 (2 self)

Abstract:
We observe that equivalence is not a robust concept in the presence of numerical information, such as probabilities, in the model. We develop a metric analogue of weak bisimulation in the spirit of our earlier work on metric analogues for strong bisimulation. We give a fixed point characterization of the metric. This makes available coinductive reasoning principles and allows us to prove metric analogues of the usual algebraic laws for process combinators. We also show that quantitative properties of interest are continuous with respect to the metric, which says that if two processes are close in the metric then observable quantitative properties of interest are indeed close. As an important example of this we show that nearby processes have nearby channel capacities, a quantitative measure of their propensity to leak information.
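The continuity claim can be illustrated on a pair of hypothetical processes that each attempt to emit an observable "leak" event per step: nudging the per-step probability slightly moves the observable quantity only slightly, whereas an exact equivalence would simply declare the processes different. This sketches the continuity property only, not the metric itself:

```python
# Hypothetical observable: probability of seeing "leak" within `steps`
# attempts of a process that leaks with probability p_leak per step.
def p_observe_leak(p_leak, steps=50):
    return 1 - (1 - p_leak) ** steps

p, q = 0.10, 0.11                 # two nearby processes
gap = abs(p_observe_leak(p) - p_observe_leak(q))
print(gap < 0.05)                 # the observable gap stays small
```

The paper's contribution is to make this robustness precise: the metric bounds how far such observable quantities, channel capacity included, can drift between nearby processes.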
Algorithmic Knowledge
Proc. Second Conference on Theoretical Aspects of Reasoning about Knowledge, 1994
Cited by 50 (10 self)

Abstract:
The standard model of knowledge in multiagent systems suffers from what has been called the logical omniscience problem: agents know all tautologies, and know all the logical consequences of their knowledge. For many types of analysis, this turns out not to be a problem. Knowledge is viewed as being ascribed by the system designer to the agents; agents are not assumed to compute their knowledge in any way, nor is it assumed that they can necessarily answer questions based on their knowledge. Nevertheless, in many applications that we are interested in, agents need to act on their knowledge. In such applications, an externally ascribed notion of knowledge is insufficient: clearly an agent can base his actions only on what he explicitly knows. Furthermore, an agent that has to act on his knowledge has to be able to compute this knowledge; we do need to take into account the algorithms available to the agent, as well as the "effort" required to compute knowledge. In this paper, we show...
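The gap between ascribed and algorithmic knowledge can be caricatured with a budget-limited forward-chainer: a consequence is implicitly known as soon as it follows from the agent's facts, but it is only computable given enough effort. The facts, rules and one-rule-per-step cost model below are all hypothetical:

```python
# Hypothetical knowledge base: one fact and a chain of implications.
facts = {"p"}
rules = [("p", "q"), ("q", "r"), ("r", "s")]   # modus-ponens style: lhs -> rhs

def algorithmic_knowledge(facts, rules, budget):
    """Forward-chain, but spend at most `budget` rule applications."""
    known = set(facts)
    for _ in range(budget):
        new = {rhs for lhs, rhs in rules if lhs in known and rhs not in known}
        if not new:
            break
        known |= {min(new)}        # one rule application per unit of effort
    return known

print("s" in algorithmic_knowledge(facts, rules, budget=1))   # False: not enough effort
print("s" in algorithmic_knowledge(facts, rules, budget=5))   # True: budget suffices
```

Under the ascribed view the agent "knows" s from the start; the algorithmic view makes that knowledge conditional on the computation actually performed.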