Results 1 - 10 of 11
Probabilistic Constraint Handling Rules
, 2002
Abstract

Cited by 22 (5 self)
Classical Constraint Handling Rules (CHR) provide a powerful tool for specifying and implementing constraint solvers and programs. The rules of CHR rewrite constraints (nondeterministically) into simpler ones until they are solved. In this paper we introduce an extension of Constraint Handling Rules (CHR), namely Probabilistic CHRs (PCHR). These allow the probabilistic "weighting" of rules, specifying the probability of their application. In this way we are able to formalise various randomised algorithms such as Simulated Annealing. The implementation is based on source-to-source transformation (STS). Using a recently developed prototype for STS for CHR, we could implement probabilistic CHR in a concise way with a few lines of code in less than one hour.
Concurrent Constraint Programming: Towards Probabilistic Abstract Interpretation
 Proc. of the 23rd International Symposium on Mathematical Foundations of Computer Science, MFCS'98, Lecture Notes in Computer Science
, 2000
Abstract

Cited by 21 (8 self)
We present a method for approximating the semantics of probabilistic programs for the purpose of constructing semantics-based analyses of such programs. The method resembles the one based on Galois connections developed in the Cousot framework for abstract interpretation. The main difference between our approach and the standard theory of abstract interpretation is the choice of linear space structures instead of order-theoretic ones as semantical (concrete and abstract) domains. We show that our method generates "best approximations" according to an appropriate notion of precision defined in terms of a norm. Moreover, if recast in an order-theoretic setting, these approximations are correct in the sense of classical abstract interpretation theory. We use Concurrent ...
Probabilistic KLAIM
 In Proc. of 7th International Conference on Coordination Models and Languages (Coordination 04), volume 2949 of LNCS
, 2004
Abstract

Cited by 17 (5 self)
We introduce a probabilistic extension of KLAIM, where the behaviour of networks and individual nodes is determined by a probabilistic scheduler for processes and probabilistic allocation environments which describe the logical neighbourhood of each node. The resulting language has two variants which are modelled respectively as discrete and continuous time Markov processes. We suggest that Poisson processes are a natural probabilistic model for the coordination of discrete processes asynchronously communicating in continuous time and we use them to define the operational semantics of the continuous time variant. This framework allows for the implementation of networks with independent clocks on each site.
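The "independent clocks" idea rests on a standard property of Poisson processes: each node's next event arrives after an exponentially distributed delay, and the node whose clock fires first acts next. The following is a minimal sketch of that race (not the paper's formal semantics; the function name and rate dictionary are invented for the example):

```python
import random

def next_event(rates, rng):
    """rates: dict mapping node name -> exponential firing rate (> 0).

    Samples an independent exponential delay per node and returns the
    winner of the race together with its elapsed time. This is the basic
    step of a continuous-time Markov process built from Poisson clocks.
    """
    samples = {n: rng.expovariate(r) for n, r in rates.items()}
    winner = min(samples, key=samples.get)
    return winner, samples[winner]
```

Repeatedly calling `next_event` and executing the winning node's action yields one possible run of such a network, with no global scheduler needed.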
Probabilistic Linda-based Coordination Languages
Abstract

Cited by 2 (0 self)
Coordination languages are intended to simplify the development of complex software systems by separating the coordination aspects of an application from its computation aspects. Coordination refers to the ways the independent active pieces of a program (e.g. a process, a task, a thread) communicate and synchronise with each other. We review various approaches to introducing probabilistic or stochastic features into coordination languages. The main objective of such a study is to develop a semantic basis for a quantitative analysis of systems of interconnected or interacting components, which allows us to address not only the functional (qualitative) aspects of system behaviour but also its non-functional aspects, typically considered in the realm of performance modelling and evaluation.
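One common way to make Linda probabilistic, among the variations such a survey covers, is to resolve the choice among matching tuples by attached weights instead of nondeterministically. The sketch below is purely illustrative (class and method names are invented, loosely following Linda's `out`/`in` vocabulary), not any surveyed calculus:

```python
import random

class ProbTupleSpace:
    """Toy tuple space where retrieval picks among matching tuples
    with probability proportional to a weight attached at insertion."""

    def __init__(self, rng=None):
        self.tuples = []  # list of (value, weight)
        self.rng = rng or random.Random()

    def out(self, value, weight=1.0):
        """Insert a tuple with an associated positive weight."""
        self.tuples.append((value, weight))

    def inp(self, match):
        """Remove and return one tuple satisfying `match`, chosen with
        probability weight / total-weight-of-matches; None if no match."""
        candidates = [(i, w) for i, (v, w) in enumerate(self.tuples)
                      if match(v)]
        if not candidates:
            return None
        total = sum(w for _, w in candidates)
        r = self.rng.uniform(0, total)
        acc = 0.0
        for i, w in candidates:
            acc += w
            if r <= acc:
                return self.tuples.pop(i)[0]
        return self.tuples.pop(candidates[-1][0])[0]  # rounding guard
```

Replacing the weighted draw with an exponential race over rates would give the stochastic (continuous-time) flavour rather than the probabilistic one.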
Quantum Constraint Programming
Abstract

Cited by 1 (0 self)
Quantum computers are hypothetical machines which can perform many calculations simultaneously based on quantum-mechanical principles that allow a single bit to coexist in many states at once. This enormous potential of quantum computing has attracted substantial interest, especially during the last decade, and initiated a whole new field of research. As a contribution to this research we address the problem of the design of high-level languages for programming quantum computers, and the definition of an appropriate formal semantics for such languages. To this purpose we consider the Constraint Programming paradigm and show how computations in this paradigm can be seen as physical processes obeying the laws of quantum mechanics.

1 Introduction. Quantum Computing is currently one of the hottest topics in computer science, physics and engineering. One of the reasons for such interest is the dramatic miniaturisation in computer technology over the past 40 years, which will inevitably force (if the trend continues) the use of quantum physics to describe the elementary operations of a computer. The by now already flourishing research in quantum computing has arisen in anticipation of this inevitable step. If technological issues started the study of the new computational model, the discovery of its potential has kept interest growing, especially after the presentation of Shor's quantum factorisation algorithm [30] and Grover's search algorithm [14]. These algorithms heavily exploit quantum-mechanical phenomena. Thanks to quantum effects like superposition and entanglement, a function can be evaluated on a quantum machine so that all the outputs are computed in the time taken to evaluate just one output classically. Although a measurement of the final superposed state can yield only one output, it is still possible to obtain certain joint properties of all the outputs. As a consequence of this quantum parallelism, Shor's and Grover's algorithms are faster than any known classical algorithm designed for solving the same problems.
Constraint Programs
Abstract
We predict the maximal number of rule applications, i.e. worst-case derivation lengths of computations, in rule-based constraint solver programs written in the CHR language. CHR is a committed-choice concurrent constraint logic programming language consisting of multi-headed guarded rules. The derivation lengths are derived from rankings used in termination proofs for the respective programs. We are especially interested in rankings that give us a good upper bound; we call such rankings tight. Based on test runs with randomised data, we compare our predictions with empirical results by considering constraint solvers ranging from Boolean and terminological constraints to arc-consistency and path-consistency.
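The ranking idea behind the abstract can be made concrete with a toy rewrite system (everything here is an invented illustration, not one of the paper's solvers): a ranking maps the constraint store to a natural number that strictly decreases with every rule application, so the initial rank bounds the derivation length; when the bound is attained exactly, the ranking is tight.

```python
def derivation_length(store):
    """store: list of constraints. Single rule: remove one duplicate
    constraint per step (a simple idempotence simplification).

    Ranking = len(store) - number of distinct constraints. It drops by
    exactly 1 per step, so the initial rank equals the derivation
    length, i.e. this ranking is tight.
    """
    steps = 0
    store = list(store)  # work on a copy
    while True:
        seen, dup = set(), None
        for i, c in enumerate(store):
            if c in seen:
                dup = i
                break
            seen.add(c)
        if dup is None:          # store is duplicate-free: solved form
            return steps
        del store[dup]           # apply the rule once
        steps += 1
```

For `["a", "a", "b", "a"]` the ranking predicts 4 - 2 = 2 rule applications, and the computation indeed takes exactly 2 steps.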
Probabilistic Constraint Handling Rules
Abstract
Classical Constraint Handling Rules (CHR) provide a powerful tool for specifying and implementing constraint solvers and programs. The rules of CHR rewrite constraints (nondeterministically) into simpler ones until they are solved. In this paper we introduce an extension of Constraint Handling Rules (CHR), namely Probabilistic CHRs (PCHR). These allow the probabilistic "weighting" of rules, specifying the probability of their application. In this way we are able to formalise various randomised algorithms such as Simulated Annealing. The implementation is based on source-to-source transformation (STS). Using a recently developed prototype for STS for CHR, we could implement probabilistic CHR in a concise way with a few lines of code in less than one hour.

1 Introduction. Constraint Handling Rules (CHR) [7] are a committed-choice concurrent constraint logic programming language with ask and tell, consisting of guarded rules that rewrite conjunctions of atomic formulas. CHR go beyond the CCP framework [24,25] in that they allow for multiple atoms on the left-hand side (lhs) of a rule and for propagation rules.
Approximate Probabilistic Confinement
Abstract
SEMANTICS: The use of an exact (collecting) semantics makes the analysis presented in the previous section precise: no approximation is introduced in the calculation of .
Linear Structures for Concurrency in Probabilistic Programming Languages
Abstract
We introduce a semantical model based on operator algebras and we show the suitability of this model to capture both a quantitative version of nondeterminism (in the form of a probabilistic choice) and concurrency. We present the model by referring to a generic language which generalises various probabilistic concurrent languages from different programming paradigms. We discuss the relation between concurrency and the commutativity of the resulting semantical domain. In particular, we use Gelfand's representation theorem to relate the semantical models of synchronisation-free and fully concurrent versions of the language. A central aspect of the model we present is that it allows for a unified view of both operational and denotational semantics for a concurrent language.

1 Introduction. The purpose of this paper is to present the basic elements of a novel type of (non-standard) semantics for probabilistic concurrent languages, which is based on linear spaces and exploits functional analytical and operator algebraic notions and results. We will use this setting to shed additional light on the role synchronisation plays in modelling concurrent languages. The semantics we propose can be seen as a denotational encoding of a transition system semantics, insofar as it represents transition graphs by linear operators (e.g. adjacency matrices) and at the same time allows for a compositional definition of these operators. In this way the usual distinction between operational and denotational semantics becomes largely irrelevant. A similar weakening of this distinction can be found in a categorical context, e.g. in [61].