Results 11-20 of 66
Semipullbacks and Bisimulation in Categories of Markov Processes
, 1999
"... this paper, we show that the answer to the above question is positive. More specifically, we give a canonical construction for semipullbacks in the category whose objects are families of Markov processes, with given transition kernels, on Polish spaces and whose morphisms are transition probability ..."
Abstract

Cited by 14 (2 self)
this paper, we show that the answer to the above question is positive. More specifically, we give a canonical construction for semipullbacks in the category whose objects are families of Markov processes, with given transition kernels, on Polish spaces and whose morphisms are transition probability preserving surjective continuous maps. One immediate consequence is that the category of probability measures on Polish spaces with measure-preserving continuous maps has semipullbacks. Our construction gives semipullbacks for various full subcategories, including that of Markov processes on locally compact second countable spaces, and also in the larger category where the objects are Markov processes on analytic spaces (i.e. continuous images of Polish spaces) and the morphisms are transition probability preserving surjective Borel maps. It also applies to the corresponding categories of ultrametric spaces. Finally, our result also holds in the larger categories of Markov processes given by subprobability distributions, i.e. where the total probability of transition from a state can be strictly less than one. We now explain the relevance of our result in computer science. The consequences of our mathematical result for the theory of probabilistic bisimulation have been investigated in (Blute et al., 1997; Desharnais et al., 1998). We will briefly review this here. Following the work of Joyal, Nielsen and Winskel (Joyal et al., 1996) on the notion of bisimulation using open maps, define two objects A and B in a category to be bisimilar if there exists an object C and morphisms f : C → A and g : C → B, i.e., ...
Uncertain<T>: A first-order type for uncertain data
 In ASPLOS
, 2014
"... Emerging applications increasingly use estimates such as sensor data (GPS), probabilistic models, machine learning, big data, and human data. Unfortunately, representing this uncertain data with discrete types (floats, integers, and booleans) encourages developers to pretend it is not probabilisti ..."
Abstract

Cited by 11 (4 self)
Emerging applications increasingly use estimates such as sensor data (GPS), probabilistic models, machine learning, big data, and human data. Unfortunately, representing this uncertain data with discrete types (floats, integers, and booleans) encourages developers to pretend it is not probabilistic, which causes three types of uncertainty bugs. (1) Using estimates as facts ignores random error in estimates. (2) Computation compounds that error. (3) Boolean questions on probabilistic data induce false positives and negatives. This paper introduces Uncertain〈T〉, a new programming language abstraction for uncertain data. We implement a Bayesian network semantics for computation and conditionals that improves program correctness. The runtime uses sampling and hypothesis tests to evaluate computation and conditionals lazily and efficiently. We illustrate with sensor and machine learning applications that Uncertain〈T〉 improves expressiveness and accuracy. Whereas previous probabilistic programming languages focus on experts, Uncertain〈T〉 serves a wide range of developers. Experts still identify error distributions. However, both experts and application writers compute with distributions, improve estimates with domain knowledge, and ask questions with conditionals. The Uncertain〈T〉 type system and operators encourage developers to expose and reason about uncertainty explicitly, controlling false positives and false negatives. These benefits make Uncertain〈T〉 a compelling programming model for modern applications facing the challenge of uncertainty.
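The core idea of the abstract can be sketched in a few lines: wrap a sampling function, compose computations lazily, and answer conditionals with a test on samples rather than a point comparison. The class name, operators, fixed sample size, and GPS example below are illustrative assumptions, not the paper's actual API or runtime (which uses sequential hypothesis testing).

```python
import random

class Uncertain:
    """Illustrative sketch of an Uncertain<T>-style wrapper over a sampling function."""
    def __init__(self, sample):
        self.sample = sample  # zero-argument function returning one draw

    def __add__(self, other):
        # Computation composes lazily: sampling the sum samples both operands.
        return Uncertain(lambda: self.sample() + other.sample())

    def gt(self, threshold, prob=0.5, n=1000):
        # A conditional is a question about the distribution: is Pr[X > threshold] > prob?
        hits = sum(self.sample() > threshold for _ in range(n))
        return hits / n > prob

# A GPS-like speed estimate: true value 60 plus Gaussian error with sd 2.
reading = Uncertain(lambda: 60.0 + random.gauss(0.0, 2.0))
speeding = reading.gt(55.0)  # ask about the distribution, not a point estimate
```

The point of the sketch is the last line: instead of `reading > 55.0` silently treating one noisy draw as fact, the conditional forces an explicit probabilistic question.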
A Monadic Probabilistic Language
 In Proceedings of the 2003 ACM SIGPLAN international workshop on Types in languages design and implementation
, 2003
"... Motivated by many practical applications that have to compute in the presence of uncertainty, we propose a monadic probabilistic language based upon the mathematical notion of sampling function. Our language provides a unified representation scheme for probability distributions, enjoys rich expressi ..."
Abstract

Cited by 10 (5 self)
Motivated by many practical applications that have to compute in the presence of uncertainty, we propose a monadic probabilistic language based upon the mathematical notion of sampling function. Our language provides a unified representation scheme for probability distributions, enjoys rich expressiveness, and offers high versatility in encoding probability distributions. We also develop a novel style of operational semantics called a horizontal operational semantics, under which an evaluation returns not a single outcome but multiple outcomes. We have preliminary evidence that the horizontal operational semantics improves on the ordinary operational semantics with respect to both execution time and accuracy in representing probability distributions.
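A sampling function, i.e. a zero-argument function producing one draw, forms a monad directly, which is the representation the abstract builds on. The `unit`/`bind` names and the geometric example below are an illustrative sketch, not the paper's language or semantics.

```python
import random

# A distribution is represented as a sampling function: () -> sample.

def unit(x):
    """Point-mass distribution on x."""
    return lambda: x

def bind(m, k):
    """Sequence: draw from m, feed the draw to k, then draw from the result."""
    return lambda: k(m())()

uniform = lambda: random.random()

# A geometric-style process written monadically: flip a p-coin until heads,
# returning the number of flips. Recursion stays lazy because bind builds thunks.
def geometric(p):
    return bind(uniform,
                lambda u: unit(1) if u < p
                else bind(geometric(p), lambda n: unit(n + 1)))
```

Because every distribution has the same representation, discrete, continuous, and recursively defined distributions are all encoded uniformly, which is the versatility the abstract claims.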
Exemplaric Expressivity of Modal Logics
, 2008
"... This paper investigates expressivity of modal logics for transition systems, multitransition systems, Markov chains, and Markov processes, as coalgebras of the powerset, finitely supported multiset, finitely supported distribution, and measure functor, respectively. Expressivity means that logically ..."
Abstract

Cited by 8 (0 self)
This paper investigates expressivity of modal logics for transition systems, multitransition systems, Markov chains, and Markov processes, as coalgebras of the powerset, finitely supported multiset, finitely supported distribution, and measure functor, respectively. Expressivity means that logically indistinguishable states, satisfying the same formulas, are behaviourally indistinguishable. The investigation is based on the framework of dual adjunctions between spaces and logics and focuses on a crucial injectivity property. The approach is generic both in the choice of systems and modalities, and in the choice of a “base logic”. Most of these expressivity results are already known, but the applicability of the uniform setting of dual adjunctions to these particular examples is what constitutes the contribution of the paper.
Deriving Probability Density Functions from Probabilistic Functional Programs
"... Abstract. The probability density function of a probability distribution is a fundamental concept in probability theory and a key ingredient in various widely used machine learning methods. However, the necessary framework for compiling probabilistic functional programs to density functions has only ..."
Abstract

Cited by 8 (1 self)
The probability density function of a probability distribution is a fundamental concept in probability theory and a key ingredient in various widely used machine learning methods. However, the necessary framework for compiling probabilistic functional programs to density functions has only recently been developed. In this work, we present a density compiler for a probabilistic language with discrete and continuous distributions, and discrete observations, and provide a proof of its soundness. The compiler greatly reduces the development effort of domain experts, which we demonstrate by solving inference problems from various scientific applications, such as modelling the global carbon cycle, using a standard Markov chain Monte Carlo framework.
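For intuition, the kind of rule such a compiler applies to an affine program statement is the change-of-variables formula: if `y = a*x + b` and `x` has density `f`, then `y` has density `f((y-b)/a) / |a|`. The function names below are hypothetical, chosen only to illustrate the rule.

```python
import math

def normal_pdf(x, mu=0.0, sigma=1.0):
    """Density of Normal(mu, sigma) at x."""
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def affine_pushforward_pdf(base_pdf, a, b):
    """Density of y = a*x + b when x has density base_pdf (change of variables)."""
    return lambda y: base_pdf((y - b) / a) / abs(a)

# The program `y = 2*x + 1` with x ~ Normal(0, 1) compiles to the density of Normal(1, 2).
pdf_y = affine_pushforward_pdf(normal_pdf, 2.0, 1.0)
```

A compiler applies such rules compositionally over the program syntax, so the density of a whole model is assembled from the densities of its parts.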
Probabilistic Relations
 School of Computer Science, McGill University, Montreal
, 1998
"... The notion of binary relation is fundamental in logic. What is the correct analogue of this concept in the probabilistic case? I will argue that the notion of conditional probability distribution (Markov kernel, stochastic kernel) is the correct generalization. One can define a category based on sto ..."
Abstract

Cited by 8 (0 self)
The notion of binary relation is fundamental in logic. What is the correct analogue of this concept in the probabilistic case? I will argue that the notion of conditional probability distribution (Markov kernel, stochastic kernel) is the correct generalization. One can define a category based on stochastic kernels which has many of the formal properties of the ordinary category of relations. Using this concept I will show how to define iteration in this category and give a simple treatment of Kozen's language of while loops and probabilistic choice. I will use the concept of stochastic relation to introduce some of the ongoing joint work with Edalat and Desharnais on Labeled Markov Processes. In my talk I will assume that people do not know what partially additive categories are but that they do know basic category theory and basic notions like measure and probability. This work is mainly due to Kozen, Giry, Lawvere and others.
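On a finite state space, the composition of stochastic kernels that makes them behave like relations reduces to matrix multiplication (the Chapman-Kolmogorov equation). The dict-based sketch and the two-state example below are illustrative, not from the paper.

```python
# A finite stochastic kernel: dict mapping each state to a distribution over states.
# Composition is the probabilistic analogue of relational composition:
#   (k1 ; k2)(x, z) = sum over y of k1(x, y) * k2(y, z).

def compose(k1, k2):
    out = {}
    for x, row in k1.items():
        acc = {}
        for y, p in row.items():
            for z, q in k2.get(y, {}).items():
                acc[z] = acc.get(z, 0.0) + p * q
        out[x] = acc
    return out

# Example: a fair flip from a start state, then one more step of a small chain.
flip = {"s": {"h": 0.5, "t": 0.5}}
step = {"h": {"h": 1.0}, "t": {"h": 0.5, "t": 0.5}}
two = compose(flip, step)  # distribution over {h, t} after both steps
```

Where an ordinary relation records *whether* x can reach z, the composed kernel records *with what probability*, which is exactly the generalization the abstract argues for.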
Possibilistic and Probabilistic Abstraction-Based Model Checking
 In Process Algebra and Probabilistic Methods, Performance Modeling and Verification: Second Joint International Workshop PAPM-PROBMIV 2002, volume 2399 of Lecture Notes in Computer Science
, 2002
"... models whose verification results transfer to the abstracted models for a logic with unrestricted use of negation and quantification. This framework is novel in that its models have quantitative or probabilistic observables and state transitions. Properties of a quantitative temporal logic have meas ..."
Abstract

Cited by 5 (3 self)
models whose verification results transfer to the abstracted models for a logic with unrestricted use of negation and quantification. This framework is novel in that its models have quantitative or probabilistic observables and state transitions. Properties of a quantitative temporal logic have measurable denotations in these models. For probabilistic models such denotations approximate the probabilistic semantics of full LTL. We show how predicate-based abstractions specify abstract quantitative and probabilistic models with finite state space.
The Demonic Product of Probabilistic Relations
, 2001
"... The demonic product of two probabilistic relations is defined and investigated. It is shown that the product is stable under bisimulations when the mediating object is probabilistic, and that under some mild conditions the nondeterministic fringe of the probabilistic relations behaves properly: the ..."
Abstract

Cited by 5 (2 self)
The demonic product of two probabilistic relations is defined and investigated. It is shown that the product is stable under bisimulations when the mediating object is probabilistic, and that under some mild conditions the nondeterministic fringe of the probabilistic relations behaves properly: the fringe of the product equals the demonic product of the fringes.
A DSL for Explaining Probabilistic Reasoning
 In IFIP Working Conference on Domain-Specific Languages
, 2009
"... Abstract. We propose a new focus in language design where languages provide constructs that not only describe the computation of results, but also produce explanations of how and why those results were obtained. We posit that if users are to understand computations produced by a language, that langu ..."
Abstract

Cited by 4 (4 self)
We propose a new focus in language design where languages provide constructs that not only describe the computation of results, but also produce explanations of how and why those results were obtained. We posit that if users are to understand computations produced by a language, that language should provide explanations to the user. As an example of such an explanation-oriented language we present a domain-specific language for explaining probabilistic reasoning, a domain that is not well understood by non-experts. We show the design of the DSL in several steps. Based on a storytelling metaphor of explanations, we identify generic constructs for building stories out of events, and obtaining explanations by applying stories to specific examples. These generic constructs are then adapted to the particular explanation domain of probabilistic reasoning. Finally, we develop a visual notation for explaining probabilistic reasoning.
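The storytelling idea can be illustrated with a toy Bayesian-update trace: an event is a named transformation of a distribution, a story applies events in order, and the recorded intermediate distributions form the explanation. All names and numbers below are hypothetical, not the paper's DSL.

```python
# A distribution is a dict mapping outcomes to probabilities.

def normalize(d):
    s = sum(d.values())
    return {k: v / s for k, v in d.items()}

def story(dist, events):
    """Apply named events in order, recording each intermediate distribution."""
    trace = [("start", dist)]
    for name, f in events:
        dist = normalize(f(dist))
        trace.append((name, dist))
    return trace

# Hypothetical example: a disease with 1% prior; a test with 99% sensitivity
# and 95% specificity comes back positive.
prior = {"sick": 0.01, "healthy": 0.99}
observe_positive = ("test positive",
                    lambda d: {"sick": d["sick"] * 0.99, "healthy": d["healthy"] * 0.05})
trace = story(prior, [observe_positive])
# Each (name, distribution) step can be rendered as one sentence of the explanation.
```

Rendering each step of the trace, rather than only the final posterior, is what turns the computation into a story a non-expert can follow.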
Approximating Markov Processes By Averaging
"... Normally, one thinks of probabilistic transition systems as taking an initial probability distribution over the state space into a new probability distribution representing the system after a transition. We, however, take a dual view of Markov processes as transformers of bounded measurable function ..."
Abstract

Cited by 4 (0 self)
Normally, one thinks of probabilistic transition systems as taking an initial probability distribution over the state space into a new probability distribution representing the system after a transition. We, however, take a dual view of Markov processes as transformers of bounded measurable functions. This is very much in the same spirit as a “predicate-transformer” view, which is dual to the state-transformer view of transition systems. We redevelop the theory of labelled Markov processes from this viewpoint, in particular we explore approximation theory. We obtain three main results: (i) It is possible to define bisimulation on general measure spaces and show that it is an equivalence relation. The logical characterization of bisimulation can be done straightforwardly and generally. (ii) A new and flexible approach to approximation based on averaging can be given. This vastly generalizes and streamlines the idea of using conditional expectations to compute approximations. (iii) We show that there is a minimal process bisimulation-equivalent to a given process, and this minimal process is obtained as the limit of the finite approximants.
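The duality the abstract describes is easy to exhibit on a finite chain: a kernel pushes distributions forward and pulls functions back, and both views compute the same expectation. The two-state kernel below is made up for illustration.

```python
# A Markov kernel P on a finite state space acts two ways:
#   forward on distributions (state-transformer view): mu |-> mu P
#   backward on functions (function-transformer view): f |-> P f
# and the views agree: E_{mu P}[f] = E_mu[P f].

states = ["a", "b"]
P = {"a": {"a": 0.9, "b": 0.1}, "b": {"a": 0.2, "b": 0.8}}

def push(mu, P):
    """State-transformer view: distribution after one step."""
    return {t: sum(mu[s] * P[s].get(t, 0.0) for s in mu) for t in states}

def pull(P, f):
    """Function-transformer view: (P f)(s) = expected value of f one step from s."""
    return {s: sum(p * f[t] for t, p in P[s].items()) for s in P}

mu = {"a": 0.5, "b": 0.5}
f = {"a": 1.0, "b": 0.0}   # indicator of state a, a "predicate"
lhs = sum(push(mu, P)[t] * f[t] for t in states)   # E_{mu P}[f]
rhs = sum(mu[s] * pull(P, f)[s] for s in mu)       # E_mu[P f]
```

The averaging approximations of the paper live on the function side: coarsening `f` to a conditional expectation over a finite partition yields a finite approximant of the process.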