Results

21–28 of 28

### Build Your Own Probability Monads (DRAFT)

"... Build your own probability monads Probability is often counter-intuitive, and it always involves a great deal of math. This is unfortunate, because many applications in robotics and AI increasingly rely on probability theory. We introduce a modular toolkit for constructing probability monads, and sh ..."

Abstract

Probability is often counter-intuitive, and it always involves a great deal of math. This is unfortunate, because many applications in robotics and AI increasingly rely on probability theory. We introduce a modular toolkit for constructing probability monads, and show that it can be used for everything from discrete distributions to weighted particle filtering. This modular approach allows us to present a single, easy-to-use API for working with many kinds of probability distributions. Our toolkit combines several existing components (the list monad, the Rand monad, and the MaybeT monad transformer) with a stripped-down version of WriterT Prob, and a new monad for sequential Monte Carlo sampling. Using these components, we show that MaybeT can be used to implement Bayes’ theorem. We also show how to implement a monad for weighted particle filtering.
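The toolkit described above is a Haskell library; as a language-neutral illustration, the core idea — a list-based discrete distribution with monadic bind, plus MaybeT-style failure for conditioning — can be sketched in Python. The names `unit`, `bind`, and `condition` are illustrative, not the paper's API:

```python
# A "distribution" is a list of (value, probability) pairs -- the
# list-based discrete distribution of the abstract, sketched in Python.
def unit(x):
    # Monadic return: the point distribution on x.
    return [(x, 1.0)]

def bind(dist, f):
    # Monadic bind: run f on each outcome, scaling probabilities.
    return [(y, p * q) for (x, p) in dist for (y, q) in f(x)]

def condition(dist, pred):
    # MaybeT-style conditioning: drop outcomes failing the predicate,
    # then renormalize -- Bayes' theorem on discrete evidence.
    kept = [(x, p) for (x, p) in dist if pred(x)]
    total = sum(p for _, p in kept)
    return [(x, p / total) for (x, p) in kept]

# Classic example: two coin flips, conditioned on seeing at least one head.
coin = [("H", 0.5), ("T", 0.5)]
two = bind(coin, lambda a: bind(coin, lambda b: unit((a, b))))
posterior = condition(two, lambda ab: "H" in ab)
# P(both heads | at least one head) = 1/3
```

The `condition` step is exactly where the MaybeT layer earns its keep in the Haskell version: failed observations vanish, and the surviving mass is renormalized.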

### Testing for Simulation and Bisimulation in Labelled Markov Processes

2003

"... This paper presents a fundamental study of similarity and bisimilarity for labelled Markov processes: a particular class of probabilistic labelled transition systems. The main results characterize similarity as a testing preorder and bisimilarity as a testing equivalence. ..."

Abstract

This paper presents a fundamental study of similarity and bisimilarity for labelled Markov processes: a particular class of probabilistic labelled transition systems. The main results characterize similarity as a testing preorder and bisimilarity as a testing equivalence.
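For finite-state systems, the bisimilarity the paper characterizes can be computed by partition refinement: states are bisimilar iff, for every action, they assign equal probability to every equivalence class. A hedged Python sketch (the paper itself works with general labelled Markov processes on measure spaces, not just finite chains; all names here are illustrative):

```python
# trans[(state, action)] = {next_state: prob} describes a tiny labelled
# Markov chain.  Refine a partition of the states until each block is
# closed under the "same probability into every block" condition.
def bisimulation_classes(states, actions, trans):
    partition = [set(states)]
    changed = True
    while changed:
        changed = False

        def signature(s):
            # Probability mass s sends into each current block, per action.
            return tuple(
                tuple(sum(trans.get((s, a), {}).get(t, 0.0) for t in block)
                      for block in partition)
                for a in actions
            )

        new_partition = []
        for block in partition:
            groups = {}
            for s in block:
                groups.setdefault(signature(s), set()).add(s)
            new_partition.extend(groups.values())
        if len(new_partition) != len(partition):
            changed = True
        partition = new_partition
    return partition

states = ["s", "t", "u", "v"]
trans = {("s", "a"): {"u": 0.5, "v": 0.5},
         ("t", "a"): {"u": 1.0},
         ("u", "a"): {"u": 1.0}}          # "v" is a deadlock state
classes = bisimulation_classes(states, ["a"], trans)
# "t" and "u" end up bisimilar (both stay in their class forever);
# "s" and "v" each sit in their own class.
```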

### Pipes and Filters: Modelling a Software Architecture Through Relations

2002

"... A pipeline is a popular architecture which connects computational components/filers) through connectors (pipes) so that computations are performed in a stream like fashion. The data are transported through the pipes between filers, gradually transforming inputs to outputs. This kind of stream proces ..."

Abstract

A pipeline is a popular architecture which connects computational components (filters) through connectors (pipes) so that computations are performed in a stream-like fashion. The data are transported through the pipes between filters, gradually transforming inputs to outputs. This kind of stream processing has been made popular through UNIX pipes that serially connect independent components for performing a sequence of tasks. We show in this paper how to formalize this architecture in terms of monads, thereby including relational specifications as special cases. The system is given through a directed acyclic graph, the nodes of which carry the computational structure by being labelled with morphisms from the monad, while the edges provide the data for these operations. It is shown how fundamental compositional operations, like combining pipes and filters, and refining a system by replacing simple parts with more elaborate ones, are supported through this construction.
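The monadic reading of filters can be sketched with the list monad, where each filter maps one input to zero or more outputs, so relations arise as a special case; a pipe is then just Kleisli composition. This is an illustrative Python sketch, not the paper's formalism:

```python
from functools import reduce

# Filters as Kleisli arrows for the list monad: input -> list of outputs.
def kleisli(f, g):
    # Compose two filters: feed every output of f into g.
    return lambda x: [z for y in f(x) for z in g(y)]

def pipeline(*filters):
    # A pipe is the Kleisli composite of its filters, in order.
    return reduce(kleisli, filters)

# Example: tokenize, keep short words, uppercase -- UNIX-pipe style.
tokenize = lambda line: line.split()
short    = lambda w: [w] if len(w) <= 3 else []   # relational: may drop input
upper    = lambda w: [w.upper()]

run = pipeline(tokenize, short, upper)
run("the quick fox ran")   # -> ['THE', 'FOX', 'RAN']
```

Swapping the list monad for another monad changes the notion of computation carried along the pipe without touching the composition machinery, which is the point of the construction.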

### Categories for Imperative Semantics (PLDG Seminar)

"... The aim of these notes is to provide an introduction to category theory, and a motivation for its use in denotational semantics. I will do this by showing how to apply it to give an abstract semantics to a simple imperative language. These notes are loosely based ..."

Abstract

The aim of these notes is to provide an introduction to category theory, and a motivation for its use in denotational semantics. I will do this by showing how to apply it to give an abstract semantics to a simple imperative language. These notes are loosely based ...
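As a concrete instance of the kind of semantics the notes describe, commands of a tiny imperative language can be denoted by state-to-state functions, with sequencing as plain composition — the structure category theory then abstracts. A minimal Python sketch; the AST encoding is illustrative, not taken from the notes:

```python
# Commands denote functions State -> State, where a state is a dict
# from variable names to values.  Sequencing is function composition.
def denote(cmd):
    kind = cmd[0]
    if kind == "skip":
        return lambda s: s                    # identity morphism
    if kind == "assign":                      # ("assign", var, expr_fn)
        _, var, expr = cmd
        return lambda s: {**s, var: expr(s)}
    if kind == "seq":                         # ("seq", c1, c2): run c1 then c2
        _, c1, c2 = cmd
        f, g = denote(c1), denote(c2)
        return lambda s: g(f(s))
    if kind == "if":                          # ("if", cond_fn, c_then, c_else)
        _, cond, c1, c2 = cmd
        f, g = denote(c1), denote(c2)
        return lambda s: f(s) if cond(s) else g(s)
    raise ValueError(kind)

prog = ("seq",
        ("assign", "x", lambda s: 7),
        ("if", lambda s: s["x"] > 5,
               ("assign", "y", lambda s: s["x"] * 2),
               ("skip",)))
denote(prog)({})   # -> {'x': 7, 'y': 14}
```

Here `skip` is the identity and `seq` is composition, so denotations form (the morphisms of) a category with states as a single object — the observation the notes develop abstractly.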

### Approximating Markov Processes By Averaging

"... Normally, one thinks of probabilistic transition systems as taking an initial probability distribution over the state space into a new probability distribution representing the system after a transition. We, however, take a dual view of Markov processes as transformers of bounded measurable function ..."

Abstract

Normally, one thinks of probabilistic transition systems as taking an initial probability distribution over the state space into a new probability distribution representing the system after a transition. We, however, take a dual view of Markov processes as transformers of bounded measurable functions. This is very much in the same spirit as a “predicate-transformer” view, which is dual to the state-transformer view of transition systems. We redevelop the theory of labelled Markov processes from this viewpoint; in particular, we explore approximation theory. We obtain three main results: (i) It is possible to define bisimulation on general measure spaces and show that it is an equivalence relation. The logical characterization of bisimulation can be done straightforwardly and generally. (ii) A new and flexible approach to approximation based on averaging can be given. This vastly generalizes and streamlines the idea of using conditional expectations to compute approximations. (iii) We show that there is a minimal process bisimulation-equivalent to a given process, and this minimal process is obtained as the limit of the finite approximants.
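For a finite state space, the dual view can be made concrete: a Markov kernel transforms a function by averaging it over successor states, (Tf)(s) = Σ_{s'} P(s, s') f(s'), and conditional expectation over a partition is the finite analogue of the approximation by averaging. A finite-state Python sketch (the paper's setting is general measure spaces; the uniform weights below are a simplifying assumption):

```python
# P: dict state -> (dict state -> prob); f: dict state -> value.
def transform(P, f):
    # The dual action of the kernel on functions: average f over successors.
    return {s: sum(p * f[t] for t, p in row.items()) for s, row in P.items()}

def average(f, partition):
    # Replace f by its mean on each block: conditional expectation with
    # respect to the sigma-algebra the partition generates (uniform weights).
    g = {}
    for block in partition:
        mean = sum(f[s] for s in block) / len(block)
        for s in block:
            g[s] = mean
    return g

P = {"a": {"a": 0.5, "b": 0.5}, "b": {"b": 1.0}}
f = {"a": 0.0, "b": 1.0}
transform(P, f)                  # -> {'a': 0.5, 'b': 1.0}
average(f, [["a", "b"]])         # -> {'a': 0.5, 'b': 0.5}
```

Coarsening the partition coarsens the approximant; refining it recovers more of the process, which is the intuition behind taking the minimal process as a limit of finite approximants.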

### Causal Theories: A Categorical Perspective on Bayesian Networks

"... It’s been an amazing year, and I’ve had a good time learning and thinking about the contents of this essay. A number of people have had significant causal influence on this. Foremost among these is my dissertation supervisor Jamie Vicary, who has been an excellent guide throughout, patient as I’ve j ..."

Abstract

It’s been an amazing year, and I’ve had a good time learning and thinking about the contents of this essay. A number of people have had significant causal influence on this. Foremost among these is my dissertation supervisor Jamie Vicary, who has been an excellent guide throughout, patient with my vague questions as I’ve jumped from idea to idea, and yet careful to ensure I’ve stayed on track. We’ve had some great discussions too, and I thank him for them. John Baez got me started on this general topic, has responded enthusiastically and generously to probably too many questions, and, with the support of the Centre for Quantum Technologies, Singapore, let me come visit him to pester him with more. Bob Coecke has been a wonderful and generous general supervisor, always willing to talk and advise, and has provided many of the ideas that lurk in the background of those here. I thank both of them too. I also thank Rob Spekkens, Dusko Pavlovic, Prakash Panangaden, and Samson Abramsky for some interesting discussions.

### The Expectation Monad in Quantum Foundations

"... The expectation monad is introduced abstractly via two composable adjunctions, but concretely captures measures. It turns out to sit in between known monads: on the one hand the distribution and ultrafilter monad, and on the other hand the continuation monad. This expectation monad is used in two pr ..."

Abstract

The expectation monad is introduced abstractly via two composable adjunctions, but concretely captures measures. It turns out to sit in between known monads: on the one hand the distribution and ultrafilter monad, and on the other hand the continuation monad. This expectation monad is used in two probabilistic analogues of fundamental results of Manes and Gelfand for the ultrafilter monad: algebras of the expectation monad are convex compact Hausdorff spaces, and are dually equivalent to so-called Banach effect algebras. These structures capture states and effects in quantum foundations, and the duality between them. Moreover, the approach leads to a new re-formulation of Gleason’s theorem, expressing that effects on a Hilbert space are free effect modules on projections, obtained via tensoring with the unit interval.
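In the finite, discrete case the expectation monad can be sketched as a restricted continuation monad: an element of E(X) is a functional sending a [0,1]-valued function on X to a number in [0,1], the unit is evaluation, and a discrete distribution embeds as its expectation functional — which is how E(X) sits between the distribution and continuation monads. An illustrative Python encoding (the paper works axiomatically via adjunctions, not through this representation):

```python
# An element of E(X) is modelled as a callable (X -> [0,1]) -> [0,1].
def unit(x):
    # The unit: evaluation at the point x.
    return lambda f: f(x)

def bind(e, k):
    # Continuation-style bind: e is an expectation functional on X,
    # k maps each x to an expectation functional on Y.
    return lambda f: e(lambda x: k(x)(f))

def from_dist(dist):
    # Embed a discrete distribution [(x, p)] as its expectation functional.
    return lambda f: sum(p * f(x) for x, p in dist)

die = from_dist([(i, 1 / 6) for i in range(1, 7)])
die(lambda n: n / 6)            # expected value of n/6, i.e. 3.5/6
parity = bind(die, lambda n: unit(n % 2))
parity(lambda b: float(b))      # probability the roll is odd, i.e. 0.5
```

Linearity and monotonicity of `from_dist(...)` are exactly the kind of conditions that carve expectation functionals out of the full continuation monad.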