Results 11–20 of 22
Modeling Genome Evolution with a DSEL for Probabilistic Programming
In 8th Int. Symp. on Practical Aspects of Declarative Languages, 2006
Abstract

Cited by 1 (1 self)
Abstract. Many scientific applications benefit from simulation. However, programming languages used in simulation, such as C++ or Matlab, approach problems from a deterministic procedural view, which seems to differ, in general, from many scientists' mental representation. We apply a domain-specific language for probabilistic programming to the biological field of gene modeling, showing how the mental-model gap may be bridged. Our system assisted biologists in developing a model for genome evolution by separating the concerns of model and simulation and providing implicit probabilistic nondeterminism.
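The "implicit probabilistic nondeterminism" this abstract describes can be illustrated with a minimal sketch (Python rather than the paper's DSEL; `choose`, `mutate`, `evolve`, and the mutation rates are all invented for illustration): the model is written as ordinary sequential code, and each call to `choose` implicitly draws from a distribution, so running the model repeatedly yields a simulation.

```python
import random

def choose(outcomes):
    """Draw one outcome from a {value: probability} table."""
    values = list(outcomes.keys())
    weights = list(outcomes.values())
    return random.choices(values, weights=weights, k=1)[0]

def mutate(base):
    """Toy genome-evolution step: a base either survives or mutates."""
    if choose({"keep": 0.97, "mutate": 0.03}) == "keep":
        return base
    return choose({b: 1.0 for b in "ACGT" if b != base})

def evolve(genome, generations):
    """Run the model: each generation independently mutates each base."""
    for _ in range(generations):
        genome = "".join(mutate(b) for b in genome)
    return genome
```

Each run of `evolve` is one sampled trajectory; the modeler never writes the sampling loop or the random-number plumbing, which is the separation of model and simulation the abstract claims.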
From Bayesian notation to pure Racket, via measure-theoretic probability in λZFC
In: Implementation and Application of Functional Languages, 2010
Abstract

Cited by 1 (1 self)
Abstract. Bayesian practitioners build models of the world without regarding how difficult it will be to answer questions about them. When answering questions, they put off approximating as long as possible, and usually must write programs to compute converging approximations. Writing the programs is distracting, tedious and error-prone, and we wish to relieve them of it by providing languages and compilers. Their style constrains our work: the tools we provide cannot approximate early. Our approach to meeting this constraint is to 1) determine their notation's meaning in a suitable theoretical framework; 2) generalize our interpretation in an uncomputable, exact semantics; 3) approximate the exact semantics and prove convergence; and 4) implement the approximating semantics in Racket (formerly PLT Scheme). In this way, we define languages with at least as much exactness as Bayesian practitioners have in mind, and also put off approximating as long as possible. In this paper, we demonstrate the approach using our preliminary work on discrete (countably infinite) Bayesian models.
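The "approximate as late as possible" strategy for countably infinite discrete models can be sketched as follows (a Python illustration of the general idea, not the paper's λZFC semantics or its Racket implementation): the exact model is kept symbolic, and a query is answered by a converging sequence of finite truncations of the infinite sum.

```python
from fractions import Fraction

def geometric_mass(p, k):
    """Exact mass of outcome k under Geometric(p), for k = 0, 1, 2, ..."""
    return (1 - p) ** k * p

def approx_prob(pred, p, depth):
    """Truncate the infinite sum at `depth` outcomes (a lower bound)."""
    return sum(geometric_mass(p, k) for k in range(depth) if pred(k))

# P(k is even) under Geometric(1/2) is exactly 2/3; each truncation is a
# lower bound, and the sequence of truncations converges to the answer.
p = Fraction(1, 2)
approximations = [approx_prob(lambda k: k % 2 == 0, p, d) for d in (2, 8, 32)]
```

Exact rational arithmetic keeps the approximation error attributable entirely to truncation, mirroring the abstract's insistence that no precision be given up before the final approximating step.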
Build your own probability monads (DRAFT)
Abstract
Build your own probability monads. Probability is often counterintuitive, and it always involves a great deal of math. This is unfortunate, because many applications in robotics and AI increasingly rely on probability theory. We introduce a modular toolkit for constructing probability monads, and show that it can be used for everything from discrete distributions to weighted particle filtering. This modular approach allows us to present a single, easy-to-use API for working with many kinds of probability distributions. Our toolkit combines several existing components (the list monad, the Rand monad, and the MaybeT monad transformer) with a stripped-down version of WriterT Prob, and a new monad for sequential Monte Carlo sampling. Using these components, we show that MaybeT can be used to implement Bayes' theorem. We also show how to implement a monad for weighted particle filtering.
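A discrete probability monad of the kind this abstract describes can be sketched in a few lines (a Python analogue for illustration; the paper's toolkit is Haskell, and the `Dist` class below is a hypothetical name, not its API):

```python
from fractions import Fraction

class Dist:
    """A finite discrete distribution: outcome -> probability."""
    def __init__(self, pairs):
        merged = {}
        for v, p in pairs:
            merged[v] = merged.get(v, Fraction(0)) + p
        self.pairs = merged

    @staticmethod
    def uniform(values):
        values = list(values)
        p = Fraction(1, len(values))
        return Dist([(v, p) for v in values])

    def bind(self, f):
        """Monadic bind: feed each outcome to f, scaling by its probability."""
        out = []
        for v, p in self.pairs.items():
            for w, q in f(v).pairs.items():
                out.append((w, p * q))
        return Dist(out)

    def prob(self, pred):
        return sum((p for v, p in self.pairs.items() if pred(v)), Fraction(0))

# Two dice: build the distribution of the sum, then query P(sum = 7).
die = Dist.uniform(range(1, 7))
two = die.bind(lambda a: die.bind(lambda b: Dist([(a + b, Fraction(1))])))
```

`bind` is the monadic composition: it threads each outcome through a probabilistic continuation and scales by its probability, which is what lets distributions compose without explicit bookkeeping.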
Stochastic Reasoning in Hybrid Linear Logic, 2009
Abstract
Ordinary linear implication can represent unconstrained state transition, but stateful systems often operate under temporal and stochastic constraints which impede the use of linear logic as a framework for representing stateful computations. We propose a general modal extension of linear logic where the worlds represent the constraints, and hybrid connectives combine constraint reasoning with ordinary logical reasoning. Among the merits of this logic is a generic focused sequent calculus that can be used to internalize the rules of particular stateful systems; we illustrate this with a simple adequate encoding of the synchronous stochastic pi-calculus.
Toward Interactive Statistical Modeling
Abstract
When solving machine learning problems, there is currently little automated support for easily experimenting with alternative statistical models or solution strategies. This is because this activity often requires expertise from several different fields (e.g., statistics, optimization, linear algebra), and the level of formalism required for automation is much higher than for a human solving problems on paper. We present a system toward addressing these issues, which we achieve by (1) formalizing a type theory for probability and optimization, and (2) providing an interactive rewrite system for applying problem reformulation theorems. Automating solution strategies this way enables not only manual experimentation but also higher-level, automated activities, such as auto-tuning. Keywords: machine learning, algorithm derivation, interactive modeling, type theory
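An interactive rewrite system that applies reformulation theorems can be illustrated with a toy analogue (Python; the term representation, `rewrite_once`, and the `drop_constant` rule are all invented for illustration and are not the paper's type theory or theorem library):

```python
def rewrite_once(term, rules):
    """Apply the first matching rule, searching from the root down.

    Terms are nested tuples like ("argmin", ("+", ("f", "x"), 3));
    a rule returns a rewritten term on a match and None otherwise.
    """
    for name, rule in rules:
        new = rule(term)
        if new is not None:
            return new, name
    if isinstance(term, tuple):
        op, *args = term
        for i, arg in enumerate(args):
            new, name = rewrite_once(arg, rules)
            if name is not None:
                args[i] = new
                return (op, *args), name
    return term, None

def drop_constant(term):
    """Reformulation theorem: argmin_x (f(x) + c)  ==  argmin_x f(x)."""
    if isinstance(term, tuple) and term[0] == "argmin":
        body = term[1]
        if isinstance(body, tuple) and body[0] == "+" and isinstance(body[2], (int, float)):
            return ("argmin", body[1])
    return None

rules = [("drop_constant", drop_constant)]
simplified, used = rewrite_once(("argmin", ("+", ("f", "x"), 3)), rules)
```

Returning the name of the rule applied is what makes the process interactive: the user can inspect which theorem fired at each step, accept it, or try a different rule.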
A Programming Language for Precision/Cost Tradeoffs, 2009
Abstract
Many computational systems need to deal with various forms of imprecision and uncertainty in their data; it is also the case that many systems, especially mobile and distributed systems, must be able to trade off the precision of their data and operations against the cost of performing those operations. Unfortunately, for many applications, trying to make these tradeoffs severely complicates the program, because there does not yet exist a programming model that gives the programmer the ability to easily describe the relevant tradeoffs between precision and cost of operations or to express in an algorithm what tradeoffs are appropriate under what circumstances. This paper lays a solid foundation for exploring
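A speculative sketch of the kind of precision/cost tradeoff this abstract motivates (Python illustration only; the function names and the choice of the Leibniz series are assumptions, not the paper's programming model): an operation is asked for an answer at a given precision, and coarser answers cost fewer steps.

```python
def estimate_pi(terms):
    """Leibniz series for pi: more terms = higher precision, higher cost."""
    total = 0.0
    for k in range(terms):
        total += (-1) ** k / (2 * k + 1)
    return 4 * total, terms  # (estimate, cost measured in series terms)

def pi_to_precision(eps):
    """Pick a term count whose error bound meets eps, doubling as needed."""
    # The Leibniz error after n terms is bounded by 4 / (2n + 1).
    n = 1
    while 4 / (2 * n + 1) > eps:
        n *= 2
    return estimate_pi(n)
```

The point of the sketch is that the caller states the precision it needs and the cost is chosen mechanically, rather than the programmer hard-wiring one accuracy for all uses.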
Abstract
There has been great interest in creating probabilistic programming languages to simplify the coding of statistical tasks; however, there still does not exist a formal language that simultaneously provides (1) continuous probability distributions, (2) the ability to naturally express custom probabilistic models, and (3) probability density functions (PDFs). This collection of features is necessary for mechanizing fundamental statistical techniques. We formalize the first probabilistic language that exhibits these features, and it serves as a foundational framework for extending the ideas to more general languages. Particularly novel are our type system for absolutely continuous (AC) distributions (those which permit PDFs) and our PDF calculation procedure, which calculates PDFs for a large class of AC distributions. Our formalization paves the way toward the rigorous encoding of powerful statistical reformulations.
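The change-of-variables rule at the heart of any PDF calculation for transformed absolutely continuous distributions can be shown concretely (a Python illustration of the standard textbook rule, not the paper's formal procedure): for Y = g(X) with invertible g, f_Y(y) = f_X(g⁻¹(y)) · |d g⁻¹/dy|.

```python
import math

def uniform_pdf(x):
    """Density of X ~ Uniform(0, 1)."""
    return 1.0 if 0.0 < x < 1.0 else 0.0

def transformed_pdf(y):
    """Density of Y = -ln(X) by change of variables.

    The inverse of y = -ln(x) is x = exp(-y), whose derivative is
    -exp(-y), so f_Y(y) = f_X(exp(-y)) * |−exp(-y)|.
    """
    x = math.exp(-y)
    return uniform_pdf(x) * abs(-math.exp(-y))

# transformed_pdf(y) equals exp(-y) for y > 0: the Exponential(1) density.
```

Mechanizing this rule for whole programs, rather than single closed-form transforms, is the hard part the abstract's PDF calculation procedure addresses.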
Bayesian Machine Learning, 2011
Abstract
Abstract. The Bayesian approach to machine learning amounts to inferring posterior distributions of random variables from a probabilistic model of how the variables are related (that is, a prior distribution) and a set of observations of variables. There is a trend in machine learning towards expressing Bayesian models as probabilistic programs. As a foundation for this kind of programming, we propose a core functional calculus with primitives for sampling prior distributions and observing variables. We define combinators for measure transformers, based on theorems in measure theory, and use these to give a rigorous semantics to our core calculus. The original features of our semantics include its support for discrete, continuous, and hybrid measures, and, in particular, for observations of zero-probability events. We compile our core language to a small imperative language that in addition to the measure transformer semantics also has a straightforward semantics via factor graphs, data structures that enable many efficient inference algorithms. We use an existing inference engine for efficient approximate inference of posterior marginal distributions, treating thousands of observations per second for large instances of realistic models.
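The sample/observe style of modeling this abstract describes can be illustrated in the discrete case by exhaustive enumeration (a Python sketch with invented names and numbers; the paper's semantics uses measure transformers and compiles to factor graphs): observing evidence assigns weight zero to contradicting worlds, and the posterior is the renormalized remainder.

```python
from fractions import Fraction

def posterior(model):
    """Enumerate (value, weight) worlds and renormalize the survivors."""
    worlds = list(model())
    total = sum(w for _, w in worlds)
    return {v: w / total for v, w in worlds if w > 0}

def burglary_model():
    # Priors and likelihoods are illustrative numbers, not from the paper.
    for burglary, pb in [(True, Fraction(1, 100)), (False, Fraction(99, 100))]:
        alarm_p = Fraction(9, 10) if burglary else Fraction(1, 20)
        for alarm, pa in [(True, alarm_p), (False, 1 - alarm_p)]:
            # observe(alarm): worlds where the alarm is silent get weight 0.
            weight = pb * pa if alarm else Fraction(0)
            yield burglary, weight

post = posterior(burglary_model)  # P(burglary | alarm)
```

Enumeration only works for finite discrete models; handling continuous, hybrid, and zero-probability observations is exactly where the abstract's measure-transformer machinery comes in.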
Abstract
Abstract
In this paper, we present a simple way to write Z-style specifications of randomized programs. To interpret the written specifications, we propose to use a constructive set theory, called CZ set theory, instead of the classical set theory Z. Since CZ has an interpretation in Martin-Löf's theory of types, it enables us to derive probabilistic programs from correctness proofs of the written specifications. In this work, we give the elementary ideas, and the details will be covered in future work.