Results 1–10 of 29
Probabilistically accurate program transformations
In SAS, 2011
Cited by 24 (12 self)
Abstract. The standard approach to program transformation involves the use of discrete logical reasoning to prove that the transformation does not change the observable semantics of the program. We propose a new approach that, in contrast, uses probabilistic reasoning to justify the application of transformations that may change, within probabilistic accuracy bounds, the result that the program produces. Our new approach produces probabilistic guarantees of the form P(|D| ≥ B) ≤ ε, ε ∈ (0, 1), where D is the difference between the results that the transformed and original programs produce, B is an acceptability bound on the absolute value of D, and ε is the maximum acceptable probability of observing large |D|. We show how to use our approach to justify the application of loop perforation (which transforms loops to execute fewer iterations) to a set of computational patterns.
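A minimal sketch of the loop-perforation idea the abstract describes, assuming a sum-style reduction where skipped iterations can be compensated by rescaling; all names are illustrative, not from the paper:

```python
def exact_sum(xs):
    total = 0.0
    for x in xs:
        total += x
    return total

def perforated_sum(xs, skip_factor=2):
    # Execute only every skip_factor-th iteration, then rescale the
    # partial result to estimate what the full loop would have produced.
    total = 0.0
    for i in range(0, len(xs), skip_factor):
        total += xs[i]
    return total * skip_factor

xs = [float(i) for i in range(1000)]
# The difference D from the abstract: probabilistic reasoning over the
# input distribution would bound how often |D| exceeds a threshold B.
d = abs(perforated_sum(xs) - exact_sum(xs))
```

The transformed loop does half the work; whether the resulting D is acceptable is exactly what the paper's probabilistic guarantees quantify.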
Proofs of randomized algorithms in Coq
 Sci. Comput. Program.
Cited by 16 (1 self)
Abstract. Randomized algorithms are widely used either for finding efficiently approximated solutions to complex problems, for instance primality testing, or for obtaining good average behavior, for instance in distributed computing. Proving properties of such algorithms requires subtle reasoning about both the algorithmic and probabilistic aspects of the programs. Providing tools for the mechanization of such reasoning is consequently an important issue. Our paper presents a new method for proving properties of randomized algorithms in a proof assistant based on higher-order logic. It is based on the monadic interpretation of randomized programs as probability distributions [18]. It requires neither the definition of an operational semantics for the language nor the development of a complex formalization of measure theory, but uses only functionals and algebraic properties of the unit interval. Using this model, we show the validity of general rules for estimating the probability that a randomized algorithm satisfies certain properties, in particular in the case of general recursive functions. We apply this theory to formally prove a program implementing a Bernoulli distribution from a coin flip, and the termination of a random walk. All the theories and results presented in this paper have been fully formalized and proved in the Coq proof assistant [19].
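The monadic interpretation the abstract refers to can be sketched concretely for finite distributions; this is an illustrative Python rendering of the general idea (the paper works in Coq), with `unit` and `bind` playing the standard monad roles:

```python
# A randomized program is interpreted as a finite probability
# distribution: a dict mapping outcomes to probabilities.
def unit(x):
    # A deterministic result is a point-mass distribution.
    return {x: 1.0}

def bind(dist, f):
    # Sequence two randomized computations: run f on each outcome,
    # weighting the resulting distribution by that outcome's probability.
    out = {}
    for x, p in dist.items():
        for y, q in f(x).items():
            out[y] = out.get(y, 0.0) + p * q
    return out

coin = {0: 0.5, 1: 0.5}  # one fair coin flip

# Two flips, summed: the monadic composition of two coin programs.
two_flips = bind(coin, lambda a: bind(coin, lambda b: unit(a + b)))
```

Properties such as "the probability that the result satisfies P" then become sums over the distribution, which is the kind of estimate the paper's rules mechanize.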
Probabilistic Functional Programming in Haskell
 Journal of Functional Programming
Cited by 14 (6 self)
At the heart of functional programming rests the principle of referential transparency, which in particular means that a function f applied to a value x always yields one and the same value y = f(x). This principle seems to be violated when contemplating the use of functions to describe probabilistic events, such as rolling a …
Probabilistic Modelling, Inference and Learning using Logical Theories
Cited by 9 (3 self)
This paper provides a study of probabilistic modelling, inference and learning in a logic-based setting. We show how probability densities, being functions, can be represented and reasoned with naturally and directly in higher-order logic, an expressive formalism not unlike the (informal) everyday language of mathematics. We give efficient inference algorithms and illustrate the general approach with a diverse collection of applications. Some learning issues are also considered.
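The key observation, that a density is just a function and so can be manipulated like any other value, can be illustrated outside the logic itself; a small Python sketch (the paper works in higher-order logic, and these helper names are illustrative):

```python
import math

# A density is a function R -> R>=0, so it can be constructed,
# passed around, and combined like any other value.
def normal_density(mu, sigma):
    def pdf(x):
        z = (x - mu) / sigma
        return math.exp(-0.5 * z * z) / (sigma * math.sqrt(2 * math.pi))
    return pdf

def integrate(f, a, b, n=10000):
    # Crude midpoint rule, enough to sanity-check normalization.
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

p = normal_density(0.0, 1.0)
mass = integrate(p, -8.0, 8.0)  # should be close to 1
```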
Natively probabilistic computing
2009
Cited by 7 (1 self)
I introduce a new set of natively probabilistic computing abstractions, including probabilistic generalizations of Boolean circuits, backtracking search and pure Lisp. I show how these tools let one compactly specify probabilistic generative models, generalize and parallelize widely used sampling algorithms like rejection sampling and Markov chain Monte Carlo, and solve difficult Bayesian inference problems. I first introduce Church, a probabilistic programming language for describing probabilistic generative processes that induce distributions, which generalizes Lisp, a language for describing deterministic procedures that induce functions. I highlight the ways randomness meshes with the reflectiveness of Lisp to support the representation of structured, uncertain knowledge, including nonparametric Bayesian models from the current literature, programs for decision making under uncertainty, and programs that learn very simple programs from data. I then introduce systematic stochastic search, a recursive algorithm for exact and approximate sampling that generalizes a …
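Rejection sampling, one of the algorithms the abstract says these abstractions generalize, has a very short core: run the generative process repeatedly until the condition holds. A Python sketch (Church itself is a Lisp dialect; these names are illustrative):

```python
import random

def rejection_sample(propose, accept, rng):
    # Repeatedly run the generative process until the condition holds:
    # the simplest way to sample from a conditional distribution.
    while True:
        x = propose(rng)
        if accept(x):
            return x

rng = random.Random(0)
# Condition two coin flips on their sum being at least 1.
sample = rejection_sample(
    lambda r: (r.randint(0, 1), r.randint(0, 1)),
    lambda ab: ab[0] + ab[1] >= 1,
    rng,
)
```

Every returned sample satisfies the condition by construction, which is what makes rejection a correct (if sometimes slow) conditional sampler.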
Formalization of Continuous Probability Distributions
 In Conference on Automated Deduction, volume 4603 of LNAI, 2007
Cited by 6 (6 self)
In order to overcome the limitations of state-of-the-art simulation-based probabilistic analysis, we propose to perform probabilistic analysis within the environment of a higher-order-logic theorem prover. The foremost requirement for conducting such analysis is the formalization of probability distributions. In this report, we present a methodology for the formalization of continuous probability distributions for which the inverse of the cumulative distribution function can be expressed in a closed mathematical form. Our methodology is primarily based on the formalization of the Standard Uniform random variable, cumulative distribution function properties, and the Inverse Transform method. The report presents all of this formalization using the HOL theorem prover. In order to illustrate the practical effectiveness of our methodology, the formalization of a few continuous probability distributions has also been included.
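The Inverse Transform method the abstract relies on is short to state: if U is Standard Uniform and F is an invertible CDF, then F⁻¹(U) is distributed according to F. A Python sketch for the exponential case (the report formalizes this in HOL; the function names here are illustrative):

```python
import math
import random

def exponential_via_inverse_transform(lam, rng):
    # For Exponential(lam), F(x) = 1 - exp(-lam * x), so the inverse
    # CDF is F^{-1}(u) = -ln(1 - u) / lam; applying it to a Standard
    # Uniform sample yields an Exponential sample.
    u = rng.random()
    return -math.log(1.0 - u) / lam

rng = random.Random(42)
samples = [exponential_via_inverse_transform(2.0, rng) for _ in range(100000)]
mean = sum(samples) / len(samples)  # should approach 1/lam = 0.5
```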
Random-world semantics and syntactic independence for expressive languages
2008
Cited by 5 (2 self)
We consider three desiderata for a language combining logic and probability: logical expressivity, random-world semantics, and the existence of a useful syntactic condition for probabilistic independence. Achieving these three desiderata simultaneously is nontrivial. Expressivity can be achieved by using a formalism similar to a programming language, but standard approaches to combining programming languages with probabilities sacrifice random-world semantics. Naive approaches to restoring random-world semantics undermine syntactic independence criteria. Our main result is a syntactic independence criterion that holds for a broad class of highly expressive logics under random-world semantics. We explore various examples including Bayesian networks, probabilistic context-free grammars, and an example from Mendelian genetics. Our independence criterion supports a case-factor inference technique that reproduces both variable elimination for BNs and the inside algorithm for PCFGs.
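Random-world semantics can be made concrete with a toy Bayesian network: each run of the program fixes a value for every random variable, i.e. samples one complete possible world. A Python sketch (the network and its parameters are illustrative, not from the paper):

```python
import random

def sample_world(rng):
    # One run = one possible world assigning every variable a value.
    rain = rng.random() < 0.2
    sprinkler = rng.random() < (0.01 if rain else 0.4)
    wet = rain or sprinkler  # deterministic given its parents
    return {"rain": rain, "sprinkler": sprinkler, "wet": wet}

rng = random.Random(7)
worlds = [sample_world(rng) for _ in range(50000)]
# Probabilities of events are frequencies over sampled worlds; here
# P(wet) = 1 - P(no rain) * P(no sprinkler | no rain) = 1 - 0.8 * 0.6 = 0.52.
p_wet = sum(w["wet"] for w in worlds) / len(worlds)
```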
Formalization of Standard Uniform Random Variable
 Theoretical Computer Science, 2006
Cited by 5 (5 self)
Continuous random variables are widely used to mathematically describe random phenomena in engineering and the physical sciences. In this paper, we present a higher-order logic formalization of the Standard Uniform random variable. We show the correctness of this specification by proving the corresponding probability distribution properties within the HOL theorem prover; the proof steps are also summarized. This formalized Standard Uniform random variable can be transformed to formalize other continuous random variables, such as Uniform, Exponential, Normal, etc., by using various non-uniform random number generation techniques. The formalization of these continuous random variables will enable us to perform error-free probabilistic analysis of systems within the framework of a higher-order-logic theorem prover. For illustration purposes, we present the formalization of the Continuous Uniform random variable based on our Standard Uniform random variable, and then utilize it to perform a simple probabilistic analysis of round-off error in HOL.
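The Continuous Uniform construction the abstract describes is a simple affine transform of the Standard Uniform variable; a Python sketch of that step (the paper's construction is in HOL; names here are illustrative):

```python
import random

def continuous_uniform(a, b, rng):
    # Transform a Standard Uniform sample u in [0, 1) into Uniform(a, b),
    # mirroring how the formalized Standard Uniform variable is reused.
    u = rng.random()
    return a + (b - a) * u

rng = random.Random(1)
xs = [continuous_uniform(3.0, 5.0, rng) for _ in range(100000)]
mean = sum(xs) / len(xs)  # should approach (a + b) / 2 = 4
```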
Measure Transformer Semantics for Bayesian Machine Learning
Cited by 4 (2 self)
Abstract. The Bayesian approach to machine learning amounts to inferring posterior distributions of random variables from a probabilistic model of how the variables are related (that is, a prior distribution) and a set of observations of variables. There is a trend in machine learning towards expressing Bayesian models as probabilistic programs. As a foundation for this kind of programming, we propose a core functional calculus with primitives for sampling prior distributions and observing variables. We define combinators for measure transformers, based on theorems in measure theory, and use these to give a rigorous semantics to our core calculus. The original features of our semantics include its support for discrete, continuous, and hybrid measures and, in particular, for observations of zero-probability events. We compile our core language to a small imperative language that has a straightforward semantics via factor graphs, data structures that enable many efficient inference algorithms. We use an existing inference engine for efficient approximate inference of posterior marginal distributions, treating thousands of observations per second for large instances of realistic models.
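In the discrete case, observing a variable simply reweights the prior and renormalizes, a toy analogue of the measure-transformer view, computable by exhaustive enumeration. A Python sketch (the model and names are illustrative, not the paper's calculus):

```python
# Observation as a transformer on distributions: multiply the prior by
# the likelihood of the observed data, then renormalize.
def posterior(prior, likelihood, observation):
    unnorm = {h: p * likelihood(h, observation) for h, p in prior.items()}
    z = sum(unnorm.values())
    return {h: w / z for h, w in unnorm.items()}

# Two coins: fair or double-headed, equally likely a priori.
prior = {"fair": 0.5, "trick": 0.5}

def likelihood(h, obs):
    p_heads = 0.5 if h == "fair" else 1.0
    return p_heads if obs == "heads" else 1.0 - p_heads

post = posterior(prior, likelihood, "heads")  # trick coin now twice as likely
```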
Propagation Networks: A Flexible and Expressive Substrate for Computation
2009
Cited by 4 (1 self)
In this dissertation I propose a shift in the foundations of computation. Modern programming systems are not expressive enough. The traditional image of a single computer that has global effects on a large memory is too restrictive. The propagation paradigm replaces this with computing by networks of local, independent, stateless machines interconnected with stateful storage cells. In so doing, it offers great flexibility and expressive power, and has therefore been much studied, but has not yet been tamed for general-purpose computation. The novel insight that should finally permit computing with general-purpose propagation is that a cell should not be seen as storing a value, but as accumulating information about a value. Various forms of the general idea of propagation have been used with great success for various special purposes; perhaps the most immediate example is constraint propagation in constraint satisfaction systems. This success is evidence both …
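The core of the propagation paradigm, stateless machines wired to stateful cells that accumulate information, fits in a few lines. A minimal Python sketch (the dissertation's system is in Scheme and far richer; this class and its methods are illustrative):

```python
class Cell:
    # A cell accumulates information about a value; here the simplest
    # case, where information is either "nothing yet" or a single value.
    def __init__(self):
        self.content = None
        self.watchers = []

    def add_content(self, value):
        if self.content is None:
            self.content = value
            for w in self.watchers:
                w()  # new information wakes the attached propagators
        elif self.content != value:
            raise ValueError("contradiction")

def propagator(inputs, output, fn):
    # A propagator is a stateless machine: when all inputs have
    # content, it computes and deposits information in the output.
    def fire():
        if all(c.content is not None for c in inputs):
            output.add_content(fn(*[c.content for c in inputs]))
    for c in inputs:
        c.watchers.append(fire)
    fire()

# Fahrenheit/Celsius as a bidirectional network of two propagators.
f, c = Cell(), Cell()
propagator([f], c, lambda x: (x - 32) * 5 / 9)
propagator([c], f, lambda x: x * 9 / 5 + 32)
f.add_content(212.0)  # information flows to the Celsius cell
```

Because each direction is its own propagator, information added to either cell flows to the other, with contradictory information detected rather than silently overwritten.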