Results 1 – 8 of 8
NONCOMPUTABLE CONDITIONAL DISTRIBUTIONS
Cited by 7 (3 self)
Abstract. We study the computability of conditional probability, a fundamental notion in probability theory and Bayesian statistics. In the elementary discrete setting, a ratio of probabilities defines conditional probability. In more general settings, conditional probability is defined axiomatically, and the search for more constructive definitions is the subject of a rich literature in probability theory and statistics. However, we show that in general one cannot compute conditional probabilities. Specifically, we construct a pair of computable random variables (X, Y) in the unit interval whose conditional distribution P[Y | X] encodes the halting problem. Nevertheless, probabilistic inference has proven remarkably successful in practice, even in infinite-dimensional continuous settings. We prove several results giving general conditions under which conditional distributions are computable. In the discrete or dominated setting, under suitable computability hypotheses, conditional distributions are computable. Likewise, conditioning is a computable operation in the presence of certain additional structure, such as independent absolutely continuous noise.
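In the elementary discrete setting the abstract refers to, conditioning really is just a ratio of probabilities and is therefore computable. A minimal sketch of that ratio definition (hypothetical helper name; exact arithmetic via `fractions` on a finite empirical joint distribution):

```python
from collections import Counter
from fractions import Fraction

def conditional_pmf(samples, x):
    """Conditional pmf P[Y = y | X = x] over a finite sample of (X, Y) pairs,
    computed as the ratio P[X = x, Y = y] / P[X = x].  (The common divisor by
    the total sample size cancels, so we divide pair counts by the X-count.)"""
    joint = Counter(samples)  # counts of each (x, y) pair
    px = sum(c for (xi, _), c in joint.items() if xi == x)
    if px == 0:
        raise ValueError("conditioning event has probability zero")
    return {yi: Fraction(c, px) for (xi, yi), c in joint.items() if xi == x}

# Toy joint distribution given as an empirical sample.
samples = [(0, 0), (0, 1), (0, 1), (1, 0), (1, 1), (1, 1), (1, 1), (1, 1)]
pmf = conditional_pmf(samples, 1)  # P[Y = 1 | X = 1] = 4/5
```

The paper's negative result is precisely that no such finite recipe extends to arbitrary computable random variables on the unit interval.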
Dynamic Enforcement of Knowledge-based Security Policies
Cited by 5 (0 self)
Abstract—This paper explores the idea of knowledge-based security policies, which are used to decide whether to answer a query over secret data based on an estimation of the querier’s (possibly increased) knowledge given the result. Limiting knowledge is the goal of existing information release policies that employ mechanisms such as noising, anonymization, and redaction. Knowledge-based policies are more general: they increase flexibility by not fixing the means to restrict information flow. We enforce a knowledge-based policy by explicitly tracking a model of a querier’s belief about secret data, represented as a probability distribution. We then deny any query that could increase knowledge above a given threshold. We implement query analysis and belief tracking via abstract interpretation using a novel domain we call probabilistic polyhedra, whose design permits trading off precision with performance while ensuring estimates of a querier’s knowledge are sound. Experiments with our implementation show that several useful queries can be handled efficiently, and performance scales far better than would more standard implementations of probabilistic computation based on sampling.
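The paper's probabilistic-polyhedra domain is an abstract interpretation; as a concrete but naive sketch of the underlying policy check, assuming a small finite secret space and hypothetical helper names (exact enumeration stands in for the paper's sound abstraction):

```python
from fractions import Fraction

def bayes_update(prior, query, result):
    """Bayes-update a belief (dict: secret -> probability) after learning
    that query(secret) == result."""
    consistent = {s: p for s, p in prior.items() if query(s) == result}
    z = sum(consistent.values())
    return {s: p / z for s, p in consistent.items()}

def answer_or_deny(prior, query, secret, threshold):
    """Answer the query only if NO possible result would push the querier's
    belief in some single secret value above the threshold.  Quantifying over
    all results (not just the actual one) keeps the denial itself from
    leaking information about the secret."""
    for result in {query(s) for s in prior}:
        post = bayes_update(prior, query, result)
        if max(post.values()) > threshold:
            return None  # deny
    return query(secret)

# Uniform prior over a 2-bit secret.
prior = {s: Fraction(1, 4) for s in range(4)}
# "Is secret >= 2?" leaves at most 1/2 on any value: allowed at threshold 3/5.
ok = answer_or_deny(prior, lambda s: s >= 2, secret=3, threshold=Fraction(3, 5))
# "Is secret == 3?" could pin the secret down exactly: denied.
denied = answer_or_deny(prior, lambda s: s == 3, secret=3, threshold=Fraction(3, 5))
```

The probabilistic-polyhedra domain replaces the explicit enumeration above with sound over-approximations of the posterior, which is what makes the approach scale.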
Computable de Finetti measures
, 2009
Cited by 4 (1 self)
We prove a uniformly computable version of de Finetti’s theorem on exchangeable sequences of real random variables. As a consequence, exchangeable stochastic processes in probabilistic functional programming languages can be automatically rewritten as procedures that do not modify non-local state. Along the way, we prove that a distribution on the unit interval is computable if and only if its moments are uniformly computable.
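The rewriting the abstract describes can be illustrated on the simplest exchangeable process: a Pólya urn updates non-local state on every draw, yet by de Finetti's theorem it is distributionally equivalent to sampling a latent bias once and then drawing i.i.d. flips. A toy sketch (illustrative only, not the paper's construction):

```python
import random

def polya_urn_flips(n, rng):
    """Stateful exchangeable process: each flip mutates the urn counts
    (the kind of non-local state the rewriting eliminates)."""
    heads, tails = 1, 1
    out = []
    for _ in range(n):
        x = rng.random() < heads / (heads + tails)
        heads += x
        tails += 1 - x
        out.append(int(x))
    return out

def de_finetti_flips(n, rng):
    """Equivalent stateless procedure: sample the directing random measure
    once (here theta ~ Uniform[0,1] = Beta(1,1)), then draw i.i.d. flips."""
    theta = rng.random()
    return [int(rng.random() < theta) for _ in range(n)]
```

Both procedures produce the same distribution over sequences; the second one is purely functional, which is the point of the automatic rewriting.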
ON THE COMPUTABILITY OF CONDITIONAL PROBABILITY
Cited by 3 (3 self)
Abstract. We study the problem of computing conditional probabilities, a fundamental operation in statistics and machine learning. In the elementary discrete setting, conditional probability is given by a ratio of probabilities; in more general settings it is defined axiomatically, and the search for more constructive definitions is the subject of a rich literature in probability theory and statistics. In the discrete or dominated setting, under suitable computability hypotheses, conditional probabilities are computable. However, we show that in general one cannot compute conditional probabilities. We do this by constructing a pair of computable random variables in the unit interval whose conditional distribution encodes the halting problem at almost every point. We show that this result is tight, in the sense that given an oracle for the halting problem, one can compute this conditional distribution. On the other hand, we show that conditioning in abstract settings is computable in the presence of certain additional structure, such as independent absolutely continuous noise.
Lightweight Implementations of Probabilistic Programming Languages Via Transformational Compilation
We describe a general method of transforming arbitrary programming languages into probabilistic programming languages with straightforward MCMC inference engines. Random choices in the program are “named” with information about their position in an execution trace; these names are used in conjunction with a database holding values of random variables to implement MCMC inference in the space of execution traces. We encode naming information using lightweight source-to-source compilers. Our method enables us to reuse existing infrastructure (compilers, profilers, etc.) with minimal additional code, implying fast models with low development overhead. We illustrate the technique on two languages, one functional and one imperative: Bher, a compiled version of the Church language which eliminates the interpretive overhead of the original MIT-Church implementation, and Stochastic Matlab, a new open-source language.
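The naming idea can be sketched in a few lines: give each random choice a name derived from its position in the execution, keep the values in a database, and do single-site Metropolis-Hastings by resampling one named choice and re-running the program. This is a hypothetical toy API (resample-from-prior proposals, positional rather than structural names), not the actual Bher or Stochastic Matlab implementation:

```python
import math
import random

class Trace:
    """Each random choice is named by its position in the execution trace;
    values live in a database so a re-run can reuse unchanged choices."""
    def __init__(self, db=None):
        self.db = dict(db or {})
        self.counter = 0
        self.log_weight = 0.0  # log prior of choices + log likelihood of observations

    def flip(self, p):
        name = ("flip", self.counter)  # the "name": position in this trace
        self.counter += 1
        if name not in self.db:
            self.db[name] = random.random() < p
        x = self.db[name]
        self.log_weight += math.log(p if x else 1 - p)
        return x

    def observe_flip(self, p, value):
        self.log_weight += math.log(p if value else 1 - p)

def mh_over_traces(program, steps):
    """Single-site Metropolis-Hastings in the space of execution traces.
    (General proposal corrections for structure-changing traces are omitted.)"""
    trace = Trace()
    result = program(trace)
    samples = []
    for _ in range(steps):
        name = random.choice(list(trace.db))  # pick one choice to resample
        db2 = dict(trace.db)
        del db2[name]
        prop = Trace(db2)                     # re-run, reusing other choices
        prop_result = program(prop)
        if random.random() < math.exp(min(0.0, prop.log_weight - trace.log_weight)):
            trace, result = prop, prop_result
        samples.append(result)
    return samples

def coin_model(t):
    # Is the coin fair or heavily biased?  We observed three heads.
    biased = t.flip(0.5)
    p_heads = 0.9 if biased else 0.5
    for _ in range(3):
        t.observe_flip(p_heads, True)
    return biased
```

Running `mh_over_traces(coin_model, 4000)` yields samples whose fraction of `True` approximates the posterior probability that the coin is biased, which Bayes' rule puts at 0.9³ / (0.9³ + 0.5³) ≈ 0.85.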
Probabilistic programs, computability, and de Finetti measures
The complexity of probabilistic models, especially those involving recursion, has far exceeded the representational capacity of graphical models. Functional programming languages with probabilistic choice operators have recently been proposed as universal representations for statistical modeling (e.g., IBAL [Pfe01], λ◦ [PPT08], Church [GMR+08]). The conditional independence structure of a probabilistic program is not, in general, representable by a graphical model. Rather, it is dynamic and is given by the random control and data flow of the program. These functional probabilistic languages are allied with imperative probabilistic languages (e.g., Infer.NET) and a similar tradition of augmenting logical representations with probabilistic quantifiers (e.g., BLOG [MMR+05],
On the computability and complexity of Bayesian reasoning
If we consider the claim made by some cognitive scientists that the mind performs Bayesian reasoning, and if we simultaneously accept the Physical Church-Turing thesis and thus believe that the computational power of the mind is no more than that of a Turing machine, then what limitations are there to the reasoning abilities of the mind? I give an overview of joint work with Nathanael Ackerman (Harvard, Mathematics) and Cameron Freer (MIT, CSAIL) that bears on the computability and complexity of Bayesian reasoning. In particular, we prove that conditional probability is in general not computable in the presence of continuous random variables. However, in light of additional structure in the prior distribution, such as the presence of certain types of noise, or of exchangeability, conditioning is possible. These results cover most of statistical practice. At the workshop on Logic and Computational Complexity, we presented results on the computational complexity of conditioning, embedding #P-complete problems in the task of computing
Computable de Finetti measures
, 912
We prove a computable version of de Finetti’s theorem on exchangeable sequences of real random variables. As a consequence, exchangeable stochastic processes expressed in probabilistic functional programming languages can be automatically rewritten as procedures that do not modify non-local state. Along the way, we prove that a distribution on the unit interval is computable if and only if its moments are uniformly computable.
Key words: de Finetti’s theorem, exchangeability, computable probability theory, probabilistic programming languages, mutation
2010 MSC: 03D78, 60G09, 68Q10, 03F60, 68N18