Results

### #P-Complete Conditional Distributions

Abstract

Abstract. We study conditional probability from the perspective of complexity theory of functions and operators in analysis, building on work by Ko (1983), Friedman (1984), and Kawamura and Cook (2010). For some random variable X in {0, 1}^ℕ whose distribution is continuous and polynomial-time computable, and some polynomial-time computable function f: {0, 1}^ℕ → [0, 1] for which the random variable f(X) is “polynomially-diffuse”, the function taking (integers encoding) A ∈ {0, 1}^*, an open rational interval B, and an accuracy 2^−i to a rational within 2^−i of the conditional probability that X ∈ A given f(X) ∈ B is shown to be #P-complete. On the other hand, all such functions computing conditional probabilities are in #P.
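The quantity in question can be made concrete with a small simulation. The sketch below is purely illustrative, with invented parameters: the bit string is truncated to finitely many bits, f is taken to be the bit average, and A is the cylinder set fixing the first bit. It estimates the conditional probability by naive sampling; the paper's result concerns exact approximation to accuracy 2^−i, which sampling does not guarantee.

```python
import random

def estimate_conditional(n_bits=20, n_samples=100_000, B=(0.4, 0.6), seed=0):
    """Toy Monte Carlo estimate of Pr(X in A | f(X) in B), where X is a
    uniform bit string, f averages its bits into [0, 1], and A is the
    cylinder set {first bit = 1}.  Illustrative only: the #P-complete
    task is exact approximation to accuracy 2^-i, not naive sampling."""
    random.seed(seed)
    hits_B = hits_AB = 0
    for _ in range(n_samples):
        x = [random.randint(0, 1) for _ in range(n_bits)]
        fx = sum(x) / n_bits                 # f(X): average of the bits
        if B[0] < fx < B[1]:                 # event f(X) in B
            hits_B += 1
            if x[0] == 1:                    # event X in A
                hits_AB += 1
    return hits_AB / hits_B if hits_B else float("nan")
```

Since B here is symmetric about 1/2, the true conditional probability is exactly 1/2, which the estimate approaches as the sample count grows.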

### Equational reasoning for conditioning as disintegration

Abstract

Conditional distributions are widely used for practical inference, even when the condition has zero probability (such as setting a continuous variable to an observed value). This popularity contrasts with the scary pitfalls (such as Borel’s paradox) that beset rigorous treatments of conditioning. In general, conditional expectations may arise that do not correspond to any conditional distribution at all. This gap between theory and practice has made it difficult to automate conditioning calculations soundly. In particular, given a generative model, we want to produce its conditional expectations mechanically, so as to optimize them numerically, simplify them symbolically, and so on. Disintegration [1] is a rigorous approach to conditioning that covers a wide range of applications yet admits intuitive conditional distributions and allows their ‘guilt-free manipulation’. In the present work, we mechanize this approach by adding a ‘Lebesgue measure’ operation to the usual monadic representation of stochastic experiments [2]. We show how to compute conditional distributions by equating expressions in this representation. By ‘compute’, we mean usually producing a symbolic algebraic expression and occasionally giving up in failure, not approximating a real number to arbitrary precision. In fact, the latter task is impossible for conditional probabilities in general [3]. As for the former task, a system for computing…
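The monadic representation of stochastic experiments mentioned above can be sketched in miniature for discrete distributions. Everything below (the `unit`/`bind`/`condition` names and the encoding of a distribution as a list of weighted outcomes) is an illustrative assumption, not the paper's formalism; in particular, the ‘Lebesgue measure’ operation for zero-probability conditions has no finite analogue in this toy.

```python
from fractions import Fraction

# A minimal discrete probability monad: a distribution is a list of
# (outcome, weight) pairs with weights summing to 1.

def unit(x):
    """Deterministic distribution: x with probability 1."""
    return [(x, Fraction(1))]

def bind(dist, k):
    """Sequence an experiment: draw x from dist, then run k(x)."""
    return [(y, p * q) for (x, p) in dist for (y, q) in k(x)]

def condition(dist, pred):
    """Conditioning on a positive-probability event: filter and renormalize."""
    kept = [(x, p) for (x, p) in dist if pred(x)]
    total = sum(p for _, p in kept)
    return [(x, p / total) for (x, p) in kept]

# Example: two fair coins, conditioned on at least one head.
coin = [(0, Fraction(1, 2)), (1, Fraction(1, 2))]
pair = bind(coin, lambda a: bind(coin, lambda b: unit((a, b))))
posterior = condition(pair, lambda ab: ab[0] + ab[1] >= 1)
```

Here `posterior` assigns probability 1/3 to each of the three surviving outcomes; the zero-probability conditions that disintegration handles are precisely what `condition` cannot express.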

### On the computability and complexity of Bayesian reasoning

Abstract

If we consider the claim made by some cognitive scientists that the mind performs Bayesian reasoning, and if we simultaneously accept the Physical Church-Turing thesis and thus believe that the computational power of the mind is no more than that of a Turing machine, then what limitations are there to the reasoning abilities of the mind? I give an overview of joint work with Nathanael Ackerman (Harvard, Mathematics) and Cameron Freer (MIT, CSAIL) that bears on the computability and complexity of Bayesian reasoning. In particular, we prove that conditional probability is in general not computable in the presence of continuous random variables. However, in light of additional structure in the prior distribution, such as the presence of certain types of noise, or of exchangeability, conditioning is possible. These results cover most of statistical practice. At the workshop on Logic and Computational Complexity, we presented results on the computational complexity of conditioning, embedding #P-complete problems in the task of computing…
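The observation that certain kinds of noise restore computability of conditioning can be illustrated with a toy example: when the observation has a density (here, hypothetical Gaussian noise added to a uniform prior), conditioning on an exact observed value reduces to weighting prior samples by the noise density. All names and parameters below are invented for illustration.

```python
import math
import random

def posterior_mean_with_noise(obs, sigma=0.5, n=50_000, seed=1):
    """Estimate E[X | X + noise = obs] for X ~ Uniform(0, 1) and
    Gaussian noise with standard deviation sigma.  Because the noise
    has a density, conditioning reduces to likelihood weighting --
    the kind of added structure under which conditioning is possible."""
    random.seed(seed)
    num = den = 0.0
    for _ in range(n):
        x = random.random()                                  # prior sample
        w = math.exp(-((obs - x) ** 2) / (2 * sigma ** 2))   # noise density
        num += w * x
        den += w
    return num / den
```

With `obs = 0.5` the setup is symmetric, so the posterior mean is 0.5; without the noise density, conditioning on the zero-probability event X = obs would have no such generic recipe.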

### The Principles and Practice of Probabilistic Programming

Abstract

Probabilities describe degrees of belief, and probabilistic inference describes rational reasoning under uncertainty. It is no wonder, then, that probabilistic models have exploded onto the scene of modern artificial intelligence, cognitive science, and applied statistics: these are all sciences of inference under uncertainty. But as probabilistic models have become more sophisticated, the tools to formally describe them and to perform probabilistic inference have wrestled with new complexity. Just as programming beyond the simplest algorithms requires tools for abstraction and composition, complex probabilistic modeling requires new progress in model representation—probabilistic programming languages. These languages provide compositional means for describing complex probability distributions; implementations of these languages provide generic inference engines: tools for performing efficient probabilistic…
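A hedged sketch of the paradigm: a "probabilistic program" as an ordinary function built from random primitives plus an `observe` statement, and a generic inference engine (here, naive rejection sampling) that works for any such model. The API below is invented for illustration and does not correspond to any particular language's interface.

```python
import random

class Reject(Exception):
    """Raised when an observation fails, discarding the current run."""

def flip(p=0.5):
    """Random primitive: a biased coin."""
    return random.random() < p

def observe(condition):
    """Condition the program on `condition` being true."""
    if not condition:
        raise Reject

def infer(model, n=20_000, seed=2):
    """Generic inference engine: run the model repeatedly, keep the
    runs that satisfy every observation, and tally return values."""
    random.seed(seed)
    counts = {}
    for _ in range(n):
        try:
            v = model()
        except Reject:
            continue
        counts[v] = counts.get(v, 0) + 1
    total = sum(counts.values())
    return {v: c / total for v, c in counts.items()}

def two_coins():
    a, b = flip(), flip()
    observe(a or b)          # condition: at least one coin is heads
    return a

posterior = infer(two_coins)
```

The same `infer` engine works unchanged for any model written with `flip` and `observe`, which is the compositionality the abstract describes; here `posterior[True]` approaches the exact answer 2/3.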

### Grounding Lexical Meaning in Core Cognition

, 2012

Abstract

Author’s note: This document is a slightly updated and reformatted extract from a grant proposal to the ONR. As a proposal, it aims to describe useful directions while reviewing existing and pilot work; it has no pretensions to being a systematic, rigorous, or entirely coherent scholarly work. On the other hand, I’ve found that it provides a useful overview of a few ideas on the architecture of natural language that haven’t yet appeared elsewhere. I provide it for those interested, but with all due caveats. Words are potentially one of the clearest windows on human knowledge and conceptual structure. But what do words mean? In this project we aim to construct and explore a formal model of lexical semantics grounded, via pragmatic inference, in core conceptual structures. Flexible human cognition is derived in large part from our ability to imagine possible worlds. A rich set of concepts, intuitive theories, and other mental representations support imagining and reasoning about possible worlds—together we call these core cognition. Here we posit that the collection of core concepts also forms the set of primitive elements available for lexical semantics: word meanings are built from pieces of core cognition. We propose to study lexical semantics in the setting of an architecture for language understanding that integrates literal meaning with pragmatic inference. This architecture supports underspecified and uncertain lexical meaning, leading to subtle interactions between meaning, conceptual structure, and context. We will explore several cases of lexical semantics where these interactions are particularly important: indexicals, scalar adjectives, generics, and modals. We formalize both core cognition and the natural language architecture using the Church probabilistic programming language.
In this project we aim to contribute to our understanding of the connection between words and mental representations; from this we expect to gain critical insights into many aspects of psychology, to construct vastly more useful thinking machines, and to interface natural and artificial intelligences more efficiently.

### Computability and analysis: the legacy of Alan Turing

, 2012

Abstract

For most of its history, mathematics was algorithmic in nature. The geometric claims in Euclid’s Elements fall into two distinct categories: “problems,” which assert that a construction can be carried out to meet a given specification, and “theorems,” which assert that some property holds of a particular geometric…