Results 1–10 of 11
Statistical Foundations for Default Reasoning
, 1993
Abstract

Cited by 45 (8 self)
We describe a new approach to default reasoning, based on a principle of indifference among possible worlds. We interpret default rules as extreme statistical statements, thus obtaining a knowledge base KB comprised of statistical and first-order statements. We then assign equal probability to all worlds consistent with KB in order to assign a degree of belief to a statement φ. The degree of belief can be used to decide whether to defeasibly conclude φ. Various natural patterns of reasoning, such as a preference for more specific defaults, indifference to irrelevant information, and the ability to combine independent pieces of evidence, turn out to follow naturally from this technique. Furthermore, our approach is not restricted to default reasoning; it supports a spectrum of reasoning, from quantitative to qualitative. It is also related to other systems for default reasoning. In particular, we show that the work of [Goldszmidt et al., 1990], which applies maximum entropy ideas t...
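The core idea in this abstract — equal probability over all worlds consistent with KB, with the degree of belief in φ read off as a fraction — can be sketched for the propositional case. This is our minimal illustration, not the paper's construction (the paper works with statistical and first-order statements); the `bird`/`flies` atoms and function names are our own.

```python
from itertools import product

def degree_of_belief(atoms, kb, phi):
    """Fraction of KB-consistent worlds in which phi holds.

    atoms: list of atom names; kb, phi: predicates over a world dict.
    Every consistent world gets equal probability (indifference).
    """
    worlds = [dict(zip(atoms, vals))
              for vals in product([False, True], repeat=len(atoms))]
    consistent = [w for w in worlds if kb(w)]
    if not consistent:
        return None  # KB unsatisfiable: no degree of belief defined
    return sum(phi(w) for w in consistent) / len(consistent)

# Toy query: we know only that Tweety is a bird; ask whether Tweety flies.
kb = lambda w: w["bird"]
phi = lambda w: w["flies"]
print(degree_of_belief(["bird", "flies"], kb, phi))  # 0.5 under pure indifference
```

With only the hard fact "bird" in KB, indifference splits belief evenly over the `flies` atom; the paper's point is that adding extreme statistical statements (defaults) then pushes such fractions toward 0 or 1.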
From Statistics to Beliefs
, 1992
Abstract

Cited by 43 (12 self)
An intelligent agent uses known facts, including statistical knowledge, to assign degrees of belief to assertions it is uncertain about. We investigate three principled techniques for doing this. All three are applications of the principle of indifference, because they assign equal degree of belief to all basic "situations" consistent with the knowledge base. They differ because there are competing intuitions about what the basic situations are. Various natural patterns of reasoning, such as the preference for the most specific statistical data available, turn out to follow from some or all of the techniques. This is an improvement over earlier theories, such as work on direct inference and reference classes, which arbitrarily postulate these patterns without offering any deeper explanations or guarantees of consistency. The three methods we investigate have surprising characterizations: there are connections to the principle of maximum entropy, a principle of maximal independence, an...
Decision Making with Belief Functions: Compatibility and Incompatibility with the Sure-Thing Principle
 JOURNAL OF RISK AND UNCERTAINTY, 8:255–271 (1994)
, 1994
Abstract

Cited by 21 (1 self)
This article studies situations in which information is ambiguous and only part of it can be probabilized. It is shown that the information can be modeled through belief functions if and only if the non-probabilizable information is subject to the principles of complete ignorance. Next the representability of decisions by belief functions on outcomes is justified by means of a neutrality axiom. The natural weakening of Savage's sure-thing principle to unambiguous events is examined and its implications for decision making are identified.
Asymptotic Conditional Probabilities for First-Order Logic
 In Proc. 24th ACM Symp. on Theory of Computing
, 1992
Abstract

Cited by 13 (7 self)
Motivated by problems that arise in computing degrees of belief, we consider the problem of computing asymptotic conditional probabilities for first-order formulas. That is, given first-order formulas φ and θ, we consider the number of structures with domain {1, …, N} that satisfy θ, and compute the fraction of them in which φ is true. We then consider what happens to this probability as N gets large. This is closely connected to the work on 0-1 laws that considers the limiting probability of first-order formulas, except that now we are considering asymptotic conditional probabilities. Although work has been done on special cases of asymptotic conditional probabilities, no general theory has been developed. This is probably due in part to the fact that it has been known that, if there is a binary predicate symbol in the vocabulary, asymptotic conditional probabilities do not always exist. We show that in this general case, almost all the questions one might want to ask (such as d...
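The counting procedure this abstract describes can be made concrete in a tiny unary-vocabulary case (a case where, unlike the binary case the paper studies, the limit is well behaved). The vocabulary, formulas, and function names below are our own illustrative choices: with one unary predicate P, a structure over domain {1, …, N} is just a subset of the domain.

```python
from itertools import product

def conditional_fraction(N):
    """Pr_N(phi | theta) over structures with domain {1,...,N} and one
    unary predicate P, with theta = "exists x. P(x)" and phi = "P(1)"."""
    theta = lambda s: len(s) > 0
    phi = lambda s: 1 in s
    # Each bit vector picks the interpretation of P as a subset of the domain.
    structures = [frozenset(i + 1 for i, b in enumerate(bits) if b)
                  for bits in product([0, 1], repeat=N)]
    sat_theta = [s for s in structures if theta(s)]
    return sum(phi(s) for s in sat_theta) / len(sat_theta)

for N in (2, 5, 10):
    print(N, conditional_fraction(N))  # the fraction tends to 1/2 as N grows
```

Here the asymptotic conditional probability exists and equals 1/2; the paper's results show that once a binary predicate is allowed, such limits can fail to exist at all.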
Asymptotic Conditional Probabilities: The Non-unary Case
 J. SYMBOLIC LOGIC
, 1993
Abstract

Cited by 9 (2 self)
Motivated by problems that arise in computing degrees of belief, we consider the problem of computing asymptotic conditional probabilities for first-order sentences. Given first-order sentences φ and θ, we consider the structures with domain {1, …, N} that satisfy θ, and compute the fraction of them in which φ is true. We then consider what happens to this fraction as N gets large. This extends the work on 0-1 laws that considers the limiting probability of first-order sentences, by considering asymptotic conditional probabilities. As shown by Liogon'kii [Lio69], if there is a non-unary predicate symbol in the vocabulary, asymptotic conditional probabilities do not always exist. We extend this result to show that asymptotic conditional probabilities do not always exist for any reasonable notion of limit. Liogon'kii also showed that the problem of deciding whether the limit exists is undecidable. We analyze the complexity of three problems with respect to this limit: deciding whether it is well-defined, whether it exists, and whether it lies in some nontrivial interval. Matching upper and lower bounds are given for all three problems, showing them to be highly undecidable.
Bayesian Inference with Missing Data Using Bound and Collapse
 Journal of Computational and Graphical Statistics
, 1997
Abstract

Cited by 8 (1 self)
Current Bayesian methods to estimate conditional probabilities from samples with missing data pose serious problems of robustness and computational efficiency. This paper introduces a new method, called Bound and Collapse (bc), able to overcome these problems. When no information is available on the pattern of missing data, bc returns bounds on the possible estimates consistent with the available information. These bounds can then be collapsed to a point estimate using information about the pattern of missing data, if any. Approximations of the variance and of the posterior distribution are proposed, and their accuracy is compared to approximations based on alternative methods in a real data set of polling data subject to non-response. Keywords: Bayesian Estimates; Bound and Collapse; Gibbs Sampling; Ignorability; Imputation; Missing Data. Reference: KMi Technical Report KMi-TR-58, December 1997. Address: Paola Sebastiani, Department of Actuarial Science and Statistics, City Universi...
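The bound-then-collapse idea in this abstract can be sketched for a single binomial proportion. This is our minimal reading of the two steps, not the paper's algorithm (which handles full conditional probability tables and posterior approximations); the variable names and the linear collapse rule are our assumptions.

```python
def bound(k, m, u):
    """Bound step: with k successes, m failures and u unclassified samples,
    any completion of the missing data yields an estimate in [lo, hi]."""
    n = k + m + u
    lo = k / n            # all missing observations counted as failures
    hi = (k + u) / n      # all missing observations counted as successes
    return lo, hi

def collapse(k, m, u, p_missing_success):
    """Collapse step: pick a point inside the bounds using an assumed
    probability that a missing observation is a success."""
    lo, hi = bound(k, m, u)
    return lo + p_missing_success * (hi - lo)

print(bound(30, 50, 20))           # (0.3, 0.5)
print(collapse(30, 50, 20, 0.5))   # about 0.4
```

When nothing is known about the missingness pattern, only the interval is returned; the collapse step is exactly where an ignorability-style assumption enters.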
A Response to "Believing on the basis of evidence"
, 1994
Abstract

Cited by 1 (0 self)
This paper is essentially identical to one that appears in Computational Intelligence
FROM DESCARTES TO TURING: THE COMPUTATIONAL CONTENT OF SUPERVENIENCE
Abstract

Cited by 1 (1 self)
Mathematics can provide precise formulations of relatively vague concepts and problems from the real world, and bring out underlying structure common to diverse scientific areas. Sometimes very natural mathematical concepts lie neglected and not widely understood for many years, before their fundamental relevance is recognised and their explanatory power is fully exploited. The notion of definability in a structure is such a concept, and Turing’s [77] 1939 model of interactive computation provides a fruitful context in which to exercise the usefulness of definability as a powerful and widely applicable source of understanding. In this article we set out to relate this simple idea to one of the oldest and apparently least scientifically approachable of problems — that of realistically modelling how mental properties supervene on physical ones. We will first briefly review the origins with René Descartes of mind-body dualism, and the problem of mental causation.
We will then summarise the subsequent difficulties encountered, and their current persistence, and the more recent usefulness of the concept of supervenience in
Incomputability, Emergence and the Turing Universe
Abstract

Cited by 1 (0 self)
Amongst the huge literature concerning emergence, reductionism and mechanism, there is a role for analysis of the underlying mathematical constraints. Much of the speculation, confusion, controversy and descriptive verbiage might be clarified via suitable modelling and theory. The key ingredients we bring to this project are the mathematical notions of definability and invariance, a computability theoretic framework in a real-world context, and within that, the modelling of basic causal environments via Turing’s 1939 notion of interactive computation over a structure described in terms of reals. Useful outcomes are: a refinement of what one understands to be a causal relationship, including non-mechanistic, irreversible causal relationships; an appreciation of how the mathematically simple origins of incomputability in definable hierarchies are materialised in the real world; and an understanding of the powerful explanatory role of current computability theoretic developments. The theme of this article concerns the way in which mathematics can structure everyday discussions around a range of important issues — and can also reinforce intuitions about theoretical links between different aspects of the real world. This fits with the widespread sense of excitement and expectation felt in many fields — and of a corresponding confusion — and of a tension characteristic of a Kuhnian paradigm shift. What we have below can be seen as tentative steps towards the sort of mathematical modelling needed for such a shift to be completed. In section 1, we outline the decisive role mathematics played in the birth of modern science; and how, more recently, it has helped us towards a better understanding of the nature and limitations of the scientific enterprise. In section 2, we review how the mathematics brings out inherent contradictions in the Laplacian model of scientific activity. And we look at some of the approaches to dealing
On the calculating power of Laplace’s demon (Part I)
, 2006
Abstract
We discuss several ways of making precise the informal concept of physical determinism, drawing on ideas from mathematical logic and computability theory. We outline a programme of investigating these notions of determinism in detail for specific, precisely articulated physical theories. We make a start on our programme by proposing a general logical framework for describing physical theories, and analysing several possible formulations of a simple Newtonian theory from the point of view of determinism. Our emphasis throughout is on clarifying the precise physical and metaphysical assumptions that typically underlie a claim that some physical theory is ‘deterministic’. A sequel paper is planned, in which we shall apply similar methods to the analysis of other physical theories. Along the way, we discuss some possible repercussions of this kind of investigation for both physics and logic.