## Objective Bayesianism with predicate languages (Synthese, 2008)

Citations: 5 (5 self)

### BibTeX

@ARTICLE{Williamson08objectivebayesianism,
  author  = {Jon Williamson},
  title   = {Objective Bayesianism with predicate languages},
  journal = {Synthese},
  year    = {2008}
}

### Abstract

Objective Bayesian probability is often defined over rather simple domains, e.g., finite event spaces or propositional languages. This paper investigates the extension of objective Bayesianism to first-order logical languages. It is argued that the objective Bayesian should choose a probability function, from all those that satisfy constraints imposed by background knowledge, that is closest to a particular frequency-induced probability function which generalises the λ = 0 function of Carnap’s continuum of inductive methods.

### Citations

3044 | Convergence of Probability Measures - Billingsley - 1999
Citation Context: ...a probability measure can be defined over this field. Every finitely additive probability measure defined on C0 is in fact countably additive (Billingsley, 1979, Theorem 2.3), determined by its values on the thin cylinders {ε ∈ Ω : (ε1, . . . , εn) = H} where H ∈ {0, 1}^n, and uniquely extendible to the sigma field C generated by C0 (Billingsley, 1979, Theo...
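The cited result, that a measure on the cylinder field is determined by its values on the thin cylinders, can be illustrated with a minimal sketch. The measure used here is an assumed i.i.d. Bernoulli(p) measure chosen purely for illustration, and the function names are hypothetical:

```python
from itertools import product

def thin_cylinder_prob(H, p=0.5):
    # Probability of the thin cylinder {eps : (eps_1, ..., eps_n) = H}
    # under an illustrative i.i.d. Bernoulli(p) measure.
    return p ** sum(H) * (1 - p) ** (len(H) - sum(H))

def cylinder_prob(indicator, n, p=0.5):
    # A cylinder fixed by the first n coordinates is a finite union of
    # thin cylinders, so its measure is the sum over those it contains.
    return sum(thin_cylinder_prob(H, p)
               for H in product((0, 1), repeat=n) if indicator(H))

# The cylinder "first coordinate is 1", described at depth n = 3,
# contains 4 thin cylinders of probability 1/8 each:
print(cylinder_prob(lambda H: H[0] == 1, 3))  # 0.5
```

Describing the same cylinder at any depth n gives the same value, which is the consistency that licenses the unique extension to the generated sigma field.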

1289 | Judgment under uncertainty: heuristics and biases - Kahneman, Slovic, et al. - 1982
Citation Context: ...en be rearranged to construct a ball twice as large as the original (see, e.g., Wagon, 1985). Further examples of probabilistic violations of intuition abound in the psychology literature (see, e.g., Tversky and Kahneman, 1974). So if intuition is to be a deciding factor then it decides against all flavours of Bayesianism, and indeed against the use of probability at all. Before proceeding to languages with more than one p...

1250 | On information and sufficiency - Kullback, Leibler - 1951 |

364 | A Treatise on Probability - Keynes - 1952
Citation Context: ...ximally equivocal probability function p = . However there may be no unique maximally equivocal probability function. There may be more than one way to equivocate, as witnessed by Bertrand’s paradox (Keynes, 1921, §4.7; Gillies, 2000, pp. 37–49). Or there may be a unique way to equivocate but one which is not representable using a probability function—this is the case with the problem of improper priors encounter...

265 | Logical foundations of probability - Carnap - 1950 |

164 | The selection of prior distributions by formal rules - Kass, Wasserman - 1996
Citation Context: ...ere may be a unique way to equivocate but one which is not representable using a probability function—this is the case with the problem of improper priors encountered by objective Bayesian statisticians (Kass and Wasserman, 1996, §4.2). In this latter situation there are typically many probability functions that are closest to the improper equivocator. This non-uniqueness is not a show-stopper for objective Bayesianism; it j...

90 | The continuum of inductive methods - Carnap - 1952 |

78 | The Well-Calibrated Bayesian - Dawid - 1982
Citation Context: ...ard; a subjective Bayesian must give probability 1 to the proposition that in the long run her degrees of belief will tend to become perfectly calibrated with frequency, however ridiculous her prior (Dawid, 1982); probabilistic considerations imply that a solid ball can be taken apart into finitely many pieces which can then be rearranged to construct a ball twice as large as the original (see, e.g., Wagon, ...

64 | Philosophical Theories of Probability - Gillies - 2000
Citation Context: ...obability function p = . However there may be no unique maximally equivocal probability function. There may be more than one way to equivocate, as witnessed by Bertrand’s paradox (Keynes, 1921, §4.7; Gillies, 2000, pp. 37–49). Or there may be a unique way to equivocate but one which is not representable using a probability function—this is the case with the problem of improper priors encountered by objective Bayes...

62 | The uncertain reasoner’s companion - Paris - 1994
Citation Context: ...ion over the whole language is uniquely determined by its values on the quantifier-free sentences of the language, which are in turn determined by its values on the state descriptions (Gaifman, 1964; Paris, 1994, Chapter 11), because as mentioned above a probability measure on the sigma field C of cylinders is determined by the probabilities of the thin cylinders. Since the probability space remains the same...

53 | The Banach–Tarski Paradox - Wagon - 1985
Citation Context: ...d, 1982); probabilistic considerations imply that a solid ball can be taken apart into finitely many pieces which can then be rearranged to construct a ball twice as large as the original (see, e.g., Wagon, 1985). Further examples of probabilistic violations of intuition abound in the psychology literature (see, e.g., Tversky and Kahneman, 1974). So if intuition is to be a deciding factor then it decides aga...

51 | Bayesian Nets and Causality: Philosophical and Computational Foundations - Williamson - 2005 |

50 | Concerning measures in first order calculi - Gaifman - 1964
Citation Context: ...obability function over the whole language is uniquely determined by its values on the quantifier-free sentences of the language, which are in turn determined by its values on the state descriptions (Gaifman, 1964; Paris, 1994, Chapter 11), because as mentioned above a probability measure on the sigma field C of cylinders is determined by the probabilities of the thin cylinders. Since the probability space rem...

32 | Probability: The deductive and inductive problems - Johnson - 1932 |

24 | Motivating objective Bayesianism: From empirical constraints to objective probabilities - Williamson |

21 | Bayesian conditionalisation and the principle of minimum information - Williams - 1980
Citation Context: ...rning the new evidence imposes the constraint that the evidence should be fully believed and no further constraints, then the maximum entropy update will agree with the results of conditionalisation (Williams, 1980). Thus there is no need for a separate diachronic principle like conditionalisation. Without such a principle, objective Bayesians have certain freedoms not enjoyed by subjectivists: one’s language c...

13 | Objective Bayesian Nets - Williamson |

12 | Probability logic - Williamson - 2002
Citation Context: ...εn) = H} where H ∈ {0, 1}^n, and uniquely extendible to the sigma field C generated by C0 (Billingsley, 1979, Theorem 3.1). Any sentence θ in our extended language corresponds to a cylinder Cθ in C (Williamson, 2002, §2). If the elementary outcomes are thought of as truth valuations of the atomic sentences αi, then Cθ is the set of valuations under which θ is true. Given a probability measure p on C, one can def...

10 | Some observations on induction in predicate probabilistic reasoning - Hill, Paris, et al. - 2002 |

9 | Probabilistic induction in the predicate calculus - Nix - 2005 |

9 | Inductive influence - Williamson |

8 | Jaynes’s maximum entropy prescription and probability theory - Friedman, Shimony - 1971 |

8 | The emergence of reasons conjecture - Paris, Vencovská - 2003
Citation Context: ...is determines a countable propositional language; one can then apply maxent to finite subsets of this language and take limits to extend the resulting probability function to the language as a whole (Paris and Vencovská, 2003). Unfortunately when there is no background knowledge the resulting probability function (which corresponds to λ = ∞ in Carnap’s continuum of inductive methods) suffers from an inability to capture l...
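The λ = ∞ failure to learn from experience mentioned here can be sketched numerically. For a single unary predicate R, Carnap's continuum gives the predictive probability p(next individual is R | k of the first n were R) = (k + λ/2)/(n + λ); the function name below is hypothetical:

```python
def carnap_pred(k, n, lam):
    # Carnap's lambda-continuum for one unary predicate (two cells):
    # p(next is R | k of first n were R) = (k + lam/2) / (n + lam).
    if lam == float("inf"):
        return 0.5  # lambda = infinity: the evidence is ignored entirely
    return (k + lam / 2) / (n + lam)

# After observing R in 9 of 10 individuals:
print(carnap_pred(9, 10, 0))             # 0.9 (straight rule, lambda = 0)
print(carnap_pred(9, 10, 2))             # 10/12 (Laplace's rule of succession)
print(carnap_pred(9, 10, float("inf")))  # 0.5 (no learning from experience)
```

λ = ∞ returns 1/2 whatever the sample, which is the inability to capture learning from experience; λ = 0 gives the observed frequency k/n, the function the paper's abstract says its frequency-induced equivocator generalises.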

8 | Philosophies of probability: Objective Bayesianism and its challenges - Williamson - 2008
Citation Context: ...o delimit background knowledge and render it precise, and how to determine the constraints imposed on degrees of belief by precisely-formulated background knowledge. It also faces its own challenges (Williamson, 2008)—e.g., how to compute the objective Bayesian probability function (objective Bayesian nets may be of some help here—see Williamson (2005b)). But objective Bayesianism has certain advantages over othe...

7 | Countable additivity and subjective probability - Williamson - 1999
Citation Context: ...ust as defined above, as is the measure of distance between probability functions. Note that an agent’s degrees of belief should satisfy all the axioms of probability, including countable additivity (Williamson, 1999). Now limiting frequency does not necessarily satisfy countable additivity: suppose that unary predicates R1, R2, . . . are mutually exclusive and exhaustive and Ri holds only of ai; then freq ∞(Ri) ...
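The counterexample in this context can be checked numerically with a minimal sketch (function name hypothetical): if Ri holds only of ai, then among the first n individuals the frequency of Ri is 1/n, so each limiting frequency freq∞(Ri) = 0, yet the Ri are exhaustive, so countable additivity would require the limiting frequencies to sum to 1.

```python
def finite_freq(i, n):
    # Frequency of R_i among the first n individuals a_1, ..., a_n,
    # where R_i holds only of a_i: 1/n if i <= n, else 0.
    return 1 / n if i <= n else 0.0

n = 1000
print(finite_freq(3, n))  # 0.001 -> tends to 0 as n grows, for every fixed i
# At every finite n the frequencies sum to 1, but the term-wise limits sum to 0:
print(sum(finite_freq(i, n) for i in range(1, n + 1)))  # ~1.0
```

The order of limit and sum cannot be exchanged here, which is exactly the failure of countable additivity for limiting frequency.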

6 | A note on binary inductive logic - Nix, Paris - 2007 |

6 | Why I am not an objective Bayesian. Theory and Decision - Seidenfeld - 1979 |

5 | A critique of Jaynes’ maximum entropy principle - Dias, Shimony - 1981 |

3 | A note on state-descriptions - Bar-Hillel - 1951 |

2 | The problem of relations in inductive logic - Carnap - 1951
Citation Context: ...belief, perhaps via a default constraint of the form p(Ra) = 1/2. More generally, relations often have some structure—e.g., transitivity, irreflexivity—that is known from the outset (Bar-Hillel, 1951; Carnap, 1951). This information must be taken into account when determining objective Bayesian degrees of belief, by imposing the appropriate constraints and choosing the probability function, from all those that...

2 | The principle of conformity and spectrum exchangeability - Landes, Paris, et al. - 2007 |

2 | From unary to binary inductive logic - Paris, Vencovská - 2007 |

2 | Binary induction and Carnap’s continuum - Vencovská - 2006 |