Results 1–7 of 7
Inductive influence
 British Journal for the Philosophy of Science
Abstract

Cited by 7 (6 self)
Objective Bayesianism has been criticised for not allowing learning from experience: it is claimed that an agent must give degree of belief 1/2 to the next raven being black, however many other black ravens have been observed. I argue that this objection can be overcome by appealing to objective Bayesian nets, a formalism for representing objective Bayesian degrees of belief. Under this account, previous observations exert an inductive influence on the next observation. I show how this approach can be used to capture the Johnson–Carnap continuum of inductive methods, as well as the Nix–Paris continuum, and show how inductive influence can ...
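For context (this is a standard presentation of the continuum the abstract names, not text from the paper): the criticism is that a maximum-entropy assignment equivocates at 1/2 regardless of the evidence, whereas the Johnson–Carnap continuum lets observed frequencies pull the degree of belief away from 1/2.

```latex
% Equivocal (maximum-entropy) degree of belief for a two-valued attribute,
% independent of the evidence e:
P(\mathrm{black}(a_{n+1}) \mid e) = \tfrac{1}{2}

% Johnson--Carnap continuum of inductive methods: k attribute values,
% n observations so far, n_i of them of value i, parameter \lambda \ge 0:
c_\lambda(i \mid e) = \frac{n_i + \lambda/k}{n + \lambda}
```

As λ → ∞ the continuum approaches the equivocal value 1/k; as λ → 0 it approaches the observed relative frequency.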
Objective Bayesianism with predicate languages
 Synthese
, 2008
Abstract

Cited by 5 (5 self)
Objective Bayesian probability is often defined over rather simple domains, e.g., finite event spaces or propositional languages. This paper investigates the extension of objective Bayesianism to first-order logical languages. It is argued that the objective Bayesian should choose a probability function, from all those that satisfy constraints imposed by background knowledge, that is closest to a particular frequency-induced probability function which generalises the λ = 0 function of Carnap’s continuum of inductive methods.
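As a reminder (standard notation, not taken from the paper): the λ = 0 member of Carnap's continuum is the "straight rule", which simply returns the observed relative frequency, and which the frequency-induced function mentioned above generalises.

```latex
% Carnap's continuum, with k attribute values and n_i occurrences
% of value i among n observations:
c_\lambda(i \mid e) = \frac{n_i + \lambda/k}{n + \lambda},
\qquad
c_0(i \mid e) = \frac{n_i}{n} \quad (\lambda = 0,\ \text{the straight rule})
```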
A note on binary inductive logic
 JOURNAL OF PHILOSOPHICAL LOGIC
, 2007
Abstract

Cited by 5 (1 self)
We consider the problem of induction over languages containing binary relations and outline a way of interpreting and constructing a class of probability functions on the sentences of such a language. Some principles of inductive reasoning satisfied by these probability functions are discussed, leading in turn to a representation theorem for a more general class of probability functions satisfying these principles.
Probabilistic logic and induction
 J. of Logic and Computation
Abstract

Cited by 1 (0 self)
We give a probabilistic interpretation of first-order formulas based on Valiant's model of pac-learning. We study the resulting notion of probabilistic or approximate truth and take some first steps in developing its model theory. In particular we show that every fixed error parameter determining the precision of universal quantification gives rise to a different class of tautologies. Finally we study the inductive inference of first-order formulas from atomic truths.

1 Introduction

The goal of this paper is to develop a notion of model-theoretic pac-learning and to study the corresponding notion of probabilistic truth. This parallels the fact that Gold's model of language learning [5] can be transformed into a more general model-theoretic one (Osherson et al. [12], see also Terwijn [13]). This has already yielded some interesting results, e.g. connections with the theory of belief revision (Martin and Osherson [11]). The model of pac-learning was introduced by Valiant [15]. This model was the first probabilistic model of learning amenable to a complexity-theoretic analysis of learning tasks, and in subsequent years it became one of the most prominent models in learning-theory research. A good introduction to the theory of this model is Kearns and Vazirani [8]. The connections between logic and probability are old and manifold. An early critic of the use of universal statements outside the synthetic realm of mathematics was the sceptic Sextus Empiricus (2nd–3rd century). He pointed out that without a formal context, where a universal statement can hold by definition, such a statement can only be true when every instance ...
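The idea of an error parameter governing the precision of universal quantification can be sketched as follows. This is a minimal illustration only (the function name, sample, and thresholds are our own, not from the paper): a universal statement counts as approximately true at error ε when the fraction of sampled counterexamples is at most ε.

```python
import random

def approx_universally_true(phi, sample, eps):
    """Treat 'for all x, phi(x)' as approximately true at error eps
    when the observed counterexample rate in the sample is at most eps."""
    failures = sum(1 for x in sample if not phi(x))
    return failures / len(sample) <= eps

random.seed(0)
sample = [random.uniform(0, 1) for _ in range(1000)]

# "Every sampled x is below 0.99" holds on roughly 99% of the points,
# so it is approximately true at eps = 0.05 but may fail at a much
# stricter error parameter -- different eps give different "tautologies".
phi = lambda x: x < 0.99
print(approx_universally_true(phi, sample, 0.05))
```

Varying eps changes which formulas pass, mirroring the paper's point that each fixed error parameter yields a different class of (approximate) tautologies.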
A Characterization of the Language Invariant Families satisfying Spectrum Exchangeability in Polyadic Inductive Logic
, 2008
Abstract

Cited by 1 (1 self)
A necessary and sufficient condition, in terms of a de Finetti style representation, is given for a probability function in Polyadic Inductive Logic to be part of a Language Invariant family satisfying Spectrum Exchangeability. This theorem is then considered in relation to the unary Carnap and Nix–Paris continua.
A Note on Binary Inductive Logic
 ISSN 1749-9097
, 2008