Results 1 – 7 of 7
Information, relevance, and social decision-making: Some principles and results of decision-theoretic semantics
Logic, Language, and Computation, 1999
Abstract

Cited by 17 (0 self)
Abstract. I propose to treat natural language semantics as a branch of pragmatics, identified in the way of C.S. Peirce, F.P. Ramsey, and R. Carnap as decision theory. The notion of relevance plays a key role. It is explicated traditionally, distinguished from a recent homophone, and applied in its natural framework of issue-based communication. Empirical emphasis is on implicature and presupposition. Several theorems are stated and made use of. Items analyzed include ‘or’, ‘not’, ‘but’, ‘even’, and ‘also’. I conclude on parts of mind. This paper submits an approach to meaning, with a focus on broadly non-truth-conditional aspects of natural language. Semantics is treated as a branch of pragmatics, identified as decision theory in the way of C.S. Peirce, F.P. Ramsey, and of Rudolf Carnap in his later work. A key theoretical notion, distinguishable from, but intelligibly related to, information is the positive or negative relevance of a proposition or sentence to another. It is explicated in the probabilistic way familiar from Carnap and traditional in the philosophies of science and rational action. This makes it a representation of local epistemic context-change potential that is directional in a precisely specifiable sense and naturally related to utterers’ instrumental intentions. Relevance so defined is proposed as an explicans for Oswald Ducrot’s insightful ‘valeur argumentative’. In view of possible confusion among some students of language, it is contrasted with a more recent and idiosyncratic pretender to the appellation, due to Dan Sperber and Deirdre Wilson. The latter proposal turns out, at best, to paraphrase H.P. Grice’s nondirectional concepts of ‘informativeness’ and ‘perspicuity’. (More informative designations are suggested for it, and for the eponymous linguistic doctrine emanating from parts of CNRS Paris and of UC London.)
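The traditional probabilistic explication of relevance that the abstract invokes can be stated compactly. The following is a standard Carnapian formulation supplied for orientation, not a formula quoted from the paper:

```latex
% B is positively (negatively) relevant to A, relative to a probability
% function P, iff conditioning on B raises (lowers) the probability of A:
r(A, B) \;=\; P(A \mid B) - P(A),
\qquad
\begin{cases}
r(A,B) > 0 & \text{$B$ is positively relevant to $A$,}\\
r(A,B) < 0 & \text{$B$ is negatively relevant to $A$,}\\
r(A,B) = 0 & \text{$B$ is irrelevant to $A$.}
\end{cases}
```

On this explication relevance is a local context-change quantity: it measures how an utterance of B shifts the epistemic standing of A against a background probability function.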
Recursion Theoretic Models of Learning: Some Results and Intuitions
Annals of Mathematics and Artificial Intelligence, 1995
Abstract

Cited by 5 (2 self)
View of Learning To implement a program that somehow "learns" it is necessary to fix a set of concepts to be learned and develop a representation for the concepts and examples of the concepts. In order to investigate general properties of machine learning it is necessary to work in as representation-independent a fashion as possible. In this work, we consider machines that learn programs for recursive functions. Several authors have argued that such studies are general enough to include a wide array of learning situations [2,3,22,23,24]. For example, a behavior to be learned can be modeled as a set of stimulus and response pairs. Assuming that any behavior associates only one response to each possible stimulus, behaviors can be viewed as functions from stimuli to responses. Some behaviors, such as anger, are not easily modeled as functions. Our primary interest, however, concerns the learning of fundamental behaviors such as reading (mapping symbols to sounds), recognition (mapping pa...
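The recursion-theoretic setting described above can be illustrated with a Gold-style "identification by enumeration" learner: after each stimulus/response example, the learner conjectures the first hypothesis in a fixed enumeration consistent with all data seen so far. The hypothesis class, target function, and data stream below are illustrative assumptions, not taken from the paper:

```python
# Identification by enumeration: a minimal sketch of learning programs for
# (total) functions in the limit. Hypotheses here are linear functions
# f(x) = a*x + b with small integer coefficients -- an illustrative stand-in
# for an effective enumeration of recursive functions.

def make_hypotheses():
    """A finite slice of an effective enumeration of total functions."""
    return [(a, b) for a in range(-3, 4) for b in range(-3, 4)]

def apply_hypothesis(h, x):
    a, b = h
    return a * x + b

def learn_in_the_limit(stream, hypotheses):
    """After each example, output the first hypothesis consistent with all
    data seen so far (Gold-style identification by enumeration)."""
    seen = []
    conjectures = []
    for x, y in stream:
        seen.append((x, y))
        for h in hypotheses:
            if all(apply_hypothesis(h, xi) == yi for xi, yi in seen):
                conjectures.append(h)
                break
    return conjectures

# Target behavior: f(x) = 2x + 1, presented as stimulus/response pairs.
stream = [(x, 2 * x + 1) for x in range(5)]
guesses = learn_in_the_limit(stream, make_hypotheses())
print(guesses[-1])  # the learner's final, stabilized conjecture: (2, 1)
```

If the target lies in the enumerated class, the sequence of conjectures eventually stabilizes on a correct program; this "convergence in the limit" is the success criterion the recursion-theoretic models formalize.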
From Bayesian Epistemology to Inductive Logic, 2013
Abstract

Cited by 2 (2 self)
Inductive logic admits a variety of semantics (Haenni et al., 2011, Part 1). This paper develops semantics based on the norms of Bayesian epistemology (Williamson, 2010, Chapter 7). §1 introduces the semantics and then, in §2, the paper explores methods for drawing inferences in the resulting logic and compares the methods of this paper with the methods of Barnett and Paris (2008). §3 then evaluates this Bayesian inductive logic in the light of four traditional critiques of inductive logic, arguing (i) that it is language independent in a key sense, (ii) that it admits connections with the Principle of Indifference but these connections do not lead to paradox, (iii) that it can capture the phenomenon of learning from experience, and (iv) that while the logic advocates scepticism with regard to some universal hypotheses, such scepticism is not problematic from the point of view of scientific theorising. §1 Bayesian Epistemology as Semantics for Inductive Logic
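A common way of drawing inferences in a Bayesian-flavored inductive logic is to select the maximally equivocal (maximum-entropy) probability function satisfying the premises and read the conclusion's probability off it. The snippet does not show whether this is exactly the paper's method, so the following is a generic toy sketch on a two-atom language, with premise P(a) = 0.8 and query P(b), all chosen for illustration:

```python
# Toy maximum-entropy inference over a propositional language with atoms
# a, b. States are the four truth assignments (a&b, a&~b, ~a&b, ~a&~b).
# Premise: P(a) = 0.8. We brute-force search the constrained simplex for
# the distribution of maximal entropy, then evaluate the query P(b).
from math import log

def entropy(p):
    return -sum(x * log(x) for x in p if x > 0)

best, best_h = None, -1.0
n = 200  # grid resolution for the brute-force search
for i in range(n + 1):
    for j in range(n + 1):
        p_ab = 0.8 * i / n        # the 0.8 mass on a splits over b / ~b
        p_anb = 0.8 - p_ab
        p_nab = 0.2 * j / n       # the 0.2 mass on ~a splits over b / ~b
        p_nanb = 0.2 - p_nab
        p = [p_ab, p_anb, p_nab, p_nanb]
        h = entropy(p)
        if h > best_h:
            best_h, best = h, p

p_b = best[0] + best[2]  # P(b) = P(a&b) + P(~a&b)
print(round(p_b, 2))     # the maximally equivocal answer: 0.5
```

Entropy is maximized by splitting each constrained mass evenly, giving the state distribution (0.4, 0.4, 0.1, 0.1) and hence P(b) = 0.5: the logic remains maximally noncommittal about b, which the premises do not constrain.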
‘Plausibilities of plausibilities’: an approach through circumstances. Being part I of “From ‘plausibilities of plausibilities’ to state-assignment methods” (2006), eprint arXiv:quant-ph/0607111
Abstract

Cited by 2 (2 self)
Probability-like parameters appearing in some statistical models, and their prior distributions, are reinterpreted through the notion of ‘circumstance’, a term which stands for any piece of knowledge that is useful in assigning a probability and that satisfies some additional logical properties. The idea, which can be traced to Laplace and Jaynes, is that the usual inferential reasonings about the probability-like parameters of a statistical model can be conceived as reasonings about equivalence classes of ‘circumstances’ — viz., real or hypothetical pieces of knowledge, like e.g. physical hypotheses, that are useful in assigning a probability and satisfy some additional logical properties — that are uniquely indexed by the probability distributions they lead to. PACS numbers: 02.50.Cw,02.50.Tt,01.70.+w MSC numbers: 03B48,62F15,60A05 If you can’t join ’em, join ’em together.
The Laplace-Jaynes approach to induction. Being part II of “From ‘plausibilities of plausibilities’ to state-assignment methods”
Abstract

Cited by 1 (1 self)
An approach to induction is presented, based on the idea of analysing the context of a given problem into ‘circumstances’. This approach, fully Bayesian in form and meaning, provides a complement or in some cases an alternative to that based on de Finetti’s representation theorem and on the notion of infinite exchangeability. In particular, it gives an alternative interpretation of those formulae that apparently involve ‘unknown probabilities’ or ‘propensities’. Various advantages and applications of the presented approach are discussed, especially in comparison to that based on exchangeability. Generalisations are also discussed. PACS numbers: 02.50.Cw,02.50.Tt,01.70.+w MSC numbers: 03B48,60G09,60A05 Note, to head off a common misconception, that this is in no way to introduce a “probability of a probability”. It is simply convenient to index our hypotheses by parameters [...] chosen to be numerically equal to the probabilities assigned by those hypotheses; this avoids a doubling of our notation. We could easily restate everything so that the misconception could not arise; it would only be rather clumsy notationally and tedious verbally.
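The reading of "unknown probability" formulae that the quoted note defends can be sketched schematically (a reconstruction for orientation, with the circumstances C_j and the indexing parameters introduced here for illustration, not quoted from the paper). The probability of an event E is expanded over a set of mutually exclusive and exhaustive circumstances:

```latex
% Expansion over circumstances C_j (mutually exclusive, exhaustive,
% given background knowledge I):
P(E \mid I) \;=\; \sum_j P(E \mid C_j, I)\, P(C_j \mid I),
\qquad
q_j \;:=\; P(E \mid C_j, I).
```

Each circumstance C_j is then indexed by the parameter q_j numerically equal to the probability it assigns, so the familiar-looking mixture involves no "probability of a probability": P(C_j | I) is an ordinary probability of a hypothesis, and q_j is merely a convenient label.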
From “plausibilities of plausibilities” to state-assignment methods: I. “Plausibilities of plausibilities”: an approach through circumstances (2006), eprint arXiv:quant-ph/0607111
Abstract

Cited by 1 (1 self)
An approach to induction is presented, based on the idea of analysing the context of a given problem into ‘circumstances’. This approach, fully Bayesian in form and meaning, provides a complement or in some cases an alternative to that based on de Finetti’s representation theorem and on the notion of infinite exchangeability. In particular, it gives an alternative interpretation of those formulae that apparently involve ‘unknown probabilities’ or ‘propensities’. Various advantages and applications of the presented approach are discussed, especially in comparison to that based on exchangeability. Generalisations are also discussed. PACS numbers: 02.50.Cw,02.50.Tt,01.70.+w MSC numbers: 03B48,60G09,60A05 Note, to head off a common misconception, that this is in no way to introduce a “probability of a probability”. It is simply convenient to index our hypotheses by parameters [...] chosen to be numerically equal to the probabilities assigned by those hypotheses; this avoids a doubling of our notation. We could easily restate everything so that the misconception could not arise; it would only be rather clumsy notationally and tedious verbally.
§3 Goodman’s New Problem of Induction
Abstract
Journal of Logic, Language and Information, to appear Bayesian probability is normally defined over a fixed language or event space. But in practice language is susceptible to change, and the question naturally arises as to how Bayesian degrees of belief should change as language changes. I argue here that this question poses a serious challenge to Bayesianism. The Bayesian may be able to meet this challenge, however, and I outline a practical method for changing degrees of belief over changes in