Results 11–20 of 177
Information, relevance, and social decision-making: Some principles and results of decision-theoretic semantics
Logic, Language, and Computation, 1999
"... Abstract. I propose to treat natural language semantics as a branch of pragmatics, identified in the way of C.S. Peirce, F.P. Ramsey, and R. Carnap as decisiontheory. The notion of relevance plays a key role. It is explicated traditionally, distinguished from a recent homophone, and applied in its ..."
Abstract

Cited by 16 (0 self)
Abstract. I propose to treat natural language semantics as a branch of pragmatics, identified in the way of C.S. Peirce, F.P. Ramsey, and R. Carnap as decision theory. The notion of relevance plays a key role. It is explicated traditionally, distinguished from a recent homophone, and applied in its natural framework of issue-based communication. Empirical emphasis is on implicature and presupposition. Several theorems are stated and made use of. Items analyzed include ‘or’, ‘not’, ‘but’, ‘even’, and ‘also’. I conclude on parts of mind. This paper submits an approach to meaning, with a focus on broadly non-truth-conditional aspects of natural language. Semantics is treated as a branch of pragmatics, identified as decision theory in the way of C.S. Peirce, F.P. Ramsey, and of Rudolf Carnap in his later work. A key theoretical notion, distinguishable from, but intelligibly related to, information is the positive or negative relevance of a proposition or sentence to another. It is explicated in the probabilistic way familiar from Carnap and traditional in the philosophies of science and rational action. This makes it a representation of local epistemic context-change potential that is directional in a precisely specifiable sense and naturally related to utterers’ instrumental intentions. Relevance so defined is proposed as an explicans for Oswald Ducrot’s insightful ‘valeur argumentative’. In view of possible confusion among some students of language, it is contrasted with a more recent and idiosyncratic pretender to the appellation, due to Dan Sperber and Deirdre Wilson. The latter proposal turns out, at best, to paraphrase H.P. Grice’s non-directional concepts of ‘informativeness’ and ‘perspicuity’. (More informative designations are suggested for it, and for the eponymous linguistic doctrine emanating from parts of CNRS Paris and of UC London.)
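As a gloss (not quoted from the paper), the traditional probabilistic explication of relevance that the abstract invokes is usually stated as a probability-raising condition; the paper's own formulation may differ in detail.

```latex
% Traditional probabilistic explication of relevance (a standard gloss;
% the paper's own definitions may differ in detail).
% E is positively (negatively) relevant to H when it raises (lowers)
% H's probability; the signed difference makes the notion directional.
\[
  r(E, H) \;=\; P(H \mid E) - P(H),
  \qquad
  \begin{cases}
    r(E, H) > 0 & E \text{ positively relevant to } H,\\
    r(E, H) < 0 & E \text{ negatively relevant to } H,\\
    r(E, H) = 0 & E \text{ irrelevant to } H.
  \end{cases}
\]
```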
Representing Bayesian networks within probabilistic Horn abduction
In Proc. Seventh Conf. on Uncertainty in Artificial Intelligence, 1991
"... This paper presents a simple framework for Hornclause abduction, with probabilities associated with hypotheses. It is shown how this representation can represent any probabilistic knowledge representable in a Bayesian belief network. The main contributions are in finding a relationship between logic ..."
Abstract

Cited by 13 (4 self)
This paper presents a simple framework for Horn-clause abduction, with probabilities associated with hypotheses. It is shown how this representation can represent any probabilistic knowledge representable in a Bayesian belief network. The main contributions are in finding a relationship between logical and probabilistic notions of evidential reasoning. This can be used as a basis for a new way to implement Bayesian networks that allows for approximations to the value of the posterior probabilities, and also points to a way that Bayesian networks can be extended beyond a propositional language.
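The abstract only gestures at the encoding; the sketch below illustrates the general idea in Python (independent probabilistic hypotheses plus Horn rules) on a made-up two-node Rain → WetGrass network. The atom names, rule encoding, and enumeration strategy are illustrative assumptions, not Poole's actual syntax or implementation.

```python
from itertools import product

# Minimal sketch (assumed encoding, not Poole's syntax): each row of a
# conditional probability table becomes an independent "assumable" hypothesis,
# and Horn rules derive network variables from parent values plus a hypothesis.

# Disjoint hypothesis classes with their prior probabilities.
hypotheses = {
    "rain_yes":       0.2,   # P(Rain = true)
    "rain_no":        0.8,
    "wet_if_rain":    0.9,   # P(Wet = true | Rain = true)
    "dry_if_rain":    0.1,
    "wet_if_no_rain": 0.3,   # P(Wet = true | Rain = false)
    "dry_if_no_rain": 0.7,
}
disjoint = [("rain_yes", "rain_no"),
            ("wet_if_rain", "dry_if_rain"),
            ("wet_if_no_rain", "dry_if_no_rain")]

# Horn rules: head <- body; body atoms are hypotheses or derived atoms.
# "rain_no" stands in for "not rain", since Horn clauses have no negation.
rules = [
    ("rain", ["rain_yes"]),
    ("wet",  ["rain", "wet_if_rain"]),
    ("wet",  ["rain_no", "wet_if_no_rain"]),
]

def derived(assumed):
    """Forward-chain the Horn rules from a set of assumed hypotheses."""
    facts = set(assumed)
    changed = True
    while changed:
        changed = False
        for head, body in rules:
            if head not in facts and all(b in facts for b in body):
                facts.add(head)
                changed = True
    return facts

def prob(query_atom):
    """Sum the probabilities of all worlds (one pick per disjoint class)
    whose derived closure contains the query atom."""
    total = 0.0
    for picks in product(*disjoint):
        world_p = 1.0
        for h in picks:
            world_p *= hypotheses[h]
        if query_atom in derived(picks):
            total += world_p
    return total

# prints P(Wet = true) = 0.2*0.9 + 0.8*0.3 = 0.42 (up to float rounding)
print(prob("wet"))
```

Enumerating one hypothesis per disjoint class and summing the probabilities of the worlds that derive the query reproduces the marginal of the corresponding Bayesian network, which is the correspondence the paper establishes in general.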
A Bayesian Account of Independent Evidence with Application
Philosophy of Science 68 (Proceedings): S123–S140, 2001
"... Abstract. A Bayesian account of independent evidential support is outlined. This account is partly inspired by the work of C.S. Peirce. I show that a large class of quantitative Bayesian measures of confirmation satisfy some basic desiderata suggested by Peirce for adequate accounts of independent e ..."
Abstract

Cited by 13 (5 self)
Abstract. A Bayesian account of independent evidential support is outlined. This account is partly inspired by the work of C.S. Peirce. I show that a large class of quantitative Bayesian measures of confirmation satisfy some basic desiderata suggested by Peirce for adequate accounts of independent evidence. I argue that, by considering further natural constraints on a probabilistic account of independent evidence, all but a very small class of Bayesian measures of confirmation can be ruled out. In closing, another application of my account to the problem of evidential diversity is also discussed.
1 Terminology, Notation & Basic Assumptions
The present paper is concerned with the degree of incremental confirmation provided by evidential propositions E for hypotheses under test H, given background evidence K, according to relevance measures of degree of confirmation c. We say that c is a relevance measure of degree of confirmation if and only if c satisfies the following constraints, in cases where E confirms, disconfirms, or is confirmationally irrelevant to H, given background evidence K.
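The excerpt breaks off before the constraints themselves; the standard relevance-measure conditions it is leading up to are, as a gloss (the paper's exact statement may differ):

```latex
% Standard defining constraints on a relevance measure of confirmation c
% (a reconstruction of the clause the excerpt breaks off at; the paper's
% exact statement may differ):
\[
  c(H, E \mid K)
  \begin{cases}
    > 0 & \text{if } P(H \mid E \wedge K) > P(H \mid K)
          \quad (E \text{ confirms } H \text{ given } K),\\
    = 0 & \text{if } P(H \mid E \wedge K) = P(H \mid K)
          \quad (E \text{ is irrelevant to } H \text{ given } K),\\
    < 0 & \text{if } P(H \mid E \wedge K) < P(H \mid K)
          \quad (E \text{ disconfirms } H \text{ given } K).
  \end{cases}
\]
```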
Bluff your way in the second law of thermodynamics
Stud. Hist. Phil. Mod. Phys., 2001
"... The aim of this article is to analyse the relation between the second law of thermodynamics and the socalled arrow of time. For this purpose, a number of different aspects in this arrow of time are distinguished, in particular those of time(a)symmetry and of (ir)reversibility. Next I review versio ..."
Abstract

Cited by 13 (2 self)
The aim of this article is to analyse the relation between the second law of thermodynamics and the so-called arrow of time. For this purpose, a number of different aspects of this arrow of time are distinguished, in particular those of time-(a)symmetry and of (ir)reversibility. Next I review versions of the second law in the work of Carnot, Clausius, Kelvin, Planck, Gibbs, Carathéodory and Lieb and Yngvason, and investigate their connection with these aspects of the arrow of time. It is shown that this connection varies a great deal along with these formulations of the second law. According to the famous formulation by Planck, the second law expresses the irreversibility of natural processes. But in many other formulations irreversibility or even time-asymmetry plays no role. I therefore argue for the view that the second law has nothing to do with the arrow of time.
On Reichenbach's common cause principle and Reichenbach's notion of common cause
"... It is shown that, given any finite set of pairs of random events in a Boolean algebra which are correlated with respect to a fixed probability measure on the algebra, the algebra can be extended in such a way that the extension contains events that can be regarded as common causes of the correlation ..."
Abstract

Cited by 12 (5 self)
It is shown that, given any finite set of pairs of random events in a Boolean algebra which are correlated with respect to a fixed probability measure on the algebra, the algebra can be extended in such a way that the extension contains events that can be regarded as common causes of the correlations in the sense of Reichenbach's definition of common cause. It is shown, further, that, given any quantum probability space and any set of commuting events in it which are correlated with respect to a fixed quantum state, the quantum probability space can be extended in such a way that the extension contains common causes of all the selected correlations, where common cause is again taken in the sense of Reichenbach's definition. It is argued that these results very strongly restrict the possible ways of disproving Reichenbach's Common Cause Principle.
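For reference, Reichenbach's definition of a common cause C of a correlation between events A and B, in its standard textbook form:

```latex
% Reichenbach's screening-off conditions for C to be a common cause of the
% correlation P(A \wedge B) > P(A)P(B) (standard textbook statement):
\begin{align*}
  P(A \wedge B \mid C)      &= P(A \mid C)\,P(B \mid C), \\
  P(A \wedge B \mid \neg C) &= P(A \mid \neg C)\,P(B \mid \neg C), \\
  P(A \mid C)               &> P(A \mid \neg C), \\
  P(B \mid C)               &> P(B \mid \neg C).
\end{align*}
% These four conditions jointly entail the correlation P(A \wedge B) > P(A)P(B).
```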
Reichenbach's Common Cause Principle and Quantum Field Theory
1997
"... Reichenbach's principle of a probabilistic common cause of probabilistic correlations is formulated in terms of relativistic quantum field theory and the problem is raised whether correlations in relativistic quantum field theory between events represented by projections in local observable algebras ..."
Abstract

Cited by 12 (6 self)
Reichenbach's principle of a probabilistic common cause of probabilistic correlations is formulated in terms of relativistic quantum field theory, and the problem is raised whether correlations in relativistic quantum field theory between events represented by projections in local observable algebras A(V1) and A(V2) pertaining to spacelike separated spacetime regions V1 and V2 can be explained by finding a probabilistic common cause of the correlation in Reichenbach's sense. While this problem remains open, it is shown that if all superluminal correlations predicted by the vacuum state between events in A(V1) and A(V2) have a genuinely probabilistic common cause, then the local algebras A(V1) and A(V2) must be statistically independent in the sense of C*-independence.
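For orientation, the statistical-independence notion invoked at the end is standardly defined as follows (given here as background, not quoted from the paper):

```latex
% C*-independence of the local algebras A(V_1) and A(V_2): every pair of
% partial states extends to a joint state on the algebra they generate.
% (Standard definition from algebraic quantum field theory; S(.) denotes
% the state space.)
\[
  \forall\, \varphi_1 \in S\bigl(\mathcal{A}(V_1)\bigr),\;
  \forall\, \varphi_2 \in S\bigl(\mathcal{A}(V_2)\bigr),\;
  \exists\, \varphi \in S(\mathcal{C}) :\quad
  \varphi\big|_{\mathcal{A}(V_1)} = \varphi_1
  \;\text{ and }\;
  \varphi\big|_{\mathcal{A}(V_2)} = \varphi_2,
\]
% where \mathcal{C} is the C*-algebra generated by A(V_1) and A(V_2).
```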
Causal inference using the algorithmic Markov condition
2008
"... Inferring the causal structure that links n observables is usually basedupon detecting statistical dependences and choosing simple graphs that make the joint measure Markovian. Here we argue why causal inference is also possible when only single observations are present. We develop a theory how to g ..."
Abstract

Cited by 11 (11 self)
Inferring the causal structure that links n observables is usually based upon detecting statistical dependences and choosing simple graphs that make the joint measure Markovian. Here we argue why causal inference is also possible when only single observations are present. We develop a theory of how to generate causal graphs explaining similarities between single objects. To this end, we replace the notion of conditional stochastic independence in the causal Markov condition with the vanishing of conditional algorithmic mutual information and describe the corresponding causal inference rules. We explain why a consistent reformulation of causal inference in terms of algorithmic complexity implies a new inference principle that also takes into account the complexity of conditional probability densities, making it possible to select among Markov equivalent causal graphs. This insight provides a theoretical foundation of a heuristic principle proposed in earlier work. We also discuss how to replace Kolmogorov complexity with decidable complexity criteria. This can be seen as an algorithmic analog of replacing the empirically undecidable question of statistical independence with practical independence tests that are based on implicit or explicit assumptions on the underlying distribution.
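The quantity replacing conditional stochastic independence is the conditional algorithmic mutual information; one standard formulation (stated here as a gloss, not quoted from the paper) is:

```latex
% Conditional algorithmic mutual information in a standard formulation:
% K is prefix Kolmogorov complexity, z* a shortest program computing z,
% and the equality holds up to additive constants.
\[
  I(x : y \mid z) \;=\; K(x \mid z^{*}) + K(y \mid z^{*}) - K(x, y \mid z^{*}).
\]
% The algorithmic analogue of conditional independence, as used in the
% algorithmic Markov condition, is then the (near) vanishing of this quantity:
\[
  x \;\perp\!\!\!\perp\; y \;\mid\; z
  \quad\Longleftrightarrow\quad
  I(x : y \mid z) \;\approx\; 0 .
\]
```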
Foundations for Bayesian networks
2001
"... Bayesian networks are normally given one of two types of foundations: they are either treated purely formally as an abstract way of representing probability functions, or they are interpreted, with some causal interpretation given to the graph in a network and some standard interpretation of probabi ..."
Abstract

Cited by 11 (7 self)
Bayesian networks are normally given one of two types of foundations: they are either treated purely formally as an abstract way of representing probability functions, or they are interpreted, with some causal interpretation given to the graph in a network and some standard interpretation of probability given to the probabilities specified in the network. In this chapter I argue that current foundations are problematic, and put forward new foundations which involve aspects of both the interpreted and the formal approaches. One standard approach is to interpret a Bayesian network objectively: the graph in a Bayesian network represents causality in the world and the specified probabilities are objective, empirical probabilities. Such an interpretation founders when the Bayesian network independence assumption (often called the causal Markov condition) fails to hold. In §2 I catalogue the occasions when the independence assumption fails, and show that such failures are pervasive. Next, in §3, I show that even where the independence assumption does hold objectively, an agent’s causal knowledge is unlikely to satisfy the assumption with respect to her subjective probabilities, and that slight differences between an agent’s subjective Bayesian network and an objective Bayesian network can lead to large differences between probability distributions determined by these networks. To overcome these difficulties I put forward logical Bayesian foundations in §5. I show that if the graph and probability specification in a Bayesian network are thought of as an agent’s background knowledge, then the agent is most rational if she adopts the probability distribution determined by the
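For reference, the independence assumption discussed throughout (the causal Markov condition) and the factorization it licenses, in their standard textbook form:

```latex
% Causal Markov condition and the induced factorization, in standard textbook
% form (stated for orientation; the chapter's own formulation may add
% interpretive qualifications). Each variable is independent of its
% non-descendants, conditional on its parents in the graph:
\[
  X_i \;\perp\!\!\!\perp\; \mathrm{ND}(X_i) \;\big|\; \mathrm{Pa}(X_i)
  \qquad \text{for every node } X_i,
\]
% which, for a DAG, is equivalent to the joint distribution factorizing as
\[
  P(X_1, \dots, X_n) \;=\; \prod_{i=1}^{n} P\bigl(X_i \mid \mathrm{Pa}(X_i)\bigr).
\]
```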
Causal Inference By Choosing Graphs With Most Plausible Markov Kernels
In Ninth International Symposium on Artificial Intelligence and Mathematics, 2006
"... We propose a new inference rule for estimating causal structure that underlies the observed statistical dependencies among n random variables. Our method is based on comparing the conditional distributions of variables given their direct causes (the socalled "Markov kernels") for all hypothetica ..."
Abstract

Cited by 11 (10 self)
We propose a new inference rule for estimating the causal structure that underlies the observed statistical dependencies among n random variables. Our method is based on comparing the conditional distributions of variables given their direct causes (the so-called "Markov kernels") for all hypothetical causal directions and choosing the most plausible one. We consider those Markov kernels most plausible which maximize the (conditional) entropies constrained by their observed first moment (expectation) and second moments (variance and covariance with its direct causes) based on their given domain. In this
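One concrete instance of the entropy-maximization step (a gloss, not the paper's full criterion, which also handles constrained domains): on an unbounded real domain, the conditional density maximizing entropy subject to fixed first and second moments is Gaussian.

```latex
% Maximum-entropy Markov kernel under moment constraints on an unbounded real
% domain (a standard fact, stated as a gloss; for bounded domains the
% maximizer differs). Maximizing the differential entropy
%   h(p) = -\int p(x) \log p(x)\, dx
% subject to a fixed mean \mu and variance \sigma^2 gives the Gaussian kernel
\[
  p^{*}(x \mid \text{direct causes}) \;=\;
  \frac{1}{\sqrt{2\pi\sigma^{2}}}
  \exp\!\Bigl(-\frac{(x-\mu)^{2}}{2\sigma^{2}}\Bigr),
\]
% with \mu and \sigma^2 fixed to the observed (conditional) first and
% second moments.
```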
Thermodynamics and Garbage Collection
In ACM SIGPLAN Notices, 1994
"... INTRODUCTION Computer scientists should have a knowledge of abstract statistical thermodynamics. First, computer systems are dynamical systems, much like physical systems, and therefore an important first step in their characterization is in finding properties and parameters that are constant over ..."
Abstract

Cited by 11 (0 self)
INTRODUCTION
Computer scientists should have a knowledge of abstract statistical thermodynamics. First, computer systems are dynamical systems, much like physical systems, and therefore an important first step in their characterization is in finding properties and parameters that are constant over time (i.e., constants of motion). Second, statistical thermodynamics successfully reduces macroscopic properties of a system to the statistical behavior of large numbers of microscopic processes. As computer systems become large assemblages of small components, an explanation of their macroscopic behavior may also be obtained as the aggregate statistical behavior of their component parts. If not, the elegance of the statistical thermodynamical approach can at least provide inspiration for new classes of models. Third, the components of computer systems are approaching the same size as the microscopic pr