Results 1–10 of 11
Foundations for Bayesian networks
, 2001
Abstract

Cited by 11 (7 self)
Bayesian networks are normally given one of two types of foundations: they are either treated purely formally as an abstract way of representing probability functions, or they are interpreted, with some causal interpretation given to the graph in a network and some standard interpretation of probability given to the probabilities specified in the network. In this chapter I argue that current foundations are problematic, and put forward new foundations which involve aspects of both the interpreted and the formal approaches. One standard approach is to interpret a Bayesian network objectively: the graph in a Bayesian network represents causality in the world and the specified probabilities are objective, empirical probabilities. Such an interpretation founders when the Bayesian network independence assumption (often called the causal Markov condition) fails to hold. In §2 I catalogue the occasions when the independence assumption fails, and show that such failures are pervasive. Next, in §3, I show that even where the independence assumption does hold objectively, an agent’s causal knowledge is unlikely to satisfy the assumption with respect to her subjective probabilities, and that slight differences between an agent’s subjective Bayesian network and an objective Bayesian network can lead to large differences between probability distributions determined by these networks. To overcome these difficulties I put forward logical Bayesian foundations in §5. I show that if the graph and probability specification in a Bayesian network are thought of as an agent’s background knowledge, then the agent is most rational if she adopts the probability distribution determined by the
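The abstract's talk of "the probability distribution determined by" a network can be made concrete with a toy sketch (purely illustrative, not from the paper): a graph plus the specified conditional probabilities fixes a full joint distribution via factorisation. The network, variable names, and numbers below are all invented for illustration.

```python
from itertools import product

# Hypothetical two-parent network: Rain -> WetGrass <- Sprinkler.
# All numbers are invented for illustration.
p_rain = {True: 0.2, False: 0.8}        # P(Rain)
p_sprinkler = {True: 0.1, False: 0.9}   # P(Sprinkler)
p_wet = {                               # P(WetGrass | Rain, Sprinkler)
    (True, True): 0.99, (True, False): 0.90,
    (False, True): 0.80, (False, False): 0.00,
}

def joint(r, s, w):
    """Joint probability via the network factorisation
    P(r, s, w) = P(r) * P(s) * P(w | r, s)."""
    pw = p_wet[(r, s)]
    return p_rain[r] * p_sprinkler[s] * (pw if w else 1.0 - pw)

# The graph and specified probabilities determine a proper distribution:
total = sum(joint(r, s, w) for r, s, w in product((True, False), repeat=3))
print(round(total, 10))  # 1.0
```

The factorisation step is exactly where the independence assumption (causal Markov condition) enters: multiplying the local tables is only legitimate if each variable is independent of its non-descendants given its parents.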
Observation and Superselection in Quantum Mechanics
, 1995
Abstract

Cited by 7 (0 self)
We attempt to clarify the main conceptual issues in approaches to 'objectification' or 'measurement' in quantum mechanics which are based on superselection rules. Such approaches venture to derive the emergence of classical 'reality' relative to a class of observers; those believing that the classical world exists intrinsically and absolutely are advised against reading this paper. The prototype approach (K. Hepp, Helv. Phys. Acta 45 (1972), 237–248), where superselection sectors are assumed in the state space of the apparatus, is shown to be untenable. Instead, one should couple system and apparatus to an environment, and postulate superselection rules for the latter. These are motivated by the locality of any observer or other (actual or virtual) monitoring system. In this way 'environmental' solutions to the measurement problem (H. D. Zeh, Found. Phys.
Countable Additivity and Subjective Probability
 British Journal for the Philosophy of Science
, 1999
Abstract

Cited by 6 (4 self)
While there are several arguments on either side, it is far from clear whether countable additivity is an acceptable axiom of subjective probability. I focus here on de Finetti's central argument against countable additivity and provide a new Dutch book proof of the principle, to argue that if we accept the Dutch book foundations of subjective probability, countable additivity is an unavoidable constraint.
Joint Probabilities
Abstract

Cited by 6 (1 self)
When combining information from multiple sources and attempting to estimate the probability of a conclusion, we often find ourselves in the position of knowing the probability of the conclusion conditional on each of the individual sources, but we have no direct information about the probability of the conclusion conditional on the combination of sources. The probability calculus provides no way of computing such joint probabilities. This paper introduces a new way of combining probabilistic information to estimate joint probabilities. It is shown that on a particular conception of objective probabilities, clear sense can be made of second-order probabilities (probabilities of probabilities), and these can be related to combinatorial theorems about proportions in finite sets as the sizes of the sets go to infinity. There is a rich mathematical theory consisting of such theorems, and the theorems generate corresponding theorems about second-order probabilities. Among the latter are a number of theorems to the effect that certain inferences from probabilities to probabilities, although not licensed by the probability calculus, have probability 1 of producing correct results. This does not mean that they will always produce correct results, but the set of cases in which the inferences go wrong forms a set of measure 0. Among these
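The claim that the probability calculus cannot compute P(C | A and B) from P(C | A) and P(C | B) can be checked directly with a small counterexample (my own illustration, not the paper's construction): two distributions that agree on both single-source conditionals yet disagree on the joint conditional.

```python
from itertools import product

def cond(dist, event, given):
    """P(event | given) under a finite distribution {world: prob}."""
    num = sum(p for w, p in dist.items() if event(w) and given(w))
    den = sum(p for w, p in dist.items() if given(w))
    return num / den

# Worlds are (a, b, c) triples of 0/1.
# d1: A, B fair independent coins, and C = A XOR B.
d1 = {(a, b, a ^ b): 0.25 for a, b in product((0, 1), repeat=2)}
# d2: A, B, C all fair independent coins.
d2 = {(a, b, c): 0.125 for a, b, c in product((0, 1), repeat=3)}

A = lambda w: w[0] == 1
B = lambda w: w[1] == 1
C = lambda w: w[2] == 1
AandB = lambda w: A(w) and B(w)

for d in (d1, d2):
    print(cond(d, C, A), cond(d, C, B), cond(d, C, AandB))
# Both distributions give P(C|A) = P(C|B) = 0.5,
# but P(C|A and B) is 0.0 under d1 and 0.5 under d2.
```

Since both distributions satisfy the axioms and the given conditionals, no theorem of the calculus alone can fix the joint conditional; that is the gap the paper's second-order-probability machinery is meant to fill.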
Probabilities are single-case, or nothing
 Optics and Spectroscopy
, 2005
Abstract

Cited by 3 (0 self)
Physicists have, hitherto, mostly adopted a frequentist conception of probability, according to which probability statements apply only to ensembles. It is argued that we should, instead, adopt an epistemic, or Bayesian, conception, in which probabilities are conceived as logical constructs rather than physical realities, and in which probability statements do apply directly to individual events. The question is closely related to the disagreement between the orthodox school of statistical thought and the Bayesian school. It has important technical implications (it makes a difference what statistical methodology one adopts). It may also have important implications for the interpretation of the quantum state.
A Note on Global Descriptivism and Putnam's Model-Theoretic Argument
, 1998
Abstract

Cited by 1 (1 self)
According to Putnam's model-theoretic argument, an epistemically ideal theory cannot fail to be true. Lewis contends that all the argument really shows is that an epistemically ideal theory must be true provided a certain theory of reference, which he terms Global Descriptivism, is the whole truth about reference, which he emphatically denies. In this note it is argued that Lewis grants Putnam too much. However implausible Global Descriptivism may be as a comprehensive account of reference, on what appears to be the only reasonable construal of it, Global Descriptivism does not imply that an epistemically ideal theory must be true. Define Realism as the thesis that even an epistemically ideal theory of the world is not guaranteed to be true. Putnam's model-theoretic argument ([5], [6]) argues against this thesis: an epistemically ideal theory cannot fail to be true, so the argument's conclusion reads. Lewis, in his [3], contends that all the argument really shows is that an epistemica...
Facts, Values and Quanta
 Foundations of Physics
, 2005
Abstract

Cited by 1 (0 self)
Quantum mechanics is a fundamentally probabilistic theory (at least so far as the empirical predictions are concerned). It follows that, if one wants to properly understand quantum mechanics, it is essential to clearly understand the meaning of probability statements. The interpretation of probability has excited nearly as much philosophical controversy as the interpretation of quantum mechanics. 20th-century physicists have mostly adopted a frequentist conception. In this paper it is argued that we ought, instead, to adopt a logical or Bayesian conception. The paper includes a comparison of the orthodox and Bayesian theories of statistical inference. It concludes with a few remarks concerning the implications for the concept of physical reality.
Self-Organizing Maps as Traveling Computational Templates
Abstract
In this article we approach neural networks as computational templates that travel across various sciences. Traditionally, it has been thought that models are primarily models of some target systems: they are assumed to represent partially or completely their target systems. We argue, instead, that many computational models cannot easily be conceived of in representational terms. Rather, they can be seen as models for various epistemic endeavors. Apart from dealing with the question of representation, we also discuss what implications genuinely cross-disciplinary computational templates such as neural networks have for the organization of science. We use self-organizing maps as an example through which we study these questions.
Science and Religion: The Immersion Solution
, 2005
Abstract
This essay focuses on the cognitive tension between science and religion, in particular on the contradictions between some of the claims of current science and some of the claims in religious texts. My aim is to suggest how some work in the philosophy of science may help to manage this tension. Thus I will attempt to apply
Quantum Mechanics Does Not Require the Continuity of Space
, 2002
Abstract
We show that the experimental verification of Newtonian mechanics and of non-relativistic quantum mechanics does not imply that space is continuous. This provides evidence against the realist interpretation of the most mathematical parts of physics.