Results 1–10 of 17
Foundations for Bayesian networks, 2001
Abstract

Cited by 11 (7 self)
Bayesian networks are normally given one of two types of foundations: they are either treated purely formally as an abstract way of representing probability functions, or they are interpreted, with some causal interpretation given to the graph in a network and some standard interpretation of probability given to the probabilities specified in the network. In this chapter I argue that current foundations are problematic, and put forward new foundations which involve aspects of both the interpreted and the formal approaches. One standard approach is to interpret a Bayesian network objectively: the graph in a Bayesian network represents causality in the world and the specified probabilities are objective, empirical probabilities. Such an interpretation founders when the Bayesian network independence assumption (often called the causal Markov condition) fails to hold. In §2 I catalogue the occasions when the independence assumption fails, and show that such failures are pervasive. Next, in §3, I show that even where the independence assumption does hold objectively, an agent’s causal knowledge is unlikely to satisfy the assumption with respect to her subjective probabilities, and that slight differences between an agent’s subjective Bayesian network and an objective Bayesian network can lead to large differences between probability distributions determined by these networks. To overcome these difficulties I put forward logical Bayesian foundations in §5. I show that if the graph and probability specification in a Bayesian network are thought of as an agent’s background knowledge, then the agent is most rational if she adopts the probability distribution determined by the
Evolutionary Theory and the Reality of Macro Probabilities
Abstract

Cited by 8 (2 self)
Evolutionary theory is awash with probabilities. For example, natural selection is said to occur when there is variation in fitness, and fitness is standardly decomposed into two components, viability and fertility, each of which is understood probabilistically. With respect to viability, a fertilized egg is said to have a certain chance of surviving to reproductive age; with respect to fertility, an adult is said to have an expected number of offspring. There is more to evolutionary theory than the theory of natural selection, and here too one finds probabilistic concepts aplenty. When there is no selection, the theory of neutral evolution says that a gene's chance of eventually reaching fixation is 1/(2N), where N is the number of organisms in the generation of the diploid population to which the gene belongs. The evolutionary consequences of mutation are likewise conceptualized in terms of the probability per unit time a gene has of changing from one state to another. The examples just mentioned are all "forward-directed" probabilities; they describe the probability of later events, conditional on earlier events. However, evolutionary theory also uses "backward probabilities" that describe the probability of a cause conditional on its effects; for example, coalescence theory allows one to calculate the expected number of generations in the past that the genes in the present generation find their most recent common ancestor.
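The 1/(2N) fixation figure quoted above is just the allele's initial frequency among the 2N gene copies of a diploid population; a minimal sketch (the function name and the example population size are illustrative, not from the abstract):

```python
# Sketch of the neutral-fixation claim: a single new neutral gene copy in a
# diploid population of N organisms (2N gene copies) eventually reaches
# fixation with probability 1/(2N), i.e. its initial frequency.
def neutral_fixation_prob(N):
    """Fixation probability of one neutral allele copy among 2N copies."""
    return 1 / (2 * N)

p = neutral_fixation_prob(500)   # 0.001 for a population of 500 organisms
```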
The Direction of Causation: Ramsey's Ultimate Contingency, 1992
Abstract

Cited by 4 (1 self)
Introduction. Our present concern originates with two uncontroversial observations about causation: the causal relation is asymmetric, so that if A is a cause of B then B is not a cause of A; and effects never (or almost never) occur before their causes. Uncontroversial as they may be, these features of causation are far from unproblematic. A philosophical theory of causation thus has these two nontrivial tasks, among others: to explicate the difference between cause and effect (to reveal the true content of the "arrow" of causation, so to speak), and to explain why the arrow of causation is so well aligned with the arrow of time. Note that the latter task permits two readings, depending on whether the temporal reference is read rigidly. On the stronger rigid or de re reading, the question is why the causal arrow points in this particular temporal direction, thought of as fixed independently of our disposition to treat the direction in question as that
A Probability Index of the Robustness of a Causal Inference
Abstract

Cited by 2 (1 self)
Causal inference is an important, controversial topic in the social sciences, where it is difficult to conduct experiments or measure and control for all confounding variables. To address this concern, the present study presents a probability index to assess the robustness of a causal inference to the impact of a confounding variable. The information from existing covariates is used to develop a reference distribution for gauging the likelihood of observing a given value of the impact of a confounding variable. Applications are illustrated with an empirical example pertaining to educational attainment. The methodology discussed in this study allows for multiple partial causes in the complex social phenomena that we study, and informs the controversy about causal inference that arises from the use of statistical models in the social sciences.
An Empirical Critique of Two Versions of the Doomsday Argument: Gott's Line and Leslie's Wedge
Abstract

Cited by 1 (0 self)
I discuss two versions of the doomsday argument. According to "Gott's Line," the fact that the human race has existed for 200,000 years licences the prediction that it will last between 5,100 and 7.8 million more years. According to "Leslie's Wedge," the fact that I currently exist is evidence that increases the plausibility of the hypothesis that the human race will come to an end sooner rather than later. Both arguments rest on substantive assumptions about the sampling process that underlies our observations. These sampling assumptions have testable consequences, and so the sampling assumptions themselves must be regarded as empirical claims. The result of testing some of these consequences is that both doomsday arguments are empirically disconfirmed.

1. Gott's Line

Richard Gott (1993, 1997) presents the following version of the doomsday argument:

1. My present temporal position can be treated as if it were the result of random sampling from the times during which S exists.
2. Hence, there is a probability of 0.95 that my present temporal position is in the middle 95% of S's duration.
3. S began at time t0 and the present date is t1.
4. Hence, there is a probability of 0.95 that S will cease to exist after the passage of a period of time that is greater than (1/39)(t1 - t0) and less than 39(t1 - t0).
5. Hence, we may reasonably predict that S will cease to exist after the passage of a period of time that is greater than (1/39)(t1 - t0) and less than 39(t1 - t0).

Gott used this form of reasoning in 1969 to estimate the durations of the Berlin Wall and the Soviet Union, both of which perished within the intervals he calculated. He also used this argument to predict the duration of the human race; assuming that Homo sapiens has so far existed for 20...
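The interval arithmetic behind Gott's step 4 can be checked numerically. A minimal sketch, assuming only the random-sampling premise; the function name and the 200,000-year input are illustrative, not from the paper:

```python
# If the present moment is a random sample from S's lifetime, then with
# probability c the remaining duration lies between (1-c)/(1+c) and
# (1+c)/(1-c) times the elapsed duration (t1 - t0); with c = 0.95 those
# factors are 1/39 and 39.
def gott_interval(elapsed, confidence=0.95):
    """Return (lower, upper) bounds on S's remaining duration."""
    c = confidence
    lower = elapsed * (1 - c) / (1 + c)
    upper = elapsed * (1 + c) / (1 - c)
    return lower, upper

# Homo sapiens at roughly 200,000 years of elapsed existence:
lo, hi = gott_interval(200_000)
# lo is about 5,128 years and hi about 7.8 million years, consistent with
# the abstract's figures.
```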
A Modest Proposal for Interpreting Structural Explanations
Abstract

Cited by 1 (0 self)
Social sciences face a well-known problem, which is an instance of a general problem faced as well by psychological and biological sciences: the problem of establishing their legitimate existence alongside physics. This, as will become clear, is a problem in metaphysics. I will show how a new account of structural explanations, put forward by Frank Jackson and Philip Pettit, which is designed to solve this metaphysical problem with social sciences in mind, fails to treat the problem in any importantly new way. Then I will propose a more modest approach, and show how it does not deserve the criticism directed at a prototype by Jackson and Pettit.
Almost all learn patterns only
Abstract
Since 1990, many causal discovery algorithms have been developed to learn from sample data. E.g.,
• IC. Verma and Pearl (1991).
• PC. Spirtes, Glymour and Scheines (1993/2000).
• K2. Cooper and Herskovits (1991).
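A hedged sketch of the idea shared by the constraint-based algorithms in this list (IC, PC): start from a complete undirected graph and delete the edge X–Y whenever X and Y test as independent given some conditioning set drawn from their neighbors. The `ci_test` oracle and the toy chain example are illustrative stand-ins, not any published algorithm in full:

```python
from itertools import combinations

def pc_skeleton(variables, ci_test, max_cond=2):
    """Return an undirected adjacency map after constraint-based edge removal."""
    adj = {v: set(variables) - {v} for v in variables}
    for size in range(max_cond + 1):
        for x in variables:
            for y in list(adj[x]):
                # Candidate conditioning sets: current neighbors of x, minus y.
                for s in combinations(adj[x] - {y}, size):
                    if ci_test(x, y, set(s)):
                        adj[x].discard(y)
                        adj[y].discard(x)
                        break
    return adj

# Toy independence oracle for a chain A -> B -> C:
# A and C are independent given B, and nothing else is independent.
def chain_ci(x, y, s):
    return {x, y} == {"A", "C"} and "B" in s

skeleton = pc_skeleton(["A", "B", "C"], chain_ci)
# The A - C edge is removed; A - B and B - C remain.
```

The real algorithms differ in how they order the tests and orient the surviving edges; this sketch shows only the shared skeleton-discovery phase.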
Coincidences and How to Think about Them
Abstract
The naïve see causal connections everywhere. Consider the fact that Evelyn Marie Adams won the New Jersey lottery twice. The naïve find it irresistible to think that this cannot be a coincidence. Maybe the lottery was rigged or perhaps some uncanny higher power placed its hand upon her brow. Sophisticates respond with an indulgent smile and ask the naïve to view Adams' double win within a larger perspective. Given all the lotteries there have been, it isn't at all surprising that someone would win one of them twice. No need to invent conspiracy theories or invoke the paranormal – the double win was a mere coincidence. The naïve focus on a detailed description of the event they think needs to be explained. The New York Times reported Adams' good fortune and said that the odds of this happening by chance are 1 in 17 trillion; this is the probability that Adams would win both lotteries if she purchased a single ticket for each and the drawings were at random. In fact, the newspaper made a small mistake here. If the goal is to calculate the probability of Adams' winning those two lotteries, the reporter should have taken into account the fact that Adams purchased multiple tickets; the newspaper's very low figure should thus have been somewhat higher. However, the sophisticated response is that this modest correction misses the point. For sophisticates, the relevant event to consider is not that Adams won those two lotteries,
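The newspaper's arithmetic and the "modest correction" can be made concrete. The per-drawing odds below are hypothetical round numbers chosen only so their product lands near the reported 1-in-17-trillion figure; the actual drawings' odds are not given in the abstract:

```python
# Hypothetical single-ticket win probabilities for the two drawings
# (illustrative values, not the real New Jersey lottery odds).
p1 = 1 / 3.2e6
p2 = 1 / 5.2e6

# One ticket per drawing: the product is roughly 1 in 16.6 trillion,
# the order of magnitude of the Times' figure.
p_both_single = p1 * p2

def p_both(n1, n2):
    """Approximate chance of winning both drawings with n1 and n2 tickets."""
    # For tiny win probabilities, n tickets multiply the chance by about n;
    # this is the correction the newspaper overlooked.
    return (n1 * p1) * (n2 * p2)

# Buying 10 tickets per drawing raises the chance a hundredfold --
# still minuscule, which is the sophisticates' larger point.
```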
Making Time Stand Still:
Abstract
In a recent article, Elliott Sober responds to challenges to a counterexample that he posed some years earlier to the Principle of the Common Cause (PCC). I agree that Sober has indeed produced a genuine counterexample to the PCC, but argue against the methodological moral that Sober wishes to draw from it. Contrary to Sober, I argue that the possibility of exceptions to the PCC does not undermine its status as a central assumption for methods that endeavor to draw causal conclusions from statistical data.