Results 1–10 of 22
Causal discovery from a mixture of experimental and observational data. In UAI, 1999.
Abstract

Cited by 63 (7 self)
This paper describes a Bayesian method for combining an arbitrary mixture of observational and experimental data in order to learn causal Bayesian networks. Observational data are passively observed. Experimental data, such as that produced by randomized controlled trials, result from the experimenter manipulating one or more variables (typically randomly) and observing the states of other variables. The paper presents a Bayesian method for learning the causal structure and parameters of the underlying causal process that is generating the data, given that (1) the data contains a mixture of observational and experimental case records, and (2) the causal process is modeled as a causal Bayesian network. This learning method was applied using as input various mixtures of experimental and observational data that were generated from the ALARM causal Bayesian network. In these experiments, the absolute and relative quantities of experimental and observational data were varied systematically. For each of these training datasets, the learning method was applied to predict the causal structure and to estimate the causal parameters that exist among randomly selected pairs of nodes in ALARM that are not confounded. The paper reports how these structure predictions and parameter estimates compare with the true causal structures and parameters as given by the ALARM network.
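The counting rule at the heart of this kind of mixed-data learning can be sketched as follows; the function name, record layout, and toy data below are illustrative assumptions, not the paper's own code. The key idea is that when a node's value was set by the experimenter, that case reveals nothing about the node's own conditional distribution, so it is skipped when counting for that node (while still usable for other nodes):

```python
from collections import defaultdict

def family_counts(records, child, parents):
    """Count (parent-configuration, child-value) occurrences for one node,
    skipping records in which `child` was set by intervention: a manipulated
    value carries no information about the node's own CPT."""
    counts = defaultdict(lambda: defaultdict(int))
    for values, manipulated in records:
        if child in manipulated:
            continue  # experimental case: child was clamped, not generated
        key = tuple(values[p] for p in parents)
        counts[key][values[child]] += 1
    return counts

# Mixed data: each record pairs variable values with the set of manipulated
# variables (empty set = purely observational case).
records = [
    ({"A": 1, "B": 1}, set()),    # observational
    ({"A": 0, "B": 0}, set()),    # observational
    ({"A": 1, "B": 0}, {"B"}),    # B was randomized: ignored when scoring B
]
counts = family_counts(records, child="B", parents=["A"])
```

These counts would then feed a standard Bayesian score (e.g. a Dirichlet-multinomial marginal likelihood) exactly as in the purely observational case.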
Causal Inference from Graphical Models, 2001.
Abstract

Cited by 59 (4 self)
Introduction: The introduction of Bayesian networks (Pearl 1986b) and associated local computation algorithms (Lauritzen and Spiegelhalter 1988, Shenoy and Shafer 1990, Jensen, Lauritzen and Olesen 1990) has initiated a renewed interest in understanding causal concepts in connection with modelling complex stochastic systems. It has become clear that graphical models, in particular those based upon directed acyclic graphs, have natural causal interpretations and thus form a base for a language in which causal concepts can be discussed and analysed in precise terms. As a consequence there has been an explosion of writings, not primarily within mainstream statistical literature, concerned with the exploitation of this language to clarify and extend causal concepts. Among these we mention in particular books by Spirtes, Glymour and Scheines (1993), Shafer (1996), and Pearl (2000) as well as the collection of papers in Glymour and Cooper (1999). Very briefly, but fundamentally,
Active Learning of Causal Bayes Net Structure, 2001.
Abstract

Cited by 37 (2 self)
We propose a decision theoretic approach for deciding which interventions to perform so as to learn the causal structure of a model as quickly as possible. Without such interventions, it is impossible to distinguish between Markov equivalent models, even given infinite data. We perform online MCMC to estimate the posterior over graph structures, and use importance sampling to find the best action to perform at each step. We assume the data is discrete-valued and fully observed.
Searching for the causal structure of a vector autoregression. Oxford Bulletin of Economics and Statistics, 2003.
Abstract

Cited by 21 (0 self)
We provide an accessible introduction to graph-theoretic methods for causal analysis. Building on the work of Swanson and Granger (Journal of the American Statistical Association, Vol. 92, pp. 357–367, 1997), and generalizing to a larger class of models, we show how to apply graph-theoretic methods to selecting the causal order for a structural vector autoregression (SVAR). We evaluate the PC (causal search) algorithm in a Monte Carlo study. The PC algorithm uses tests of conditional independence to select among the possible causal orders – or at least to reduce the admissible causal orders to a narrow equivalence class. Our findings suggest that graph-theoretic methods may prove to be a useful tool in the analysis of SVARs.

I. The problem of causal order

Drawing on recent work on the graph-theoretic analysis of causality, we propose and evaluate a statistical procedure for identifying the contemporaneous causal order of a structural vector autoregression. *We thank Marcus Cuda for his help with programming and computational design, Derek Stimel and Ryan Brady for able research assistance, and Oscar Jorda, Stephen Perez, and the participants
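A minimal sketch of the kind of conditional-independence test the PC algorithm relies on in the Gaussian case: the partial correlation of two variables given a conditioning set, computed by regressing both on the conditioning set and correlating the residuals. The function and the toy chain x → y → z are illustrative assumptions, not the authors' code:

```python
import numpy as np

def partial_corr(data, i, j, cond):
    """Partial correlation of columns i and j of `data` given the columns
    listed in `cond`, via residuals of least-squares regressions.
    `data` is an (n_samples, n_vars) array."""
    if cond:
        Z = np.column_stack([np.ones(len(data))] + [data[:, k] for k in cond])
    else:
        Z = np.ones((len(data), 1))  # intercept only
    res_i = data[:, i] - Z @ np.linalg.lstsq(Z, data[:, i], rcond=None)[0]
    res_j = data[:, j] - Z @ np.linalg.lstsq(Z, data[:, j], rcond=None)[0]
    return np.corrcoef(res_i, res_j)[0, 1]

# Toy causal chain x -> y -> z: x and z are marginally correlated
# but (nearly) independent conditional on y.
rng = np.random.default_rng(0)
x = rng.normal(size=5000)
y = x + 0.1 * rng.normal(size=5000)
z = y + 0.1 * rng.normal(size=5000)
data = np.column_stack([x, y, z])
r_marginal = partial_corr(data, 0, 2, [])
r_given_y = partial_corr(data, 0, 2, [1])
```

In PC, a vanishing partial correlation (judged by a significance test such as Fisher's z) removes the edge between the two variables; the surviving skeleton and orientation rules then delimit the admissible causal orders.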
Causal Discovery from Changes. In Proceedings of UAI, 2001.
Abstract

Cited by 19 (1 self)
We propose a new method of discovering causal structures, based on the detection of local, spontaneous changes in the underlying data-generating model. We analyze the classes of structures that are equivalent relative to a stream of distributions produced by local changes, and devise algorithms that output graphical representations of these equivalence classes. We present experimental results, using simulated data, and examine the errors associated with detection of changes and recovery of structures.
Mind Change Optimal Learning of Bayes Net Structure. O. Schulte. In Proceedings of the 20th Annual Conference on Learning Theory, 2007.
Abstract

Cited by 9 (2 self)
Abstract. This paper analyzes the problem of learning the structure of a Bayes net (BN) in the theoretical framework of Gold’s learning paradigm. Bayes nets are one of the most prominent formalisms for knowledge representation and probabilistic and causal reasoning. We follow constraint-based approaches to learning Bayes net structure, where learning is based on observed conditional dependencies between variables of interest (e.g., “X is dependent on Y given any assignment to variable Z”). Applying learning criteria in this model leads to the following results. (1) The mind change complexity of identifying a Bayes net graph over variables V from dependency data is (|V| choose 2), the maximum number of edges. (2) There is a unique fastest mind-change optimal Bayes net learner; convergence speed is evaluated using Gold’s dominance notion of “uniformly faster convergence”. This learner conjectures a graph if it is the unique Bayes net pattern that satisfies the observed dependencies with a minimum number of edges, and outputs “no guess” otherwise. Therefore we are using standard learning criteria to define a natural and novel Bayes net learning algorithm. We investigate the complexity of computing the output of the fastest mind-change optimal learner, and show that this problem is NP-hard (assuming P = RP). To our knowledge this is the first NP-hardness result concerning the existence of a uniquely optimal Bayes net structure.
Causal Discovery from Changes: a Bayesian Approach. In Proceedings of UAI 17, 2001.
Abstract

Cited by 9 (0 self)
We propose a new method of discovering causal structures, based on the detection of local, spontaneous changes in the underlying data-generating model. We derive expressions for the Bayesian score that a causal structure should obtain from streams of data produced by locally changing distributions. Simulation experiments indicate that dynamic information may improve the power of discovery up to the theoretical limits set by statistical indistinguishability.
Summary of biosurveillance-relevant technologies, 2003.
Abstract

Cited by 4 (2 self)
This short report, compiled upon request from Dave Siegrist and Ted Senator, surveys the spectrum of technologies that can help with Biosurveillance. We indicate which we have chosen, so far, to use in our development of analysis methods and our reasons.

1 Time-weighted averaging

This is directly applicable to a scalar signal (such as “number of respiratory cases today”). This method, more commonly used in computational finance, simply compares the count during the current time period with the weighted average of the counts of recent days. Exponential weighting is typically used, where the half-life is known as the “time window” parameter. This time-window parameter is typically chosen by hand. We prefer the Serfling and Univariate HMM methods described below.

2 Serfling method

This method (Serfling, 1963) is a cyclic regression model, and is the standard CDC algorithm for flu detection. It is, again, applicable to scalar signals. It assumes that the signal follows a sinusoid with a period of one year, and thus finds the four parameters a, b, c and d in

y(t) = a + b t + c cos(2πt/52) + d sin(2πt/52)    (t in weeks),

where the parameters are chosen to minimize the sum of squares of residuals. It is an easy matter of regression analysis to determine, on any date, whether
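Both methods described above are simple enough to sketch; the function names, threshold, and synthetic weekly counts below are illustrative assumptions, not the report's code. The first flags a count that exceeds a multiple of the exponentially weighted baseline of earlier counts; the second fits the four-parameter cyclic regression by ordinary least squares:

```python
import numpy as np

def ewma_alarm(counts, half_life, threshold):
    """Time-weighted averaging detector: compare the latest count with the
    exponentially weighted average of all earlier counts. `half_life` is
    the hand-chosen 'time window' parameter (in samples)."""
    decay = 0.5 ** (1.0 / half_life)
    # exponents run from largest (oldest count) down to 1 (yesterday)
    weights = decay ** np.arange(len(counts) - 1, 0, -1)
    baseline = np.dot(weights, counts[:-1]) / weights.sum()
    return counts[-1] > threshold * baseline, baseline

def serfling_fit(t, y, period=52.0):
    """Serfling-style cyclic regression: least-squares fit of
    y(t) = a + b*t + c*cos(2*pi*t/period) + d*sin(2*pi*t/period)."""
    X = np.column_stack([np.ones_like(t), t,
                         np.cos(2 * np.pi * t / period),
                         np.sin(2 * np.pi * t / period)])
    params, *_ = np.linalg.lstsq(X, y, rcond=None)
    return params, X @ params

# Two years of noiseless weekly counts with a known trend and sinusoid,
# plus one spiked final week for the detector to flag.
t = np.arange(104, dtype=float)
y = 100 + 0.2 * t + 30 * np.cos(2 * np.pi * t / 52)
alarm, baseline = ewma_alarm(np.append(y, 400.0), half_life=4, threshold=1.5)
params, fitted = serfling_fit(t, y)
```

On the noiseless toy series the regression recovers a = 100, b = 0.2, c = 30, d = 0, and the spiked week triggers the alarm.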