Results 1 – 5 of 5
BNT structure learning package: documentation and experiments
 Technical Report FRE CNRS 2645, Laboratoire PSI, Université et INSA de Rouen
, 2004
Abstract

Cited by 16 (1 self)
Bayesian networks are a formalism for probabilistic reasoning that is increasingly used for classification tasks in data mining. In some situations the network structure is given by an expert; otherwise, retrieving it from a database is an NP-hard problem, notably because of the complexity of the search space. In the last decade, many methods have been introduced to learn the network structure automatically, either by simplifying the search space (augmented naive Bayes, K2) or by using a heuristic within it (greedy search). Most of these methods deal with completely observed data, but some can handle incomplete data (SEM, MWST-EM). The Bayes Net Toolbox introduced by [Murphy, 2001a] for Matlab allows us to use Bayesian networks and to learn them, but this toolbox is not state of the art when it comes to structural learning; that is why we propose this package.
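The greedy search mentioned in the abstract can be illustrated with a minimal, self-contained sketch: hill-climbing over single-edge additions and removals under a BIC score on binary data. All function names and the toy data below are illustrative assumptions, not part of the BNT package or the structure-learning extension it describes.

```python
import math
import random

def bic_node(data, child, parents):
    """BIC contribution of one binary node given its parent set."""
    n = len(data)
    counts = {}
    for row in data:
        key = tuple(row[p] for p in parents)
        cell = counts.setdefault(key, [0, 0])
        cell[row[child]] += 1
    ll = 0.0
    for c0, c1 in counts.values():
        tot = c0 + c1
        for c in (c0, c1):
            if c:
                ll += c * math.log(c / tot)
    # One free parameter per observed parent configuration (binary child).
    return ll - 0.5 * math.log(n) * len(counts)

def is_acyclic(n_vars, edges):
    """Depth-first search for cycles in the directed graph."""
    children = {v: [] for v in range(n_vars)}
    for p, c in edges:
        children[p].append(c)
    state = {}
    def dfs(v):
        if state.get(v) == 1:
            return False  # back edge -> cycle
        if state.get(v) == 2:
            return True
        state[v] = 1
        ok = all(dfs(c) for c in children[v])
        state[v] = 2
        return ok
    return all(dfs(v) for v in range(n_vars))

def score(data, n_vars, edges):
    return sum(bic_node(data, v, [p for p, c in edges if c == v])
               for v in range(n_vars))

def greedy_search(data, n_vars):
    """Hill-climb: toggle single edges while the BIC score improves."""
    edges = set()
    current = score(data, n_vars, edges)
    improved = True
    while improved:
        improved = False
        for p in range(n_vars):
            for c in range(n_vars):
                if p == c:
                    continue
                e = (p, c)
                cand = edges ^ {e}  # add if absent, remove if present
                if e not in edges and not is_acyclic(n_vars, cand):
                    continue
                s = score(data, n_vars, cand)
                if s > current + 1e-9:
                    edges, current, improved = cand, s, True
    return edges

# Toy data: X1 copies X0 90% of the time; X2 is independent noise.
random.seed(0)
data = []
for _ in range(300):
    x0 = random.randint(0, 1)
    x1 = x0 if random.random() < 0.9 else 1 - x0
    x2 = random.randint(0, 1)
    data.append((x0, x1, x2))

learned = greedy_search(data, 3)
```

On this toy sample the search recovers a dependence between X0 and X1 (in one of the two score-equivalent orientations), while the BIC penalty keeps the independent X2 mostly disconnected.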
Causal Graphical Models with Latent Variables: Learning and Inference
Abstract

Cited by 1 (0 self)
Several paradigms exist for modeling causal graphical models over discrete variables that can handle latent variables without explicitly modeling them quantitatively. Applying them to a problem domain consists of several steps: structure learning, parameter learning, and using the resulting model for probabilistic or causal inference. We discuss two well-known formalisms, namely semi-Markovian causal models and maximal ancestral graphs, and indicate their strengths and limitations. Previously, an algorithm was constructed that, by combining elements from both techniques, allows learning a semi-Markovian causal model from a mixture of observational and experimental data. The goal of this paper is to recapitulate the integral learning process from observational and experimental data and to demonstrate how different types of inference can be performed efficiently in the learned models. We do this by proposing an alternative representation for semi-Markovian causal models.
Learning semi-Markovian causal models using experiments
 In: Proceedings of the Third European Workshop on Probabilistic Graphical Models (PGM)
, 2006
Abstract

Cited by 1 (1 self)
Semi-Markovian causal models (SMCMs) are an extension of causal Bayesian networks for modeling problems with latent variables. However, there is a big gap between the SMCMs used in theoretical studies and the models that can be learned from observational data alone. The result of standard algorithms for learning from observations is a complete partial ancestral graph (CPAG), representing the Markov equivalence class of maximal ancestral graphs (MAGs). In MAGs, not all edges can be interpreted as immediate causal relationships. In order to apply state-of-the-art causal inference techniques, we need to completely orient the learned CPAG and to transform the result into an SMCM by removing non-causal edges. In this paper we combine recent work on MAG structure learning from observational data with causal learning from experiments in order to achieve that goal. More specifically, we provide a set of rules that indicate which experiments are needed to transform a CPAG into a completely oriented SMCM and how the results of these experiments have to be processed. We also propose an alternative representation for SMCMs that can easily be parametrised and whose parameters can be learned with classical methods. Finally, we show how this parametrisation can be used to develop methods that efficiently perform both probabilistic and causal inference.
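The role of experiments in orienting an edge can be illustrated with a small simulation, assuming a two-variable ground truth A -> B: intervening on A shifts the distribution of B, while intervening on B leaves A untouched. Everything below (names, the mechanism, sample sizes) is an illustrative assumption, not the paper's actual set of orientation rules.

```python
import random

random.seed(1)

def mechanism(a):
    """B copies A 80% of the time (the true edge is A -> B)."""
    return a if random.random() < 0.8 else 1 - a

def sample(n, do_a=None, do_b=None):
    """Draw n samples; an intervention cuts the target's incoming edges."""
    out = []
    for _ in range(n):
        a = do_a if do_a is not None else random.randint(0, 1)
        b = do_b if do_b is not None else mechanism(a)
        out.append((a, b))
    return out

def mean(vals):
    vals = list(vals)
    return sum(vals) / len(vals)

n = 5000
# Does intervening on A move B?
b_do_a0 = mean(b for _, b in sample(n, do_a=0))
b_do_a1 = mean(b for _, b in sample(n, do_a=1))
# Does intervening on B move A?
a_do_b0 = mean(a for a, _ in sample(n, do_b=0))
a_do_b1 = mean(a for a, _ in sample(n, do_b=1))

effect_on_b = abs(b_do_a1 - b_do_a0)  # large: A is a cause of B
effect_on_a = abs(a_do_b1 - a_do_b0)  # near zero: B is not a cause of A
orientation = "A -> B" if effect_on_b > effect_on_a else "B -> A"
```

The asymmetry between the two interventional effects is what lets an experiment resolve an edge that observational data alone leaves unoriented.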
Experimental Learning of Causal Models with Latent Variables
, 2006
Abstract
This article discusses graphical models that can handle latent variables without explicitly modeling them quantitatively. Several paradigms exist for such problem domains; two of them are semi-Markovian causal models and maximal ...
APPROVAL
, 2007
"... ii New statistical methods allow discovery of causal models from observational data in some circumstances. These models permit both probabilistic inference and causal inference for models of reasonable size. Many domains, such as education, can benefit from such methods. Educational research does no ..."
Abstract
New statistical methods allow discovery of causal models from observational data in some circumstances. These models permit both probabilistic inference and causal inference for models of reasonable size. Many domains, such as education, can benefit from such methods. Educational research does not easily lend itself to experimental investigation: research in laboratories is artificial and potentially affects measurement, while research in authentic environments is extremely complex and difficult to control. In both environments, the variables are typically hidden and only change over the long term, making them challenging and expensive to investigate experimentally. I present an analysis of causal discovery algorithms and their applicability to educational research, an engineered causal model of Self-Regulated Learning (SRL) theory based on the literature, and an evaluation of the potential for discovering such a theory from observational data using the new statistical methods, and I suggest possible benefits of such work.