Results 1–10 of 11
Model selection and accounting for model uncertainty in graphical models using Occam's window
, 1993
Cited by 266 (46 self)
Abstract:
We consider the problem of model selection and accounting for model uncertainty in high-dimensional contingency tables, motivated by expert system applications. The approach most used currently is a stepwise strategy guided by tests based on approximate asymptotic P-values, leading to the selection of a single model; inference is then conditional on the selected model. The sampling properties of such a strategy are complex, and the failure to take account of model uncertainty leads to underestimation of uncertainty about quantities of interest. In principle, a panacea is provided by the standard Bayesian formalism, which averages the posterior distributions of the quantity of interest under each of the models, weighted by their posterior model probabilities. Furthermore, this approach is optimal in the sense of maximising predictive ability. However, it has not been used in practice because computing the posterior model probabilities is hard and the number of models is very large (often greater than 10^11). We argue that the standard Bayesian formalism is unsatisfactory and we propose an alternative Bayesian approach that, we contend, takes full account of the true model uncertainty by averaging over a much smaller set of models. An efficient search algorithm is developed for finding these models. We consider two classes of graphical models that arise in expert systems: the recursive causal models and the decomposable ...
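The averaging scheme the abstract describes can be sketched in a few lines: posterior model probabilities are proportional to marginal likelihood times prior, the prediction is their probability-weighted average, and Occam's window then discards models whose posterior probability falls far below the best. All numbers and function names below are illustrative, not taken from the paper.

```python
def bma_predict(marginal_likelihoods, prior_probs, model_predictions):
    """Bayesian model averaging sketch.

    marginal_likelihoods[k] = p(D | M_k)
    prior_probs[k]          = p(M_k)
    model_predictions[k]    = E[quantity of interest | M_k, D]
    Returns the averaged prediction and the posterior model probabilities.
    """
    unnorm = [ml * p for ml, p in zip(marginal_likelihoods, prior_probs)]
    total = sum(unnorm)
    post = [w / total for w in unnorm]          # p(M_k | D)
    prediction = sum(w * q for w, q in zip(post, model_predictions))
    return prediction, post

def occams_window(post, c=20.0):
    """Indices of models retained by Occam's window: discard any model whose
    posterior probability is more than a factor c below the best model's."""
    best = max(post)
    return [k for k, w in enumerate(post) if w >= best / c]

# Three hypothetical models with equal priors:
pred, post = bma_predict([0.2, 0.5, 0.3], [1/3, 1/3, 1/3], [1.0, 2.0, 3.0])
```

Averaging over only the window's survivors is what keeps the model set small enough to enumerate.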
Dependency networks for inference, collaborative filtering, and data visualization
 Journal of Machine Learning Research
Cited by 159 (10 self)
Abstract:
We describe a graphical model for probabilistic relationships, an alternative to the Bayesian network, called a dependency network. The graph of a dependency network, unlike a Bayesian network, is potentially cyclic. The probability component of a dependency network, like a Bayesian network, is a set of conditional distributions, one for each node given its parents. We identify several basic properties of this representation and describe a computationally efficient procedure for learning the graph and probability components from data. We describe the application of this representation to probabilistic inference, collaborative filtering (the task of predicting preferences), and the visualization of acausal predictive relationships.
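As a rough illustration of the representation (not code from the paper), here is a toy two-node binary dependency network with invented conditional tables, sampled by repeatedly resampling each node from its conditional in a fixed order — the pseudo-Gibbs scheme such models use for approximate inference:

```python
import random

# Invented conditionals for two binary variables A and B:
P_A_given_B = {0: 0.2, 1: 0.8}   # P(A = 1 | B = b)
P_B_given_A = {0: 0.3, 1: 0.9}   # P(B = 1 | A = a)

def ordered_gibbs(n_sweeps, seed=0):
    """Resample each node in turn from its conditional distribution."""
    rng = random.Random(seed)
    a, b = 0, 0
    samples = []
    for _ in range(n_sweeps):
        a = 1 if rng.random() < P_A_given_B[b] else 0
        b = 1 if rng.random() < P_B_given_A[a] else 0
        samples.append((a, b))
    return samples

samples = ordered_gibbs(5000)
freq_a = sum(a for a, _ in samples) / len(samples)   # approximate P(A = 1)
```

Because the per-node conditionals are learned separately, they need not be exactly consistent with any joint distribution; the sampler's stationary distribution is then only an approximation, which is the trade-off the dependency-network representation accepts.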
Bayesian Model Averaging And Model Selection For Markov Equivalence Classes Of Acyclic Digraphs
 Communications in Statistics: Theory and Methods
, 1996
Cited by 38 (5 self)
Abstract:
Acyclic digraphs (ADGs) are widely used to describe dependences among variables in multivariate distributions. In particular, the likelihood functions of ADG models admit convenient recursive factorizations that often allow explicit maximum likelihood estimates and that are well suited to building Bayesian networks for expert systems. There may, however, be many ADGs that determine the same dependence (= Markov) model. Thus, the family of all ADGs with a given set of vertices is naturally partitioned into Markov equivalence classes, each class being associated with a unique statistical model. Statistical procedures, such as model selection or model averaging, that fail to take these equivalence classes into account may incur substantial computational or other inefficiencies. Recent results have shown that each Markov equivalence class is uniquely determined by a single chain graph, the essential graph, that is itself Markov equivalent simultaneously to all ADGs in the equivalence class ...
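The equivalence classes in question have a well-known characterization: two ADGs are Markov equivalent exactly when they share the same skeleton (undirected edge set) and the same v-structures (converging edges at a node whose tails are non-adjacent). A minimal sketch of that test, using hypothetical edge-list graphs:

```python
def skeleton(dag):
    """Undirected edge set of a DAG given as a list of (parent, child) pairs."""
    return {frozenset(e) for e in dag}

def v_structures(dag):
    """Triples (a, b, child) with a -> child <- b and a, b non-adjacent."""
    parents = {}
    for u, v in dag:
        parents.setdefault(v, set()).add(u)
    skel = skeleton(dag)
    vs = set()
    for child, ps in parents.items():
        for a in ps:
            for b in ps:
                if a < b and frozenset((a, b)) not in skel:
                    vs.add((a, b, child))
    return vs

def markov_equivalent(d1, d2):
    """Same skeleton and same v-structures."""
    return skeleton(d1) == skeleton(d2) and v_structures(d1) == v_structures(d2)

d1 = [("X", "Y")]                      # X -> Y
d2 = [("Y", "X")]                      # Y -> X: same class as d1
d3 = [("X", "Z"), ("Y", "Z")]          # v-structure X -> Z <- Y
d4 = [("Z", "X"), ("Z", "Y")]          # same skeleton as d3, no v-structure
```

Model search over equivalence classes (or essential graphs) rather than individual ADGs avoids repeatedly scoring statistically identical models.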
Split models for contingency tables
, 2003
Cited by 8 (1 self)
Abstract:
A framework for log-linear models with context-specific independence structures, i.e. conditional independencies holding only for specific values of the conditioning variables, is introduced. This framework is constituted by the class of split models. A software package named YGGDRASIL, designed for statistical inference in split models, is also presented. Split models are an extension of graphical models for contingency tables. The treatment of split models includes estimation, representation and a Markov property for reading off independencies holding in a specific context. Two examples, including an illustration of the use of YGGDRASIL, are ...
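A minimal sketch of the context-specific independence that split models capture, with invented numbers: A is independent of B given C = 0, but not given C = 1, so the independence cannot be expressed by an ordinary graphical model.

```python
# P(A = 1 | B = b, C = c) for binary A, B, C; the numbers are invented.
P_A1 = {
    (0, 0): 0.4, (1, 0): 0.4,    # identical for both b when c = 0: A ⊥ B | C = 0
    (0, 1): 0.2, (1, 1): 0.7,    # differs across b when c = 1: dependence remains
}

def independent_in_context(c):
    """Does P(A | B, C = c) ignore B in the context C = c?"""
    return P_A1[(0, c)] == P_A1[(1, c)]
```

A "split" on C lets the model state the independence only in the c = 0 branch, which an all-or-nothing conditional independence statement cannot do.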
Enhancing the Predictive Performance of Bayesian Graphical Models
 Communications in Statistics – Theory and Methods
, 1995
Cited by 7 (4 self)
Abstract:
Both knowledge-based systems and statistical models are typically concerned with making predictions about future observables. Here we focus on assessment of predictive performance and provide two techniques for improving the predictive performance of Bayesian graphical models. First, we present Bayesian model averaging, a technique for accounting for model uncertainty. Second, we describe a technique for eliciting a prior distribution for competing models from domain experts. We explore the predictive performance of both techniques in the context of a urological diagnostic problem.

KEYWORDS: Prediction; Bayesian graphical model; Bayesian network; Decomposable model; Model uncertainty; Elicitation.

1 Introduction

Both statistical methods and knowledge-based systems are typically concerned with combining information from various sources to make inferences about prospective measurements. Inevitably, to combine information, we must make modeling assumptions. It follows that we should car...
Efficient Bayesian inference for multivariate probit models with sparse inverse covariance matrices
 Journal of Computational and Graphical Statistics
, 2012
Cited by 2 (0 self)
Abstract:
We propose a Bayesian approach for inference in the multivariate probit model, taking into account the association structure between binary observations. We model the association through the correlation matrix of the latent Gaussian variables. Conditional independence is imposed by setting some off-diagonal elements of the inverse correlation matrix to zero, and this sparsity structure is modeled using a decomposable graphical model. We propose an efficient Markov chain Monte Carlo algorithm relying on a parameter expansion scheme to sample from the resulting posterior distribution. This algorithm updates the correlation matrix within a simple Gibbs sampling framework and allows us to infer the correlation structure from the data, generalizing methods used for inference in decomposable Gaussian graphical models to multivariate binary observations. We demonstrate the performance of this model and of the Markov chain Monte Carlo algorithm on simulated and real data sets.
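The sparsity device described above rests on a standard Gaussian fact: a zero off-diagonal entry of the precision (inverse correlation) matrix is equivalent to zero partial correlation, hence conditional independence of the two latent variables given the rest. A small sketch with an invented precision matrix carrying one structural zero:

```python
import math

# Invented 3x3 precision (inverse correlation) matrix with a structural
# zero at position (0, 2).
omega = [
    [2.0, 0.5, 0.0],
    [0.5, 2.0, 0.5],
    [0.0, 0.5, 2.0],
]

def partial_corr(prec, i, j):
    """Partial correlation of variables i and j given all others,
    read directly off the precision matrix."""
    return -prec[i][j] / math.sqrt(prec[i][i] * prec[j][j])
```

Here `partial_corr(omega, 0, 2)` is exactly zero, so latent variables 0 and 2 are conditionally independent given variable 1, while the other pairs remain dependent.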
YGGDRASIL – A statistical package for learning Split Models
, 2000
Abstract:
There are two main objectives of this paper. The first is to present a statistical framework for models with context-specific independence structures, i.e. conditional independencies holding only for specific values of the conditioning variables. This framework is constituted by the class of split models. Split models are an extension of graphical models for contingency tables and allow for more sophisticated modelling than graphical models. The treatment of split models includes estimation, representation and a Markov property for reading off those independencies holding in a specific context. The second objective is to present a software package named YGGDRASIL, which is designed for statistical inference in split models, i.e. for learning such models on the basis of data.

1 INTRODUCTION

Recently there has been an increased interest in models which explicitly account for conditional independencies holding only for specific values of the variables conditioned upon. ...
A general proposal construction for reversible jump MCMC
, 2009
Abstract:
We propose a general methodology for constructing proposal densities in reversible jump MCMC algorithms so that consistent mappings across competing models are achieved. Unlike nearly all previous approaches, our proposals are not restricted to moves between local models; they are applicable even to models that do not share any common parameters. We focus on linear regression models and produce concrete guidelines on proposal choices for moves between any models. These guidelines can be applied immediately to any regression models after some standard data transformations to near-normality. We illustrate our methodology by providing concrete guidelines for model determination problems in logistic regression and log-linear graphical models. Two real data analyses illustrate how our suggested proposal densities, together with the resulting freedom to propose moves between any models, improve the mixing of the reversible jump Metropolis algorithm.
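The following is not the paper's proposal construction, only a textbook-style sketch of a reversible jump sampler on two nested normal-mean models (M1: mean zero; M2: unknown mean mu with a N(0, 1) prior), using the prior as the trans-dimensional proposal so the prior and proposal densities cancel and the Jacobian is 1. All tuning values are invented.

```python
import math, random

def loglik_m1(y):
    """Log-likelihood under M1: y_i ~ N(0, 1)."""
    return sum(-0.5 * yi * yi - 0.5 * math.log(2 * math.pi) for yi in y)

def loglik_m2(y, mu):
    """Log-likelihood under M2: y_i ~ N(mu, 1)."""
    return sum(-0.5 * (yi - mu) ** 2 - 0.5 * math.log(2 * math.pi) for yi in y)

def rjmcmc(y, n_iter=5000, seed=1):
    """Return the estimated posterior probability of M2 (equal model priors)."""
    rng = random.Random(seed)
    model, mu = 1, 0.0
    visits_m2 = 0
    for _ in range(n_iter):
        # Between-model (reversible jump) move.
        if model == 1:
            # Draw mu* from its prior; prior and proposal densities cancel.
            mu_star = rng.gauss(0.0, 1.0)
            if math.log(rng.random()) < loglik_m2(y, mu_star) - loglik_m1(y):
                model, mu = 2, mu_star
        else:
            if math.log(rng.random()) < loglik_m1(y) - loglik_m2(y, mu):
                model = 1
        # Within-model random-walk update for mu (prior mu ~ N(0, 1)).
        if model == 2:
            mu_prop = mu + rng.gauss(0.0, 0.3)
            log_a = (loglik_m2(y, mu_prop) - 0.5 * mu_prop ** 2) \
                  - (loglik_m2(y, mu) - 0.5 * mu ** 2)
            if math.log(rng.random()) < log_a:
                mu = mu_prop
            visits_m2 += 1
    return visits_m2 / n_iter

# Data clearly favouring a non-zero mean:
data_rng = random.Random(0)
y = [2.0 + data_rng.gauss(0.0, 1.0) for _ in range(30)]
p_m2 = rjmcmc(y)
```

Using the prior as the jump proposal mixes poorly when prior and posterior disagree; the point of proposal constructions like the paper's is to place trans-dimensional proposals where the target model's posterior mass actually lies.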