Results 1–10 of 17
Axioms for probability and belief-function propagation
 Uncertainty in Artificial Intelligence
, 1990
Abstract

Cited by 135 (17 self)
In this paper, we describe an abstract framework and axioms under which exact local computation of marginals is possible. The primitive objects of the framework are variables and valuations. The primitive operators of the framework are combination and marginalization. These operate on valuations. We state three axioms for these operators and we derive the possibility of local computation from the axioms. Next, we describe a propagation scheme for computing marginals of a valuation when we have a factorization of the valuation on a hypertree. Finally, we show how the problem of computing marginals of joint probability distributions and joint belief functions fits the general framework.
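The combination and marginalization operators described above can be sketched concretely for probability potentials. The representation below (potentials as dicts keyed by assignment tuples, binary variables only) is an illustrative assumption, not the paper's formalism:

```python
from itertools import product

# A potential maps assignments of its variables to non-negative numbers.
# All variables are assumed binary here for simplicity.

def combine(vars1, pot1, vars2, pot2):
    """Combination: pointwise product on the union of the variable sets."""
    all_vars = sorted(set(vars1) | set(vars2))
    out = {}
    for assign in product([0, 1], repeat=len(all_vars)):
        a = dict(zip(all_vars, assign))
        v1 = pot1[tuple(a[v] for v in vars1)]
        v2 = pot2[tuple(a[v] for v in vars2)]
        out[assign] = v1 * v2
    return all_vars, out

def marginalize(vars_, pot, keep):
    """Marginalization: sum out the variables not in `keep`."""
    kept = [v for v in vars_ if v in keep]
    idx = [vars_.index(v) for v in kept]
    out = {}
    for assign, val in pot.items():
        key = tuple(assign[i] for i in idx)
        out[key] = out.get(key, 0.0) + val
    return kept, out

# Example: combine P(A) with P(B|A), then marginalize to B.
pA = {(0,): 0.6, (1,): 0.4}
pBgA = {(0, 0): 0.9, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.8}
vars_, joint = combine(["A"], pA, ["A", "B"], pBgA)
_, pB = marginalize(vars_, joint, ["B"])
# pB[(1,)] = 0.6*0.1 + 0.4*0.8 = 0.38
```

The three axioms of the paper (commutativity/associativity of combination, consonance of successive marginalizations, and distributivity of marginalization over combination) are what make it legitimate to marginalize factor by factor rather than building the full joint.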
Binary models for marginal independence
 JOURNAL OF THE ROYAL STATISTICAL SOCIETY SERIES B
, 2005
Abstract

Cited by 16 (2 self)
A number of authors have considered multivariate Gaussian models for marginal independence. In this paper we develop models for binary data with the same independence structure. The models can be parameterized based on Möbius inversion, and maximum likelihood estimation can be performed using a version of the iterative conditional fitting algorithm. The approach is illustrated on a simple example. Relations to multivariate logistic and dependence ratio models are discussed.
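To illustrate the Möbius-inversion parameterization, the sketch below (binary variables, not the authors' code) recovers joint cell probabilities from the parameters q[A] = P(X_a = 0 for all a in A); marginal independence of two variables then corresponds to the multiplicative constraint q[{a,b}] = q[{a}]·q[{b}]:

```python
from itertools import combinations, product

def subsets(s):
    s = list(s)
    return [frozenset(c) for r in range(len(s) + 1) for c in combinations(s, r)]

def cell_probs(variables, q):
    """Möbius inversion: p(x) = sum over B containing A of (-1)^{|B|-|A|} q[B],
    where A is the set of variables at level 0 in x and q[frozenset()] = 1."""
    V = frozenset(variables)
    p = {}
    for x in product([0, 1], repeat=len(variables)):
        A = frozenset(v for v, xv in zip(variables, x) if xv == 0)
        p[x] = sum((-1) ** (len(B) - len(A)) * q[B]
                   for B in subsets(V) if A <= B)
    return p

# Two binary variables that are marginally independent:
q = {frozenset(): 1.0, frozenset("a"): 0.3, frozenset("b"): 0.5,
     frozenset("ab"): 0.15}  # 0.15 = 0.3 * 0.5 encodes the independence
p = cell_probs(["a", "b"], q)
# p[(1, 1)] = 1 - 0.3 - 0.5 + 0.15 = 0.35, and the four cells sum to 1.
```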
Reduction of Computational Complexity in Bayesian Networks through Removal of Weak Dependences
 IN PROC. TENTH CONF. ON UNCERTAINTY IN ARTIFICIAL INTELLIGENCE
, 1994
Abstract

Cited by 16 (0 self)
The paper presents a method for reducing the computational complexity of Bayesian networks through identification and removal of weak dependences (removal of links from the (moralized) independence graph). The removal of a small number of links may reduce the computational complexity dramatically, since several fill-ins and moral links may be rendered superfluous by the removal. The method is described in terms of impact on the independence graph, the junction tree, and the potential functions associated with these. An empirical evaluation of the method using large real-world networks demonstrates the applicability of the method. Further, the method, which has been implemented in Hugin, complements the approximation method suggested by Jensen & Andersen (1990).
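The effect of removing a link on triangulation cost can be illustrated with a toy fill-in counter. The graphs, elimination ordering, and helper below are hypothetical examples, not taken from the paper:

```python
def fill_ins(adj, order):
    """Count fill-in edges added when eliminating nodes in `order`
    from an undirected graph given as {node: set(neighbours)}."""
    adj = {v: set(nb) for v, nb in adj.items()}  # work on a copy
    added = 0
    for v in order:
        nbrs = list(adj[v])
        # Eliminating v connects all of its remaining neighbours pairwise.
        for i in range(len(nbrs)):
            for j in range(i + 1, len(nbrs)):
                a, b = nbrs[i], nbrs[j]
                if b not in adj[a]:
                    adj[a].add(b)
                    adj[b].add(a)
                    added += 1
        for n in nbrs:
            adj[n].discard(v)
        del adj[v]
    return added

# 4-cycle A-B-C-D-A: eliminating A first forces the fill-in B-D.
cycle = {"A": {"B", "D"}, "B": {"A", "C"}, "C": {"B", "D"}, "D": {"A", "C"}}
# Removing the (weak) link A-D leaves a chain; the same ordering needs no fill-in.
chain_g = {"A": {"B"}, "B": {"A", "C"}, "C": {"B", "D"}, "D": {"C"}}
```

One removed link here eliminates a fill-in as well, shrinking the cliques of the triangulated graph, which is the source of the dramatic complexity reductions the abstract describes.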
Lectures on Contingency Tables
, 2002
Abstract

Cited by 16 (0 self)
The present set of lecture notes was prepared for the course “Statistik 2” at the University of Copenhagen. It is a revised version of notes prepared in connection with a series of lectures at the Swedish summer school in Särö, June 11–17, 1979. The notes by no means give a complete account of the theory of contingency tables. They are based on the idea that the graph theoretic methods in Darroch, Lauritzen and Speed (1978) can be used directly to develop this theory and, hopefully, with some pedagogical advantages. My thanks are due to the audience at the Swedish summer school for patiently listening to the first version of these lectures, to Joseph Verducci, Stanford, who read the manuscript and suggested many improvements and corrections, and to Ursula Hansen, who typed the manuscript.
Approximation of Bayesian networks through edge removals
, 1993
Abstract

Cited by 5 (0 self)
Due to the general inherent complexity of inference in Bayesian networks, the need to compromise the exactitude of inference arises frequently. A scheme for reduction of complexity by enforcing additional conditional independences is investigated. The enforcement of independences is achieved through edge removals in a triangulated graph. The removal of a single edge may imply an enormous reduction of complexity, since other edges may become superfluous by its removal. The approximation scheme presented has several appealing features. Most notable among these: a bound on the overall approximation error can be computed locally; the bound on the error of a series of approximations equals the sum of the bounds on the errors of the individual approximations; and the influence of an approximation attenuates with increasing 'distance' from the edge removed. The scheme compares in some cases very favorably with the approximation method suggested by Jensen & Andersen (1990).
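A minimal sketch of the "locally computable error bound" idea, assuming KL divergence as the error measure and considering only the table over the two endpoints of the removed edge (the paper's actual scheme and bound may differ):

```python
import math

def kl_after_edge_removal(p_xy):
    """Toy local error measure: removing the edge X-Y replaces p(x, y) by
    the product of its marginals; the KL divergence between the two is
    computable from the edge's own table alone (a sketch, not the paper's
    scheme)."""
    p_x, p_y = {}, {}
    for (x, y), v in p_xy.items():
        p_x[x] = p_x.get(x, 0.0) + v
        p_y[y] = p_y.get(y, 0.0) + v
    return sum(v * math.log(v / (p_x[x] * p_y[y]))
               for (x, y), v in p_xy.items() if v > 0)

# A weak (here: fully independent) edge costs nothing to remove ...
weak = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25}
# ... while a strongly dependent edge yields a strictly positive error.
strong = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
```

The additivity property the abstract mentions corresponds to such per-edge errors summing over a sequence of removals.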
Sequences of regressions and their independences
, 2012
Abstract

Cited by 4 (1 self)
Ordered sequences of univariate or multivariate regressions provide statistical models for analysing data from randomized, possibly sequential interventions, from cohort or multi-wave panel studies, but also from cross-sectional or retrospective studies. Conditional independences are captured by what we name regression graphs, provided the generated distribution shares some properties with a joint Gaussian distribution. Regression graphs extend purely directed, acyclic graphs by two types of undirected graph, one type for components of joint responses and the other for components of the context vector variable. We review the special features and the history of regression graphs, prove criteria for Markov equivalence and discuss the notion of simpler statistical covering models. Knowledge of Markov equivalence provides alternative interpretations of a given sequence of regressions, is essential for machine learning strategies and permits using the simple graphical criteria of regression graphs on graphs for which the corresponding criteria are in general more complex. Under the known conditions that a Markov equivalent directed acyclic graph exists for any given regression graph, we give a polynomial time algorithm to find one such graph.
A Note on Multivariate Logistic Models for Contingency Tables
 Austral. J. Statist
, 1997
Abstract

Cited by 3 (0 self)
Log-linear models are a widely accepted tool for modeling discrete data given in a contingency table. Although their parameters reflect the interaction structure in the joint distribution of all variables, they do not give information about structures appearing in the margins of the table. This is in contrast to multivariate logistic parameters recently introduced by Glonek & McCullagh (1995). They have as parameters the highest order log odds ratios derived from the joint table and from each marginal table. The link between the cell probabilities and the multivariate logistic parameters is given in Glonek & McCullagh in an algebraic fashion. In this paper we focus on this link, showing that it is derived by general parameter transformations in exponential families. In particular, the connection between the natural, the expectation and the mixed parameterization in exponential families (Barndorff-Nielsen, 1978) is used. This also yields the derivatives of the likelihood equation and shows properties of the Fisher matrix. Further attention is given to the analysis of independence hypotheses in margins of a contingency table.
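For a two-variable margin, the highest-order multivariate logistic parameter is simply the log odds ratio of the 2x2 marginal table; the table values in the sketch below are made up for illustration:

```python
import math

def log_odds_ratio(p):
    """Highest-order log odds ratio of a 2x2 table p[(i, j)] -- the
    multivariate logistic parameter for a two-variable margin."""
    return math.log(p[(0, 0)] * p[(1, 1)] / (p[(0, 1)] * p[(1, 0)]))

# Independence in the margin <=> log odds ratio 0. Here the cells
# factor as p(i, j) = p_row(i) * p_col(j) with p_row = (0.3, 0.7)
# and p_col = (0.4, 0.6):
indep = {(0, 0): 0.12, (0, 1): 0.18, (1, 0): 0.28, (1, 1): 0.42}
```

A marginal independence hypothesis, as discussed in the abstract, is exactly the hypothesis that such a marginal parameter is zero.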
Graphical models for inference under outcomedependent sampling
 STAT SCI 2010;25:368–87
, 2010
Abstract

Cited by 3 (0 self)
We consider situations where data have been collected such that the sampling depends on the outcome of interest and possibly further covariates, as for instance in case-control studies. Graphical models represent assumptions about the conditional independencies among the variables. By including a node for the sampling indicator, assumptions about sampling processes can be made explicit. We demonstrate how to read off such graphs whether consistent estimation of the association between exposure and outcome is possible. Moreover, we give sufficient graphical conditions for testing and estimating the causal effect of exposure on outcome. The practical use is illustrated with a number of examples.
BIFROST: Block recursive models Induced From Relevant knowledge, Observations, and Statistical Techniques
 Computational Statistics and Data Analysis
, 1993
Abstract

Cited by 2 (0 self)
The theoretical background for a program for establishing expert systems on the basis of observations and expert knowledge is presented. Block recursive models form the basis of the statistical modelling. These models, together with various methods for automatic model selection, are presented. Additionally, the connection between a block recursive model and expert systems based on causal probabilistic networks is treated. A medical example concerning diagnosis of coronary artery disease forms the basis for an evaluation of the expert systems established.

Keywords: causal probabilistic networks, graphical association models, machine learning, model selection, selection criteria, selection strategies.

1 Introduction. BIFROST is a program for semi-automatic knowledge acquisition and is a continuation of developments made in (Greve, Højsgaard, Skjøth and Thiesson 1990). The objective is to obtain preliminary causal models for use in the HUGIN expert system shell (Ander...