## Iterative conditional fitting for Gaussian ancestral graph models (2004)

Venue: Proceedings of the 20th Conference on Uncertainty in Artificial Intelligence (M. Chickering and J. Halpern, Eds.)

Citations: 20 (6 self)

### BibTeX

@INPROCEEDINGS{Drton04iterativeconditional,
  author    = {Mathias Drton},
  title     = {Iterative conditional fitting for Gaussian ancestral graph models},
  booktitle = {Proceedings of the 20th Conference on Uncertainty in Artificial Intelligence},
  editor    = {M. Chickering and J. Halpern},
  year      = {2004},
  pages     = {130--137},
  publisher = {Morgan Kaufmann}
}

### Abstract

Ancestral graph models, introduced by Richardson and Spirtes (2002), generalize both Markov random fields and Bayesian networks to a class of graphs with a global Markov property that is closed under conditioning and marginalization. By design, ancestral graphs encode precisely the conditional independence structures that can arise from Bayesian networks with selection and unobserved (hidden/latent) variables. Thus, ancestral graph models provide a potentially very useful framework for exploratory model selection when unobserved variables might be involved in the data-generating process but no particular hidden structure can be specified. In this paper, we present the Iterative Conditional Fitting (ICF) algorithm for maximum likelihood estimation in Gaussian ancestral graph models. The name reflects that in each step of the procedure a conditional distribution is estimated, subject to constraints, while a marginal distribution is held fixed. This approach is in duality to the well-known Iterative Proportional Fitting algorithm, in which marginal distributions are fitted while conditional distributions are held fixed.

### Citations

7440 | Probabilistic Reasoning in Intelligent Systems: Networks of Plausible Inference - Pearl - 1988 |

1155 | Graphical Models - Lauritzen - 1996 |
Citation Context ...-(5), we move on to the next vertex in V \ unG. The procedure is continued until convergence. 5.2 CONVERGENCE It is easy to see that this ICF algorithm is an iterative partial maximization algorithm (Lauritzen 1996, App. A.4) since in the i-th step we maximize the conditional likelihood L(B, Ω) from (9) over the section in the parameter space defined by fixing the parameters Ω_{−i,−i} and B_{j,pa(j)}, j ≠ i. The s... |
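The iterative partial maximization scheme invoked in this context can be illustrated with a toy example: alternately maximize a concave function over one parameter block while the other is held fixed. This is a minimal sketch; the quadratic objective is purely illustrative and is not the ICF likelihood.

```python
# Toy illustration of iterative partial maximization: alternately maximize
# f(x, y) = -x^2 - y^2 - x*y + 3*x + 3*y over one coordinate while the
# other is held fixed. ICF applies the same scheme to the likelihood,
# maximizing over one block of parameters per step. (Illustrative only.)
def partial_max(iters=50):
    x, y = 0.0, 0.0
    for _ in range(iters):
        x = (3.0 - y) / 2.0  # closed-form argmax over x with y fixed
        y = (3.0 - x) / 2.0  # closed-form argmax over y with x fixed
    return x, y

x_opt, y_opt = partial_max()  # converges to the global maximizer (1, 1)
```

Because each step can only increase the objective and the iterates contract toward the stationary point, the sequence converges to the global maximizer of this concave function, mirroring the convergence argument cited from Lauritzen (1996, App. A.4).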

1125 | R: A language for data analysis and graphics - Ihaka, Gentleman - 1996 |
Citation Context ...[Figure 5: Illustration of the ICF update steps.] 6 AN IMPLEMENTATION The statistical programming language R (Ihaka and Gentleman 1996) provides a freeware environment for programming in interpreted code building on a large number of available routines. The team of developers of R provided a framework for writing extension librari... |

987 | On the statistical analysis of dirty pictures - Besag - 1986 |

465 | Graphical models in applied multivariate statistics - Whittaker - 1990 |
Citation Context ...e a conditional distribution is estimated, subject to constraints, while a marginal distribution is held fixed. This approach is in duality to the well-known iterative proportional fitting algorithm (Whittaker 1990, pp. 182–185), in the steps of which a marginal distribution is fitted for a fixed conditional distribution. This paper is organized as follows. In §§2 and 3 we define ancestral graphs and their glob... |
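The duality noted in this context can be made concrete with the classical IPF update, which rescales a contingency table to match target margins, fitting one set of marginals per step while the conditional proportions within rows or columns stay fixed. A minimal sketch; the 2×2 table and the margin values are made up for illustration.

```python
import numpy as np

def ipf(table, row_margins, col_margins, iters=100):
    """Classic iterative proportional fitting: rescale a 2-D table so its
    row and column sums match the given margins. Each step fits one set of
    marginals while the conditional (within-row or within-column)
    proportions are held fixed -- the scheme to which ICF is dual."""
    t = table.astype(float).copy()
    for _ in range(iters):
        t *= (row_margins / t.sum(axis=1))[:, None]  # match row sums
        t *= (col_margins / t.sum(axis=0))[None, :]  # match column sums
    return t

# Illustrative margins; starting from a uniform table, IPF converges to
# the independence table with the requested margins.
fitted = ipf(np.ones((2, 2)), np.array([30.0, 70.0]), np.array([40.0, 60.0]))
```

ICF reverses the roles of the two distributions: each step estimates a constrained conditional distribution while a marginal distribution is held fixed.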

99 | Econometric Theory - Goldberger - 1964 |
Citation Context ... point must give the same value. 5.3 APPLYING ICF TO DAGS It is well known that the MLE of the parameters of a Gaussian DAG model can be found by carrying out a finite number of regressions (see e.g. Goldberger 1964, or Andersson and Perlman 1998). DAG models form a special case of ancestral graph models so we can also apply ICF to a Gaussian DAG model. If the graph G is a DAG then sp(i) = ∅ for all i ∈ V. Ther... |

79 | Ancestral graph Markov models - Richardson, Spirtes - 2002 |

51 | Alternative Markov Properties for Chain Graphs - Andersson, Madigan, et al. - 2001 |
Citation Context ...have found wide-spread application. Well-known generalizations of both undirected graph models and DAG models are the chain graph models, which can be equipped with two alternative Markov properties (Andersson et al. 2001). A different generalization is obtained from ancestral graphs, introduced by Richardson and Spirtes (2002) = RS (2002). Whereas chain graphs allow both undirected and directed edges, an... |

39 | Model selection for Gaussian concentration graphs. Biometrika 91 - Drton, Perlman - 2004 |

23 | Multimodality of the likelihood in the bivariate seemingly unrelated regressions model - Drton, Richardson - 2004 |
Citation Context ...e constant, takes the form ℓ(Σ) = −(n/2) log |Σ| − (n/2) tr{Σ⁻¹S}. (7) Positive definiteness of S guarantees the existence of the global maximum of ℓ(Σ) over P(G) but there may be multiple local maxima (Drton and Richardson 2004). 4.3 EMPLOYING THE DECOMPOSITION OF AN ANCESTRAL GRAPH As described in RS (2002, §8.5), the decomposition of an ancestral graph G into an undirected and a directed-bidirected part is accompanied by ... |
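The Gaussian log-likelihood ℓ(Σ) = −(n/2) log |Σ| − (n/2) tr{Σ⁻¹S} appearing in this context is straightforward to evaluate numerically. A sketch; the function name is ours.

```python
import numpy as np

def loglik(sigma, S, n):
    """Gaussian log-likelihood, up to an additive constant:
    l(Sigma) = -(n/2) log|Sigma| - (n/2) tr(Sigma^{-1} S),
    where S is the sample covariance matrix from n observations."""
    _, logdet = np.linalg.slogdet(sigma)          # stable log-determinant
    return -0.5 * n * logdet - 0.5 * n * np.trace(np.linalg.solve(sigma, S))
```

Over the unrestricted cone of positive definite matrices this is maximized at Σ = S; under the ancestral graph constraint set P(G) the maximizer generally differs, and, as the cited result shows, the restricted likelihood can have multiple local maxima.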

16 | Probability measures with given marginals and conditionals: I-projections and conditional iterative proportional fitting. Statistics & Decisions 18:311–329 - Cramer - 2000 |

15 | Normal linear regression models with recursive graphical Markov structure - Andersson, Perlman - 1998 |
Citation Context ... same value. 5.3 APPLYING ICF TO DAGS It is well known that the MLE of the parameters of a Gaussian DAG model can be found by carrying out a finite number of regressions (see e.g. Goldberger 1964, or Andersson and Perlman 1998). DAG models form a special case of ancestral graph models so we can also apply ICF to a Gaussian DAG model. If the graph G is a DAG then sp(i) = ∅ for all i ∈ V. Therefore, the conditional distribu... |
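The reduction mentioned in this context, one least-squares regression per vertex, is easy to sketch for a small Gaussian DAG. A hypothetical example; the chain 1 → 2 → 3 and the coefficients are invented for illustration.

```python
import numpy as np

# Sketch: MLE for a Gaussian DAG model via one least-squares regression
# per vertex -- regress each variable on its parents. The graph, data,
# and coefficients below are illustrative, not from the paper.
rng = np.random.default_rng(0)
n = 1000
x1 = rng.normal(size=n)
x2 = 0.5 * x1 + rng.normal(size=n)
x3 = -1.0 * x2 + rng.normal(size=n)
X = np.column_stack([x1, x2, x3])
parents = {0: [], 1: [0], 2: [1]}  # DAG 1 -> 2 -> 3, zero-indexed

B = np.zeros((3, 3))  # regression coefficients B[j, pa(j)]
omega = np.zeros(3)   # residual (conditional) variances
for j, pa in parents.items():
    if pa:
        coef, rss, *_ = np.linalg.lstsq(X[:, pa], X[:, j], rcond=None)
        B[j, pa] = coef
        omega[j] = rss[0] / n  # MLE of residual variance
    else:
        omega[j] = X[:, j].var()
```

When the graph has no bidirected edges, sp(i) = ∅ for every vertex, and the ICF update for vertex i collapses to exactly this regression of X_i on its parents, so ICF terminates after one sweep.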

14 | A new algorithm for maximum likelihood estimation in Gaussian graphical models for marginal independence - Drton, Richardson - 2003 |
Citation Context ... ICF algorithm depend on the graph itself, it is important to work out which graph in a whole class of Markov equivalent graphs allows for the most efficient fitting of the associated model (see also Drton and Richardson 2003, §4.2.4). Finally, ICF has the nice feature that its main idea of decomposing the complicated overall maximization problem into a sequence of simpler optimization problems seems also promising for th... |

14 | Iterative Estimation of a Set of Linear Regression Equations - Telser - 1964 |

10 | Introduction to graphical modeling. Second edition - Edwards - 2000 |
Citation Context ...freedom using the asymptotic distribution of the deviance as χ²_df suggests a rather poor model fit. In fact, the mathematics marks data are a notorious example for undirected graph models (see e.g. Edwards 2000, Whittaker 1990), so we are not surprised that our toy data are not well described by the Gaussian ancestral graph model. 7 CONCLUSION We have presented ICF = iterative conditional fitting, which is ... |

5 | Conditional iterative proportional fitting for Gaussian distributions - Cramer - 1998 |

3 | Ancestral graph Markov models - Richardson, Spirtes - 2002 |

1 | gRaphical Models in R: A new initiative within the R project - Lauritzen - 2002 |
Citation Context ...ted code building on a large number of available routines. The team of developers of R provided a framework for writing extension libraries for R. As part of the “graphical models in R” initiative (Lauritzen 2002), Marchetti and Drton developed a function library called ‘ggm’, which implements functions for fitting Gaussian graphical models and, in particular, provides an implementation of ICF. The package ca... |