## Embedded Bayesian Network Classifiers (1997)

Citations: 8 (1 self)

### BibTeX

```bibtex
@TECHREPORT{Heckerman97embeddedbayesian,
  author      = {David Heckerman and Christopher Meek},
  title       = {Embedded Bayesian Network Classifiers},
  institution = {},
  year        = {1997}
}
```

### Abstract

Low-dimensional probability models for local distribution functions in a Bayesian network include decision trees, decision graphs, and causal independence models. We describe a new probability model for discrete Bayesian networks, which we call an embedded Bayesian network classifier or EBNC. The model for a node Y given parents X is obtained from a (usually different) Bayesian network for Y and X in which X need not be the parents of Y. We show that an EBNC is a special case of a softmax polynomial regression model. Also, we show how to identify a non-redundant set of parameters for an EBNC, and describe an asymptotic approximation for learning the structure of Bayesian networks that contain EBNCs. Unlike the decision tree, decision graph, and causal independence models, we are unaware of a semantic justification for the use of these models. Experiments are needed to determine whether the models presented in this paper are useful in practice. Keywords: Bayesian networks, model dimen...
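To make the abstract's central idea concrete, the sketch below derives the conditional P(Y | X) for a node Y from a separate ("embedded") Bayesian network over Y and X, in which X are not parents of Y. The embedded network here is a naive-Bayes structure (Y → X1, Y → X2) and all parameter values are made up for illustration; the paper allows any network over Y and X, and nothing below is taken from it beyond the general construction.

```python
# Illustrative EBNC-style construction: the local model P(Y | X) in one
# Bayesian network is obtained from an embedded network over Y and X.
# Embedded structure assumed here: Y -> X1, Y -> X2 (naive Bayes).
# All probabilities are invented for the example.

# P(Y) for binary Y
p_y = {0: 0.6, 1: 0.4}

# P(X1 | Y) and P(X2 | Y), indexed as p[y][x]
p_x1_given_y = {0: {0: 0.7, 1: 0.3}, 1: {0: 0.2, 1: 0.8}}
p_x2_given_y = {0: {0: 0.5, 1: 0.5}, 1: {0: 0.9, 1: 0.1}}

def p_y_given_x(x1, x2):
    """Conditional for Y given X, obtained from the embedded network
    by computing the joint P(Y, x1, x2) and normalizing over Y."""
    joint = {
        y: p_y[y] * p_x1_given_y[y][x1] * p_x2_given_y[y][x2]
        for y in (0, 1)
    }
    z = sum(joint.values())  # P(x1, x2)
    return {y: joint[y] / z for y in joint}

# The resulting table over (x1, x2) is the local distribution that the
# EBNC supplies for node Y in the enclosing network.
dist = p_y_given_x(1, 0)
```

Note that X1 and X2 act as parents of Y in the enclosing network even though, in the embedded network, the arcs point the other way; the conditional is recovered by normalization rather than read off directly.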

### Citations

5289 | Neural Networks for Pattern Recognition - Bishop - 1995 |

2685 | Estimating the dimension of a model - Schwarz - 1978 |

1512 | Practical Optimization - Gill, Murray, Wright - 1981 |

179 | A Bayesian approach to learning Bayesian networks with local structure - Chickering, Heckerman, et al. - 1997 |
Citation context: "...number of states) are modeled with a small number of parameters. Such parsimonious models include decision trees, decision graphs, and causal-independence models (e.g., Friedman and Goldszmidt, 1996; Chickering et al., 1997; Meek and Heckerman, 1997). In this paper, we introduce another parsimonious model for Bayesian networks in which each variable has a finite number of states, known as an embedded Bayesian network cl..."

142 | A Reference Bayesian Test for Nested Hypotheses and Its Relationship to the Schwarz Criterion - Kass, Wasserman - 1995 |

79 | Building Classifiers Using Bayesian Networks - Friedman, Goldszmidt - 1996 |
Citation context: "...ch all variables have a finite number of states) are modeled with a small number of parameters. Such parsimonious models include decision trees, decision graphs, and causal-independence models (e.g., Friedman and Goldszmidt, 1996; Chickering et al., 1997; Meek and Heckerman, 1997). In this paper, we introduce another parsimonious model for Bayesian networks in which each variable has a finite number of states, known as an emb..."

64 | On the choice of a model to fit data from an exponential family, The Annals of Statistics - Haughton - 1988 |

40 | Stochastic Complexity (with discussion) - Rissanen - 1987 |

30 | Computing second derivatives in feed-forward networks: A review - Buntine, Weigend - 1994 |

24 | Models and selection criteria for regression and classification - Heckerman, Meek - 1997 |
Citation context: "...raph, and causal independence models, we are unaware of a semantic justification for the use of these models. In fact, there are theoretical reasons that suggest the use of EBNCs may be unreasonable (Heckerman and Meek, 1997). Experiments are needed to determine whether the models presented in this paper are useful in practice. The terminology and notation we need is as follows. We denote a variable by an uppercase lette..."

19 | Structure and parameter learning for causal independence and causal interaction models - Meek, Heckerman - 1997 |
Citation context: "...eled with a small number of parameters. Such parsimonious models include decision trees, decision graphs, and causal-independence models (e.g., Friedman and Goldszmidt, 1996; Chickering et al., 1997; Meek and Heckerman, 1997). In this paper, we introduce another parsimonious model for Bayesian networks in which each variable has a finite number of states, known as an embedded Bayesian network classifier or EBNC. The mode..."