@TECHREPORT{Hofmann00inferencein,
  author = {Reimar Hofmann},
  title  = {Inference in Markov Blanket Networks},
  year   = {2000}
}

Abstract

Bayesian networks have been successfully used to model joint probabilities in many cases. When dealing with continuous variables and nonlinear relationships, neural networks can be used to model conditional densities as part of a Bayesian network. However, inference can then be computationally expensive. Also, information is implicitly passed backwards through neural networks, i.e. from their output to their input. Used in this "inverse" mode, neural networks often perform suboptimally. We suggest a different type of model called the Markov blanket model (MBM). Here the neural networks are used in the forward direction only. This gives advantages in speed and guarantees matching the performance of the underlying neural network on complete data.

1 Introduction

Bayes nets (e.g. Heckerman (1995)) are models of the joint probability distribution of a set of variables $\{x_i\}_{i=1}^{N}$ of the form

$$p(x) = \prod_{i=1}^{N} p(x_i \mid P_i) \quad (1)$$

where $P_i \subseteq \{x_1, \ldots, x_{i-1}\}$ are the par...
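The factorization in Eq. (1) can be illustrated with a minimal sketch (not from the paper): a toy binary network with a single edge A → B, where the conditional probability tables are illustrative assumptions.

```python
# Toy Bayes net A -> B over binary variables, used to illustrate the
# chain-rule factorization p(x) = prod_i p(x_i | P_i) from Eq. (1).
# All probability tables here are made-up example values.

p_a = {0: 0.7, 1: 0.3}                 # p(A); A has no parents
p_b_given_a = {0: {0: 0.9, 1: 0.1},    # p(B | A=0)
               1: {0: 0.2, 1: 0.8}}    # p(B | A=1)

def joint(a: int, b: int) -> float:
    """Joint probability via the factorization p(a, b) = p(a) * p(b | a)."""
    return p_a[a] * p_b_given_a[a][b]

# Sanity check: the joint distribution must sum to 1 over all configurations.
total = sum(joint(a, b) for a in (0, 1) for b in (0, 1))
```

Each factor involves only a variable and its parents, which is what makes the representation compact; inference (e.g. computing p(A | B)) requires combining these factors, which is where the computational cost discussed in the abstract arises.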

... constructed by connecting each variable to all members of its Markov blanket. Since this is the original Markov network, this shows that all independencies in N were also expressed by the MBM. In (Hofmann and Tresp 1998) the relationship between MBMs and Markov networks is exploited to perform structural learning of Markov networks for nonlinear domains based on an MBM representation.

2.2 The Linear Case

Assume we bu...