## Model-Independent Mean Field Theory as a Local Method for Approximate Propagation of Information (2002)

Venue: | Computation in Neural Systems |

Citations: | 16 - 1 self |

### BibTeX

@INPROCEEDINGS{Haft02model-independentmean,

author = {M. Haft and R. Hofmann and V. Tresp},

title = {Model-Independent Mean Field Theory as a Local Method for Approximate Propagation of Information},

booktitle = {Computation in Neural Systems},

year = {2002},

pages = {93--105}

}

### Abstract

We present a systematic approach to mean field theory (MFT) in a general probabilistic setting without assuming a particular model. The mean field equations derived here may serve as a local and thus very simple method for approximate inference in probabilistic models such as Boltzmann machines or Bayesian networks. "Model-independent" means that we do not assume a particular type of dependencies; in a Bayesian network, for example, we allow arbitrary tables to specify conditional dependencies. In general, there are multiple solutions to the mean field equations. We show that improved estimates can be obtained by forming a weighted mixture of the multiple mean field solutions. Simple approximate expressions for the mixture weights are given. The general formalism derived so far is evaluated for the special case of Bayesian networks. The benefits of taking into account multiple solutions are demonstrated by using MFT for inference in a small and in a very large Bayesian network. The results are compared to the exact results.

### Citations

9134 |
Elements of Information Theory
- Cover, Thomas
- 1991
Citation Context: ...ying these constraints -- it is `as close as possible' to the given intractable distribution P(X). As a measure of distance between P(X) and Q(X) we use the cross entropy (Kullback-Leibler distance) [18]

$$D(Q\|P) = \sum_{x \in H} Q(x) \log \frac{Q(x)}{P(x)} = \left\langle \log \frac{Q(X)}{P(X)} \right\rangle_{Q(X)}. \quad (1)$$

Note that this distance is not symmetric in P and Q and that, with even more justification, we might have used $D(P\|Q) = \sum_{x \in H} P(x) \ldots$
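The asymmetry of the Kullback-Leibler distance noted in the context above is easy to check numerically. The two-state distributions `P` and `Q` below are made up purely for the illustration:

```python
import math

# Hypothetical two-state distributions, chosen only to show that
# the Kullback-Leibler distance of Eq. (1) is not symmetric.
P = [0.9, 0.1]
Q = [0.5, 0.5]

def kl(a, b):
    """D(a || b) = sum_x a(x) * log(a(x) / b(x))."""
    return sum(ax * math.log(ax / bx) for ax, bx in zip(a, b) if ax > 0)

d_qp = kl(Q, P)  # D(Q || P), the direction used in the paper
d_pq = kl(P, Q)  # D(P || Q)
```

Here `d_qp` is about 0.511 while `d_pq` is about 0.368, so the choice of direction genuinely matters for the approximation.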

7440 |
Probabilistic Reasoning in Intelligent Systems: Networks of Plausible Inference
- Pearl
- 1988
Citation Context: ...using a probabilistic setting in many applied fields where uncertainty plays a prominent role -- such as image processing, neural networks and artificial intelligence -- have become increasingly apparent [1]. Unfortunately, probabilistic solutions often require involved computation [2] and further progress is closely related to the development of methods for the efficient handling of probability distributi...

1907 |
Introduction to the theory of neural computation
- Hertz, Krogh, et al.
- 1991
Citation Context: ...nteracting particles. Many different facets of MFT can be found in fields as different as relativistic nuclear physics [19, 20], statistical physics [3, 4, 21], image processing [7] and neural networks [22, 23, 24, 8]. As a consequence, there exist a number of ways to derive mean field equations. Following the above discussion we define as mean field approximation the distribution Q(X) which is closest to P(X) using di...

1342 |
Local computations with probabilities on graphical structures and their application to expert systems
- Lauritzen, Spiegelhalter
- 1988
Citation Context: ...f applications in other areas as well [5, 6, 7, 8]. We present MFT in a generic way in the context of graphical models, which are a general framework for dealing with uncertainty in dependency models [1, 9, 10, 11]. The use of MFT in the context of graphical models was pioneered by Jordan, Saul and Jaakkola [12, 13]. In our paper we develop this approach in two new directions. First, in contrast to previous work...

969 |
An introduction to Bayesian Networks
- Jensen
- 1996
Citation Context: ...our results (the mean field equations (11) and the mixture weights (15)) are very general. We will now focus on a particular parameterization of a probability distribution, namely, on Bayesian networks [1, 25]. A Bayesian network has an expansion of the form

$$P(X) = \prod_i P(X_i \mid X_1, \ldots, X_{i-1}) = \prod_i P(X_i \mid \pi_i), \quad (16)$$

where in a typical Bayesian network every variable $X_i$ has only a small set of `pare...
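The factorization in the context above (the joint as a product of local conditional tables) can be illustrated with a hypothetical two-node network A -> B; the tables below are invented for the sketch:

```python
# A minimal illustration of the expansion in Eq. (16): the joint
# distribution of a Bayesian network is the product of the local
# conditional tables P(X_i | parents_i).
P_A = {0: 0.7, 1: 0.3}
P_B_given_A = {0: {0: 0.9, 1: 0.1},   # P(B | A=0)
               1: {0: 0.2, 1: 0.8}}   # P(B | A=1)

def joint(a, b):
    """P(A=a, B=b) = P(A=a) * P(B=b | A=a)."""
    return P_A[a] * P_B_given_A[a][b]

# The factored joint is a proper distribution: it sums to one.
total = sum(joint(a, b) for a in (0, 1) for b in (0, 1))
```

This locality (each factor touches only a variable and its parents) is what makes the mean field equations a local, and thus simple, update rule on such networks.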

604 | The computational complexity of probabilistic inference using Bayesian belief networks - Cooper - 1990 |

465 |
Graphical models in applied multivariate statistics
- Whittaker
- 1990
Citation Context: ...f applications in other areas as well [5, 6, 7, 8]. We present MFT in a generic way in the context of graphical models, which are a general framework for dealing with uncertainty in dependency models [1, 9, 10, 11]. The use of MFT in the context of graphical models was pioneered by Jordan, Saul and Jaakkola [12, 13]. In our paper we develop this approach in two new directions. First, in contrast to previous work...

293 |
Bayesian updating in causal probabilistic networks by local computations
- Jensen, Lauritzen, et al.
- 1990
Citation Context: ...f applications in other areas as well [5, 6, 7, 8]. We present MFT in a generic way in the context of graphical models, which are a general framework for dealing with uncertainty in dependency models [1, 9, 10, 11]. The use of MFT in the context of graphical models was pioneered by Jordan, Saul and Jaakkola [12, 13]. In our paper we develop this approach in two new directions. First, in contrast to previous work...

249 |
Statistical Field Theory
- Parisi
- 1988
Citation Context: ...er is to extend the concept of using mean field theory (MFT) as a systematic approach for approximating probability distributions. MFT is widely used in physics, in particular, in statistical mechanics [3, 4] and has found a number of applications in other areas as well [5, 6, 7, 8]. We present MFT in a generic way in the context of graphical models, which are a general framework for dealing with uncertai...

203 | Pairwise data clustering by deterministic annealing
- Hofmann, Buhmann
- 1997
Citation Context: ...matic approach for approximating probability distributions. MFT is widely used in physics, in particular, in statistical mechanics [3, 4] and has found a number of applications in other areas as well [5, 6, 7, 8]. We present MFT in a generic way in the context of graphical models, which are a general framework for dealing with uncertainty in dependency models [1, 9, 10, 11]. The use of MFT in the context of g...

149 |
A mean-field theory learning algorithm for neural networks
- Peterson, Anderson
- 1987
Citation Context: ...matic approach for approximating probability distributions. MFT is widely used in physics, in particular, in statistical mechanics [3, 4] and has found a number of applications in other areas as well [5, 6, 7, 8]. We present MFT in a generic way in the context of graphical models, which are a general framework for dealing with uncertainty in dependency models [1, 9, 10, 11]. The use of MFT in the context of g...

126 | Mean field theory for sigmoid belief networks
- Saul, Jaakkola, et al.
- 1996
Citation Context: ...cal models, which are a general framework for dealing with uncertainty in dependency models [1, 9, 10, 11]. The use of MFT in the context of graphical models was pioneered by Jordan, Saul and Jaakkola [12, 13]. In our paper we develop this approach in two new directions. First, in contrast to previous work we develop a systematic approach to MFT without reference to a particular model but instead work in a...

106 |
The theory of critical phenomena
- Binney, Dowrick, et al.
- 1986
Citation Context: ...er is to extend the concept of using mean field theory (MFT) as a systematic approach for approximating probability distributions. MFT is widely used in physics, in particular, in statistical mechanics [3, 4] and has found a number of applications in other areas as well [5, 6, 7, 8]. We present MFT in a generic way in the context of graphical models, which are a general framework for dealing with uncertai...

98 |
Colloquium on spin glasses
- BINDER, KINZEL, et al.
- 1983
Citation Context: ...cal physics and is used to describe systems of many interacting particles. Many different facets of MFT can be found in fields as different as relativistic nuclear physics [19, 20], statistical physics [3, 4, 21], image processing [7] and neural networks [22, 23, 24, 8]. As a consequence, there exist a number of ways to derive mean field equations. Following the above discussion we define as mean field approximatio...

41 | Improving the mean field approximation via the use of mixture distributions
- Jaakkola, Jordan
- 1997
Citation Context: ...new contribution of this paper is to address the problem of multiple solutions of the mean field equations. Coping with multiple solutions was originally discussed in [14] and simultaneously in [15, 16, 17]. We show that in the case of multiple solutions, a weighted mixture of these solutions leads to reasonable estimates of expected values. Approximate and very plausible mixing parameters are derived. ...

38 |
Explorations of the Mean Field Theory Learning Algorithm
- Peterson, Hartman
- 1989
Citation Context: ...matic approach for approximating probability distributions. MFT is widely used in physics, in particular, in statistical mechanics [3, 4] and has found a number of applications in other areas as well [5, 6, 7, 8]. We present MFT in a generic way in the context of graphical models, which are a general framework for dealing with uncertainty in dependency models [1, 9, 10, 11]. The use of MFT in the context of g...

30 | Approximating posterior distributions in belief networks using mixtures
- Bishop, Lawrence, et al.
- 1998
Citation Context: ...nd new contribution of this paper is to address the problem of multiple solutions of the mean field equations. Coping with multiple solutions was originally discussed in [14] and simultaneously in [15, 16, 17]. We show that in the case of multiple solutions, a weighted mixture of these solutions leads to reasonable estimates of expected values. Approximate and very plausible mixing parameters are derived. ...

17 | Variational learning in nonlinear Gaussian belief networks
- Frey, Hinton
- 1999
Citation Context: ...model-independent in so far as you may use the resulting mean field equation (7) in arbitrary probabilistic domains. The only restriction is that the variables have to be discrete. (See Frey and Hinton [27] for an example of mean field theory in the case of continuous hidden variables.) As illustrated in our experiments, our approach can be used for approximate propagation of evidence (inference). Thereby...

10 |
Nonlinear neural networks
- Hemmen, Kühn
- 1986
Citation Context: ...nteracting particles. Many different facets of MFT can be found in fields as different as relativistic nuclear physics [19, 20], statistical physics [3, 4, 21], image processing [7] and neural networks [22, 23, 24, 8]. As a consequence, there exist a number of ways to derive mean field equations. Following the above discussion we define as mean field approximation the distribution Q(X) which is closest to P(X) using di...

7 |
Robust "topological" codes by keeping control of internal redundancy
- Haft
- 1998

7 | Mixture representations for inference and learning in Boltzmann machines
- Lawrence, Bishop, et al.
- 1998
Citation Context: ...nd new contribution of this paper is to address the problem of multiple solutions of the mean field equations. Coping with multiple solutions was originally discussed in [14] and simultaneously in [15, 16, 17]. We show that in the case of multiple solutions, a weighted mixture of these solutions leads to reasonable estimates of expected values. Approximate and very plausible mixing parameters are derived. ...

3 |
Collective phenomena in neural networks
- Hemmen, Kuhn
- 1991
Citation Context: ...nteracting particles. Many different facets of MFT can be found in fields as different as relativistic nuclear physics [19, 20], statistical physics [3, 4, 21], image processing [7] and neural networks [22, 23, 24, 8]. As a consequence, there exist a number of ways to derive mean field equations. Following the above discussion we define as mean field approximation the distribution Q(X) which is closest to P(X) using di...

2 |
Exploiting tractable substructures in intractable networks
- Saul, Jordan
- 1995
Citation Context: ...cal models, which are a general framework for dealing with uncertainty in dependency models [1, 9, 10, 11]. The use of MFT in the context of graphical models was pioneered by Jordan, Saul and Jaakkola [12, 13]. In our paper we develop this approach in two new directions. First, in contrast to previous work we develop a systematic approach to MFT without reference to a particular model but instead work in a...

2 |
Relativistic mean field calculations of Λ- and Σ-hypernuclei
- Glendenning, von-Eiff, et al.
- 1993
Citation Context: ...T is a concept from theoretical physics and is used to describe systems of many interacting particles. Many different facets of MFT can be found in fields as different as relativistic nuclear physics [19, 20], statistical physics [3, 4, 21], image processing [7] and neural networks [22, 23, 24, 8]. As a consequence, there exist a number of ways to derive mean field equations. Following the above discussio...

1 |
Model-independent mean field theory as a local method for approximate propagation of information
- Haft, Hofmann, et al.
- 1997
Citation Context: ...teracting modules. The second new contribution of this paper is to address the problem of multiple solutions of the mean field equations. Coping with multiple solutions was originally discussed in [14] and simultaneously in [15, 16, 17]. We show that in the case of multiple solutions, a weighted mixture of these solutions leads to reasonable estimates of expected values. Approximate and very plausi...

1 |
Defining structure and conditional probabilities of Bayesian networks in graphical form
- Hofmann, Haft
- 1999
Citation Context: ...a, red eyes, fever, sore throat and red tongue), that is, from cause to effect. Arcs point from the positive finding of a disease to that state of a symptom which is typically present given the disease [26]. Note that the variables are not just binary. Plausible values for the conditional probabilities of that network have been estimated by consulting a textbook on children's diseases. network (namely...

1 |
Relativistic mean £eld calculations of Λ- and
- Glendenning, von-Eiff, et al.
- 1993
(Show Context)
Citation Context ...FT is a concept from theoretical physics and is used to describe systems of many interacting particles. Many different facets of MFT can be found in £elds as different as relativistic nuclear physics =-=[19, 20]-=-, statistical physics [3, 4, 21], image processing [7] and neural networks [22, 23, 24, 8]. As a consequence, there exist a number of ways to derive mean £eld equations. Following the above discussion... |