## Population Markov Chain Monte Carlo (2003)

### Download Links

- [ite.gmu.edu]
- [www.cs.bham.ac.uk]
- [www.ics.uci.edu]
- DBLP

### Other Repositories/Bibliography

Venue: Machine Learning

Citations: 12 (2 self)

### BibTeX

@ARTICLE{Laskey03populationmarkov,
  author  = {Kathryn Blackmond Laskey and James Myers},
  title   = {Population Markov Chain Monte Carlo},
  journal = {Machine Learning},
  year    = {2003},
  pages   = {175--196}
}

### Abstract

Stochastic search algorithms inspired by physical and biological systems are applied to the problem of learning directed graphical probability models in the presence of missing observations and hidden variables. For this class of problems, deterministic search algorithms tend to halt at local optima, requiring random restarts to obtain solutions of acceptable quality. We compare three stochastic search algorithms: a Metropolis-Hastings Sampler (MHS), an Evolutionary Algorithm (EA), and a new hybrid algorithm called Population Markov Chain Monte Carlo, or popMCMC. PopMCMC uses statistical information from a population of MHSs to inform the proposal distributions of the individual samplers in the population. Experimental results show that popMCMC and EAs learn more efficiently than an MHS with no information exchange. Populations of MCMC samplers exhibit more diversity than populations evolving under EAs that do not satisfy physics-inspired local reversibility conditions.

Keywords: Markov Chain Monte Carlo, Metropolis-Hastings Algorithm, Graphical Probabilistic Models, Bayesian Networks, Bayesian Learning, Evolutionary Algorithms
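The core idea — a population of Metropolis-Hastings samplers whose proposal distributions are informed by population statistics — can be sketched in a few lines. The sketch below is illustrative only and is not the paper's exact scheme: it couples the chains through a hypothetical rule that scales each chain's Gaussian proposal by the population's empirical spread, and it samples a simple one-dimensional target rather than a graphical model.

```python
import math
import random

def metropolis_hastings_step(x, log_target, propose):
    """One Metropolis-Hastings update with a symmetric proposal."""
    x_new = propose(x)
    # For a symmetric proposal, accept with probability min(1, pi(x_new)/pi(x)).
    if math.log(random.random()) < log_target(x_new) - log_target(x):
        return x_new
    return x

def pop_mcmc(log_target, init_pop, n_iters, fallback_scale=0.5):
    """Population of MH samplers. Each chain proposes a Gaussian step whose
    scale is the population's current standard deviation -- an illustrative
    stand-in for popMCMC's information exchange, not the published method."""
    pop = list(init_pop)
    for _ in range(n_iters):
        mean = sum(pop) / len(pop)
        spread = math.sqrt(sum((x - mean) ** 2 for x in pop) / len(pop))
        scale = spread or fallback_scale  # avoid a degenerate zero-width proposal

        def propose(x, s=scale):
            return x + random.gauss(0.0, s)

        pop = [metropolis_hastings_step(x, log_target, propose) for x in pop]
    return pop

if __name__ == "__main__":
    random.seed(1)
    # Target: standard normal, log pi(x) = -x^2/2 up to a constant.
    init = [random.uniform(-3.0, 3.0) for _ in range(20)]
    final_pop = pop_mcmc(lambda x: -0.5 * x * x, init, n_iters=500)
    print(sum(final_pop) / len(final_pop))
```

After enough iterations the population behaves like a set of draws from the target, so its mean hovers near zero. Note that making one chain's proposal depend on the other chains' states is exactly where the paper's local reversibility conditions become relevant; this sketch ignores those subtleties for brevity.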