Probabilistic Inference Using Markov Chain Monte Carlo Methods (1993)

Download Links

  • [www.cs.princeton.edu]
  • [www.cs.princeton.edu]
  • [omega.albany.edu:8008]
  • [personalrobotics.ri.cmu.edu]
  • [varsha.ece.ogi.edu]

by Radford M. Neal
Citations: 736 (24 self)

BibTeX

@MISC{Neal93probabilisticinference,
    author = {Radford M. Neal},
    title = {Probabilistic Inference Using Markov Chain Monte Carlo Methods},
    year = {1993}
}

Abstract

Probabilistic inference is an attractive approach to uncertain reasoning and empirical learning in artificial intelligence. Computational difficulties arise, however, because probabilistic models with the necessary realism and flexibility lead to complex distributions over high-dimensional spaces. Related problems in other fields have been tackled using Monte Carlo methods based on sampling using Markov chains, providing a rich array of techniques that can be applied to problems in artificial intelligence. The "Metropolis algorithm" has been used to solve difficult problems in statistical physics for over forty years, and, in the last few years, the related method of "Gibbs sampling" has been applied to problems of statistical inference. Concurrently, an alternative method for solving problems in statistical physics by means of dynamical simulation has been developed as well, and has recently been unified with the Metropolis algorithm to produce the "hybrid Monte Carlo" method. In computer science, Markov chain sampling is the basis of the heuristic optimization technique of "simulated annealing", and has recently been used in randomized algorithms for approximate counting of large sets. In this review, I outline the role of probabilistic inference in artificial intelligence, present the theory of Markov chains, and describe various Markov chain Monte Carlo algorithms, along with a number of supporting techniques. I try to present a comprehensive picture of the range of methods that have been developed, including techniques from the varied literature that have not yet seen wide application in artificial intelligence, but which appear relevant. As illustrative examples, I use the problems of probabilistic inference in expert systems, discovery of latent classes from data, and Bayesian learning for neural networks.
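For concreteness, the Metropolis algorithm referred to in the abstract can be stated in a few lines. The following is a minimal random-walk Metropolis sketch in Python, not code from the paper: the one-dimensional standard-normal target, the Gaussian proposal, and the tuning constants are assumptions made purely for illustration.

import math
import random

def metropolis(log_prob, x0, proposal_sd=1.0, n_steps=10000):
    # Random-walk Metropolis sampler for a one-dimensional target.
    # log_prob returns the log of the (possibly unnormalized) target density.
    x, lp = x0, log_prob(x0)
    samples = []
    for _ in range(n_steps):
        # Propose a symmetric Gaussian random-walk move from the current state.
        x_prop = x + random.gauss(0.0, proposal_sd)
        lp_prop = log_prob(x_prop)
        # Accept with probability min(1, pi(x_prop) / pi(x)); otherwise keep the current state.
        if lp_prop >= lp or random.random() < math.exp(lp_prop - lp):
            x, lp = x_prop, lp_prop
        samples.append(x)
    return samples

# Example: sample from a standard normal target, pi(x) proportional to exp(-x^2 / 2).
draws = metropolis(lambda x: -0.5 * x * x, x0=0.0)

The states visited by the chain are kept whether or not the proposal is accepted, so the stationary distribution of the simulated Markov chain is the target density.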

Keyphrases

artificial intelligence, probabilistic inference, Metropolis algorithm, Markov chain, statistical physics, varied literature, related method, high-dimensional space, alternative method, approximate counting, Markov chain sampling, randomized algorithm, last few years, heuristic optimization technique, dynamical simulation, attractive approach, neural network, forty years, Monte Carlo method, computer science, hybrid Monte Carlo method, computational difficulty, difficult problem, complex distribution, latent class, rich array, wide application, large set, necessary realism, Bayesian learning, illustrative example, expert system, probabilistic model, empirical learning, comprehensive picture, statistical inference
