Results 1–10 of 65
Probabilistic Inference Using Markov Chain Monte Carlo Methods, 1993
Cited by 567 (20 self)
Probabilistic inference is an attractive approach to uncertain reasoning and empirical learning in artificial intelligence. Computational difficulties arise, however, because probabilistic models with the necessary realism and flexibility lead to complex distributions over high-dimensional spaces. Related problems in other fields have been tackled using Monte Carlo methods based on sampling using Markov chains, providing a rich array of techniques that can be applied to problems in artificial intelligence. The "Metropolis algorithm" has been used to solve difficult problems in statistical physics for over forty years, and, in the last few years, the related method of "Gibbs sampling" has been applied to problems of statistical inference. Concurrently, an alternative method for solving problems in statistical physics by means of dynamical simulation has been developed as well, and has recently been unified with the Metropolis algorithm to produce the "hybrid Monte Carlo" method. In computer science, Markov chain sampling is the basis of the heuristic optimization technique of "simulated annealing", and has recently been used in randomized algorithms for approximate counting of large sets. In this review, I outline the role of probabilistic inference in artificial intelligence, present the theory of Markov chains, and describe various Markov chain Monte Carlo algorithms, along with a number of supporting techniques. I try to present a comprehensive picture of the range of methods that have been developed, including techniques from the varied literature that have not yet seen wide application in artificial intelligence, but which appear relevant. As illustrative examples, I use the problems of probabilistic inference in expert systems, discovery of latent classes from data, and Bayesian learning for neural networks.
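The Metropolis step this abstract refers to, accepting a symmetric proposal with probability min(1, p(x')/p(x)), can be sketched in a few lines. This is a toy illustration, not code from the review; the standard-normal target, step size, and burn-in length are all assumptions:

```python
import numpy as np

def metropolis(log_p, x0, n_steps, step=1.0, rng=None):
    """Random-walk Metropolis: sample from a density known only up to a constant."""
    rng = np.random.default_rng(rng)
    x = float(x0)
    samples = np.empty(n_steps)
    for i in range(n_steps):
        proposal = x + step * rng.normal()  # symmetric random-walk proposal
        # Accept with probability min(1, p(proposal)/p(x)), done in log space.
        if np.log(rng.uniform()) < log_p(proposal) - log_p(x):
            x = proposal
        samples[i] = x
    return samples

# Target: standard normal, whose log-density is -x^2/2 up to a constant.
draws = metropolis(lambda x: -0.5 * x * x, x0=0.0, n_steps=20000, rng=1)
print(draws[5000:].mean(), draws[5000:].std())  # roughly 0 and 1 after burn-in
```

Discarding the first few thousand draws as burn-in is the usual precaution: the chain starts at an arbitrary point and needs time to reach its stationary distribution.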
From Laplace to Supernova SN 1987A: Bayesian Inference in Astrophysics, 1990
Cited by 51 (2 self)
The Bayesian approach to probability theory is presented as an alternative to the currently used long-run relative frequency approach, which does not offer clear, compelling criteria for the design of statistical methods. Bayesian probability theory offers unique and demonstrably optimal solutions to well-posed statistical problems, and is historically the original approach to statistics. The reasons for earlier rejection of Bayesian methods are discussed, and it is noted that the work of Cox, Jaynes, and others answers earlier objections, giving Bayesian inference a firm logical and mathematical foundation as the correct mathematical language for quantifying uncertainty. The Bayesian approaches to parameter estimation and model comparison are outlined and illustrated by application to a simple problem based on the Gaussian distribution. As further illustrations of the Bayesian paradigm, Bayesian solutions to two interesting astrophysical problems are outlined: the measurement of wea...
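The abstract's illustrative problem, parameter estimation based on the Gaussian distribution, has a closed-form Bayesian answer when the variance is known: with a normal prior on the mean, precisions add and the posterior mean is a precision-weighted average. A minimal sketch; the data, prior parameters, and known sigma below are assumptions, not values from the paper:

```python
import numpy as np

def posterior_mean_normal(x, sigma, mu0, tau0):
    """Posterior for a Gaussian mean: known sigma, N(mu0, tau0^2) prior.
    Returns the posterior mean and standard deviation (also Gaussian)."""
    n = len(x)
    prec = 1.0 / tau0**2 + n / sigma**2  # posterior precision: precisions add
    mu_n = (mu0 / tau0**2 + x.sum() / sigma**2) / prec
    return mu_n, np.sqrt(1.0 / prec)

rng = np.random.default_rng(0)
x = rng.normal(3.0, 1.0, size=100)  # simulated data with true mean 3
mu_n, sd_n = posterior_mean_normal(x, sigma=1.0, mu0=0.0, tau0=10.0)
print(mu_n, sd_n)  # posterior concentrates near the sample mean
```

With a weak prior (large tau0), the posterior mean is nearly the sample mean and the posterior standard deviation is nearly sigma/sqrt(n), matching the frequentist answer in this simple case.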
Bayesian Treed Gaussian Process Models with an Application to Computer Modeling, Journal of the American Statistical Association, 2007
Cited by 44 (15 self)
This paper explores nonparametric and semiparametric nonstationary modeling methodologies that couple stationary Gaussian processes and (limiting) linear models with treed partitioning. Partitioning is a simple but effective method for dealing with nonstationarity. Mixing between full Gaussian processes and simple linear models can yield a more parsimonious spatial model while significantly reducing computational effort. The methodological developments and statistical computing details which make this approach efficient are described in detail. Illustrations of our model are given for both synthetic and real datasets.
Key words: recursive partitioning, nonstationary spatial model, nonparametric regression, Bayesian model averaging
Bayesian Estimation and Testing of Structural Equation Models, Psychometrika, 1999
Cited by 27 (8 self)
The Gibbs sampler can be used to obtain samples of arbitrary size from the posterior distribution over the parameters of a structural equation model (SEM) given covariance data and a prior distribution over the parameters. Point estimates, standard deviations and interval estimates for the parameters can be computed from these samples. If the prior distribution over the parameters is uninformative, the posterior is proportional to the likelihood, and asymptotically the inferences based on the Gibbs sample are the same as those based on the maximum likelihood solution, e.g., output from LISREL or EQS. In small samples, however, the likelihood surface is not Gaussian and in some cases contains local maxima. Nevertheless, the Gibbs sample comes from the correct posterior distribution over the parameters regardless of the sample size and the shape of the likelihood surface. With an informative prior distribution over the parameters, the posterior can be used to make inferences about the parameters of underidentified models, as we illustrate on a simple errors-in-variables model.
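The Gibbs sampler alternates draws from each parameter's full conditional distribution. A minimal sketch on a toy target, a zero-mean bivariate normal with correlation rho, whose full conditionals are themselves normal; this stands in for the SEM posterior the paper treats, and rho and the chain length are assumptions:

```python
import numpy as np

def gibbs_bivariate_normal(rho, n_steps, rng=None):
    """Gibbs sampling: alternate draws from the two full conditionals of a
    zero-mean bivariate normal with unit variances and correlation rho."""
    rng = np.random.default_rng(rng)
    x = y = 0.0
    out = np.empty((n_steps, 2))
    sd = np.sqrt(1.0 - rho**2)  # conditional standard deviation
    for i in range(n_steps):
        x = rng.normal(rho * y, sd)  # x | y ~ N(rho*y, 1 - rho^2)
        y = rng.normal(rho * x, sd)  # y | x ~ N(rho*x, 1 - rho^2)
        out[i] = x, y
    return out

draws = gibbs_bivariate_normal(rho=0.8, n_steps=50000, rng=0)
print(np.corrcoef(draws[1000:].T)[0, 1])  # close to 0.8
```

The same alternating scheme applies whenever each block of parameters has a tractable conditional given the rest, which is what makes it attractive for SEM posteriors.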
A Bayesian Approach to Blind Source Separation, 1999
Cited by 20 (6 self)
This paper adopts a Bayesian statistical approach and a linear synthesis model
A Computational Approach to Bayesian Inference, 1996
Cited by 17 (14 self)
Although the Bayesian approach provides a complete solution to model-based analysis, it is often difficult to obtain closed-form solutions for complex models. However, numerical solutions to Bayesian modeling problems are now becoming attractive because of the advent of powerful, low-cost computers and new algorithms. We describe a general-purpose implementation of the Bayesian methodology on workstations that can deal with complex nonlinear models in a very flexible way. The models are represented by a dataflow diagram that may be manipulated by the analyst through a graphical-programming environment that is based on a fully object-oriented design. Maximum a posteriori solutions are achieved using a general optimization algorithm. A new technique for estimating and visualizing the uncertainties in specific aspects of the model is incorporated.
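Maximum a posteriori estimation of the kind described, direct numerical minimization of the negative log posterior, can be illustrated on a one-dimensional toy model: a Poisson rate with a Gamma prior. The model, prior parameters, and the crude grid minimizer are assumptions standing in for the paper's general optimization algorithm:

```python
import numpy as np

def neg_log_posterior(lam, x, a=2.0, b=1.0):
    """-log p(lambda | x) up to a constant: Poisson likelihood, Gamma(a, b) prior."""
    return len(x) * lam - np.sum(x) * np.log(lam) - (a - 1) * np.log(lam) + b * lam

# MAP estimate by direct numerical minimization (a grid here; any general
# optimizer would do). This toy model also has the analytic answer
# (a - 1 + sum(x)) / (b + n) to check against: 19/6 ~ 3.1667.
x = np.array([3, 4, 2, 5, 4])
grid = np.linspace(0.01, 10, 100000)
lam_map = grid[np.argmin(neg_log_posterior(grid, x))]
print(lam_map)
```

For models without a closed form, the same recipe applies: write down the negative log posterior and hand it to an optimizer; only the checkable analytic answer is special to this toy case.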
Correlated Bayesian Factor Analysis, 1998
Cited by 16 (7 self)
Factor analysis is a method in multivariate statistical analysis that can help scientists determine which variables to study in a field and their relationships. We extend the Bayesian approach to factor analysis developed in 1989 by Press and Shigemasu (henceforth PS89) and revised in 1997 to model correlated observation vectors, factor score vectors, and factor loadings. Further, we place a prior distribution on the number of factors and obtain posterior estimates. Hitherto,
Settings in social networks: A measurement model, Sociological Methodology, 2003
Cited by 12 (1 self)
A class of statistical models is proposed which aims to recover latent settings structures in social networks. Settings may be regarded as clusters of vertices. The measurement model builds on two assumptions. The observed network is assumed to be generated by hierarchically nested latent transitive structures, expressed by ultrametrics. It is assumed that expected tie strength decreases with ultrametric distance. The approach could be described as model-based clustering with an ultrametric space as the underlying metric to capture the dependence in the observations. Maximum likelihood methods as well as Bayesian methods are applied for statistical inference. Both approaches are implemented using Markov chain Monte Carlo methods.
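The ultrametric in this measurement model assigns two vertices the level of the smallest nested cluster containing both. A minimal sketch, with the hierarchy represented as a list of partitions from finest to coarsest; the representation and the toy data are assumptions, not the paper's notation:

```python
def ultrametric_distance(u, v, hierarchy):
    """Ultrametric distance between vertices u and v, where hierarchy[k] is
    the partition at level k (0 = finest), each a list of frozensets.
    The distance is the lowest level whose partition joins u and v."""
    for level, partition in enumerate(hierarchy):
        for cluster in partition:
            if u in cluster and v in cluster:
                return level
    raise ValueError("vertices are never joined in this hierarchy")

# Toy hierarchy on vertices {a, b, c, d}: {a,b} and {c,d} merge first,
# then everything merges at the top level.
hierarchy = [
    [frozenset("a"), frozenset("b"), frozenset("c"), frozenset("d")],
    [frozenset("ab"), frozenset("cd")],
    [frozenset("abcd")],
]
print(ultrametric_distance("a", "b", hierarchy))  # 1
print(ultrametric_distance("a", "c", hierarchy))  # 2
```

Note the defining property holds: d(a, c) equals max(d(a, b), d(b, c)), the "strong" triangle inequality that nested partitions induce.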
Liquidity in the Futures Pits: Inferring Market Dynamics from Incomplete Data, 2003
Cited by 10 (2 self)
Motivated by economic models of sequential trade, empirical analyses of market dynamics frequently estimate liquidity as the coefficient of signed order flow in a price-change regression. This paper implements such an analysis for futures transaction data from pit trading. To deal with the absence of timely bid and ask quotes (which are used to sign trades in most equity-market studies), this paper proposes new techniques based on Markov chain Monte Carlo estimation. The model is estimated for four representative Chicago Mercantile Exchange contracts. The highest liquidity (lowest order flow coefficient) is found for the S&P index. Liquidity for the Euro and UK £ contracts is somewhat lower. The pork belly contract exhibits the least liquidity.
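The liquidity estimate described, the coefficient of signed order flow in a price-change regression, reduces to an OLS slope when trade signs are observed. A minimal sketch on simulated trades; the data-generating values are assumptions, and the paper's actual contribution, estimating this coefficient when trade signs are unobserved, is not attempted here:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000
# Simulated signed order flow (buys positive, sells negative) and the
# price changes it induces; a smaller slope means deeper liquidity.
signed_flow = rng.choice([-1.0, 1.0], size=n) * rng.exponential(2.0, size=n)
true_lambda = 0.05  # assumed order-flow (illiquidity) coefficient
price_change = true_lambda * signed_flow + rng.normal(0.0, 0.1, size=n)

# OLS regression of price change on signed order flow, with an intercept.
X = np.column_stack([np.ones(n), signed_flow])
beta, *_ = np.linalg.lstsq(X, price_change, rcond=None)
print(beta[1])  # close to 0.05
```

The MCMC machinery in the paper effectively averages this regression over the posterior distribution of the unobserved trade signs rather than conditioning on known ones.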
Process Pathway Inference via Time Series Analysis, Experimental Mechanics, 2003
Cited by 9 (3 self)
Motivated by recent experimental developments in functional genomics, we construct and test a numerical technique for inferring process pathways, in which one process calls another process, from time series data. We validate using a case in which data are readily available and formulate an extension, appropriate for genetic regulatory networks, which exploits Bayesian inference and in which the present-day undersampling is compensated for by prior understanding of genetic regulation.
Preprint number: NSFITP0247