Results 1–10 of 121
Probabilistic Inference Using Markov Chain Monte Carlo Methods
1993
Abstract

Cited by 740 (24 self)
Probabilistic inference is an attractive approach to uncertain reasoning and empirical learning in artificial intelligence. Computational difficulties arise, however, because probabilistic models with the necessary realism and flexibility lead to complex distributions over high-dimensional spaces. Related problems in other fields have been tackled using Monte Carlo methods based on sampling using Markov chains, providing a rich array of techniques that can be applied to problems in artificial intelligence. The "Metropolis algorithm" has been used to solve difficult problems in statistical physics for over forty years, and, in the last few years, the related method of "Gibbs sampling" has been applied to problems of statistical inference. Concurrently, an alternative method for solving problems in statistical physics by means of dynamical simulation has been developed as well, and has recently been unified with the Metropolis algorithm to produce the "hybrid Monte Carlo" method. In computer science, Markov chain sampling is the basis of the heuristic optimization technique of "simulated annealing", and has recently been used in randomized algorithms for approximate counting of large sets. In this review, I outline the role of probabilistic inference in artificial intelligence, present the theory of Markov chains, and describe various Markov chain Monte Carlo algorithms, along with a number of supporting techniques. I try to present a comprehensive picture of the range of methods that have been developed, including techniques from the varied literature that have not yet seen wide application in artificial intelligence, but which appear relevant. As illustrative examples, I use the problems of probabilistic inference in expert systems, discovery of latent classes from data, and Bayesian learning for neural networks.
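The Metropolis algorithm this review centers on fits in a few lines; the random-walk proposal, step size, and standard-normal target below are illustrative choices for a sketch, not details taken from the paper.

```python
import math
import random

def metropolis(log_target, x0, n_samples, step=0.5, seed=0):
    """Random-walk Metropolis: propose x' = x + N(0, step^2) and
    accept with probability min(1, p(x') / p(x))."""
    rng = random.Random(seed)
    x = x0
    samples = []
    for _ in range(n_samples):
        proposal = x + rng.gauss(0.0, step)
        # Compare in log space for numerical stability.
        if math.log(rng.random()) < log_target(proposal) - log_target(x):
            x = proposal
        samples.append(x)
    return samples

# Illustrative target: a standard normal, via its log density
# (up to an additive constant).
chain = metropolis(lambda x: -0.5 * x * x, x0=3.0, n_samples=20000)
kept = chain[5000:]  # discard burn-in
post_mean = sum(kept) / len(kept)
```

The same kernel works for any density known only up to a normalizing constant, which is exactly the situation in the posterior-inference problems the review surveys.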
Bayesian Treed Gaussian Process Models with an Application to Computer Modeling
Journal of the American Statistical Association, 2007
Abstract

Cited by 87 (19 self)
This paper explores nonparametric and semiparametric nonstationary modeling methodologies that couple stationary Gaussian processes and (limiting) linear models with treed partitioning. Partitioning is a simple but effective method for dealing with nonstationarity. Mixing between full Gaussian processes and simple linear models can yield a more parsimonious spatial model while significantly reducing computational effort. The methodological developments and statistical computing details which make this approach efficient are described in detail. Illustrations of our model are given for both synthetic and real datasets.
Key words: recursive partitioning, nonstationary spatial model, nonparametric regression, Bayesian model averaging
From Laplace to Supernova SN 1987A: Bayesian Inference in Astrophysics
1990
Abstract

Cited by 68 (2 self)
The Bayesian approach to probability theory is presented as an alternative to the currently used long-run relative frequency approach, which does not offer clear, compelling criteria for the design of statistical methods. Bayesian probability theory offers unique and demonstrably optimal solutions to well-posed statistical problems, and is historically the original approach to statistics. The reasons for earlier rejection of Bayesian methods are discussed, and it is noted that the work of Cox, Jaynes, and others answers earlier objections, giving Bayesian inference a firm logical and mathematical foundation as the correct mathematical language for quantifying uncertainty. The Bayesian approaches to parameter estimation and model comparison are outlined and illustrated by application to a simple problem based on the Gaussian distribution. As further illustrations of the Bayesian paradigm, Bayesian solutions to two interesting astrophysical problems are outlined: the measurement of wea...
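The kind of simple Gaussian estimation problem used for illustration admits a closed-form conjugate update; a minimal sketch, with a hypothetical prior and data values not taken from the paper:

```python
def normal_posterior(prior_mean, prior_var, data, noise_var):
    """Conjugate update for the mean of a Gaussian with known noise
    variance: the posterior over the mean is again Gaussian."""
    n = len(data)
    xbar = sum(data) / n
    post_prec = 1.0 / prior_var + n / noise_var   # precisions add
    post_var = 1.0 / post_prec
    post_mean = post_var * (prior_mean / prior_var + n * xbar / noise_var)
    return post_mean, post_var

# Vague prior, four observations with unit noise variance:
m, v = normal_posterior(0.0, 100.0, [1.8, 2.2, 2.1, 1.9], 1.0)
```

With a vague prior the posterior mean lands essentially on the sample mean, which is the sense in which Bayesian and frequentist answers agree when prior information is weak.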
Bayesian Estimation and Testing of Structural Equation Models
Psychometrika, 1999
Abstract

Cited by 44 (10 self)
The Gibbs sampler can be used to obtain samples of arbitrary size from the posterior distribution over the parameters of a structural equation model (SEM) given covariance data and a prior distribution over the parameters. Point estimates, standard deviations and interval estimates for the parameters can be computed from these samples. If the prior distribution over the parameters is uninformative, the posterior is proportional to the likelihood, and asymptotically the inferences based on the Gibbs sample are the same as those based on the maximum likelihood solution, e.g., output from LISREL or EQS. In small samples, however, the likelihood surface is not Gaussian and in some cases contains local maxima. Nevertheless, the Gibbs sample comes from the correct posterior distribution over the parameters regardless of the sample size and the shape of the likelihood surface. With an informative prior distribution over the parameters, the posterior can be used to make inferences about the parameters of under-identified models, as we illustrate on a simple errors-in-variables model.
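The Gibbs sampler draws each parameter in turn from its full conditional given the others. A minimal sketch on a bivariate normal, where both conditionals are known in closed form; this is a toy target chosen for illustration, not the SEM posterior of the paper:

```python
import random

def gibbs_bivariate_normal(rho, n_samples, seed=0):
    """Gibbs sampling for a standard bivariate normal with correlation
    rho: each full conditional is x_i | x_j ~ N(rho * x_j, 1 - rho^2)."""
    rng = random.Random(seed)
    x, y = 0.0, 0.0
    sd = (1.0 - rho * rho) ** 0.5
    out = []
    for _ in range(n_samples):
        x = rng.gauss(rho * y, sd)  # draw x given y
        y = rng.gauss(rho * x, sd)  # draw y given x
        out.append((x, y))
    return out

draws = gibbs_bivariate_normal(rho=0.8, n_samples=20000)
kept = draws[2000:]  # discard burn-in
```

The SEM application replaces these two conditionals with the full conditionals of the model's loadings and variances, but the alternating structure is the same.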
Liquidity in the Futures Pits: Inferring Market Dynamics from Incomplete Data
2003
Abstract

Cited by 29 (3 self)
Motivated by economic models of sequential trade, empirical analyses of market dynamics frequently estimate liquidity as the coefficient of signed order flow in a price-change regression. This paper implements such an analysis for futures transaction data from pit trading. To deal with the absence of timely bid and ask quotes (which are used to sign trades in most equity-market studies), this paper proposes new techniques based on Markov chain Monte Carlo estimation. The model is estimated for four representative Chicago Mercantile Exchange contracts. The highest liquidity (lowest order flow coefficient) is found for the S&P index. Liquidity for the Euro and UK £ contracts is somewhat lower. The pork belly contract exhibits the least liquidity.
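With the trade signs known, the liquidity measure described here reduces to an OLS slope of price changes on signed order flow; a minimal sketch with hypothetical tick data (the paper's contribution is a Bayesian MCMC procedure that also infers the unobserved signs, which this sketch does not attempt):

```python
def liquidity_coefficient(price_changes, signed_flow):
    """OLS slope of price changes on signed order flow; under the
    sequential-trade framing, a smaller slope means deeper liquidity."""
    n = len(signed_flow)
    mx = sum(signed_flow) / n
    my = sum(price_changes) / n
    sxy = sum((x - mx) * (y - my)
              for x, y in zip(signed_flow, price_changes))
    sxx = sum((x - mx) ** 2 for x in signed_flow)
    return sxy / sxx

# Hypothetical ticks, noise-free for illustration: dp = 0.05 * flow.
flow = [+3, -1, +2, -4, +1, -2]
dp = [0.05 * q for q in flow]
lam = liquidity_coefficient(dp, flow)
```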
A Bayesian Approach to Blind Source Separation
1999
Abstract

Cited by 23 (8 self)
This paper adopts a Bayesian statistical approach and a linear synthesis model
A Computational Approach to Bayesian Inference
1996
Abstract

Cited by 17 (14 self)
Although the Bayesian approach provides a complete solution to model-based analysis, it is often difficult to obtain closed-form solutions for complex models. However, numerical solutions to Bayesian modeling problems are now becoming attractive because of the advent of powerful, low-cost computers and new algorithms. We describe a general-purpose implementation of the Bayesian methodology on workstations that can deal with complex nonlinear models in a very flexible way. The models are represented by a data-flow diagram that may be manipulated by the analyst through a graphical-programming environment that is based on a fully object-oriented design. Maximum a posteriori solutions are achieved using a general optimization algorithm. A new technique for estimating and visualizing the uncertainties in specific aspects of the model is incorporated.
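A maximum a posteriori solution via a generic optimizer can be sketched as gradient ascent on the log posterior. The quadratic toy posterior below is an assumption chosen so the answer is checkable by hand; the paper's system uses its own general optimization algorithm:

```python
def map_estimate(grad_log_posterior, x0, lr=0.1, steps=200):
    """Gradient ascent on the log posterior: climb toward the mode."""
    x = x0
    for _ in range(steps):
        x += lr * grad_log_posterior(x)
    return x

# Toy log posterior (up to a constant): N(0, 1) prior times an N(3, 1)
# likelihood for one datum, so the mode sits halfway at x = 1.5.
grad = lambda x: -x - (x - 3.0)
x_map = map_estimate(grad, x0=0.0)
```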
Elicited Priors for Bayesian Model Specifications in Political Science Research
Journal of Politics, 2005
Abstract

Cited by 17 (4 self)
We explain how to use elicited priors in Bayesian political science research. These are a form of prior information produced by previous knowledge from structured interviews with subjective area experts who have little or no concern for the statistical aspects of the project. The purpose is to introduce qualitative and area-specific information into an empirical model in a systematic and organized manner in order to produce parsimonious yet realistic implications. Currently, there is no work in political science that articulates elicited priors in a Bayesian specification. We demonstrate the value of the approach by applying elicited priors to a problem in judicial comparative politics using data and elicitations we collected in Nicaragua. As quantitative political research becomes increasingly sophisticated, the more complex, but more capable, Bayesian approach is likely to grow in popularity. The Bayesian inferential engine is a coherent set of axioms that converts prior information to posterior evidence by conditioning on observed data. Thus, stipulating prior distributions for unknown quantities is a requirement, and this requirement has been a long-standing source of controversy. Bayesian statistics provides a number of ways to define prior information, and the strength of these
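One common elicitation device gives the flavor: map an expert's stated probability and an "equivalent sample size" expressing their confidence into a conjugate Beta prior. The numbers below are hypothetical, and this is not necessarily the construction these authors use:

```python
def beta_from_elicitation(expert_prob, equivalent_n):
    """Turn an elicited probability and an equivalent sample size into
    Beta(a, b) prior parameters with mean expert_prob."""
    a = expert_prob * equivalent_n
    b = (1.0 - expert_prob) * equivalent_n
    return a, b

def posterior_update(a, b, successes, failures):
    """The Beta prior is conjugate to binomial data: just add counts."""
    return a + successes, b + failures

a, b = beta_from_elicitation(0.7, 20)   # expert: ~70%, worth ~20 obs
a2, b2 = posterior_update(a, b, 5, 15)  # observed data disagree
post_mean = a2 / (a2 + b2)
```

Because the elicited prior carries an explicit weight (here 20 pseudo-observations), the analyst can report how strongly the expert's view pulls the posterior away from the data.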
Stock Market Participation and the Internet
Journal of Financial and Quantitative Analysis, 2008
Abstract

Cited by 17 (1 self)
In the U.S., individual participation in the stock market is lower than would be predicted given the risk-adjusted expected returns from holding stock. Theory indicates that certain types of frictions (transaction costs, information costs, etc.) could account for the lower-than-expected stock market participation rates. The technological development of the Internet in the 1990s reduced some of these frictions. As a result, we should expect a corresponding increase in stock market participation. Using panel data, this paper examines empirically the hypothesis that there was a fundamental change in stock market participation in the last decade and then links this change to the advent of the Internet. Unlike the recent literature that focuses on an increase in transactions for individuals that already have been participating in the stock market, this paper examines the rise in participation among households not previously participating in the market. The results indicate that households that are more comfortable using computers/Internet raised participation substantially more than households that are less comfortable using computers/Internet. In terms of the probability of participation, using a computer/Internet was equivalent to having over $20,500 in additional mean household income or over 2 more mean years of education. Notably, the increase in participation that we observe in the data comes from an older subset of the population, one that is least likely to increase stock market participation. In this sense, the findings in this paper can be interpreted as a lower bound on the actual effects of the Internet on stock market participation. (JEL: G10, G11, D14)
Risk and Uncertainty Analysis in Government Safety Decisions
Risk Analysis, 2002
Abstract

Cited by 17 (0 self)
Probabilistic risk analysis (PRA) can be an effective tool to assess risks and uncertainties and to set priorities among safety policy options. Based on systems analysis and on Bayesian probability, PRA has been applied to a wide range of cases, three of which are briefly presented here: the maintenance of the tiles of the space shuttle, the management of patient risk in anesthesia, and the choice of seismic provisions of building codes for the San Francisco Bay Area. In the quantification of a risk, a number of problems arise in the public sector where multiple stakeholders are involved. In this paper, I describe different approaches to the treatment of uncertainties in risk analysis, their implications for risk ranking, and the role of risk analysis results in the context of a safety decision process. I also discuss the implications of adopting conservative hypotheses before proceeding to what is, in essence, a conditional uncertainty analysis, and I explore some implications of different levels of "conservatism" for the ranking of risk mitigation measures.