Results 1–10 of 21
Gibbs Sampling Methods for Stick-Breaking Priors
Abstract

Cited by 213 (17 self)
... In this paper we present two general types of Gibbs samplers that can be used to fit posteriors of Bayesian hierarchical models based on stick-breaking priors. The first type of Gibbs sampler, referred to as a Pólya urn Gibbs sampler, is a generalized version of a widely used Gibbs sampling method currently employed for Dirichlet process computing. This method applies to stick-breaking priors with a known Pólya urn characterization, that is, priors with an explicit and simple prediction rule. Our second method, the blocked Gibbs sampler, is based on an entirely different approach that works by directly sampling values from the posterior of the random measure. The blocked Gibbs sampler can be viewed as a more general approach, as it works without requiring an explicit prediction rule. We find that the blocked Gibbs sampler avoids some of the limitations seen with the Pólya urn approach and should be simpler for nonexperts to use.
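For a concrete illustration of what an "explicit and simple prediction rule" means: when the stick-breaking prior is a Dirichlet process DP(α, G0), the Pólya urn characterization says the next draw is a fresh sample from G0 with probability α/(α+n) and a copy of a uniformly chosen earlier draw otherwise. A minimal sketch (the function name and defaults are ours, not from the paper):

```python
import random

def polya_urn_draws(n, alpha, base_draw, seed=None):
    """Draw n values from the Polya urn (Chinese restaurant) scheme
    implied by a Dirichlet process DP(alpha, G0).

    base_draw: a zero-argument function that samples from G0.
    """
    rng = random.Random(seed)
    draws = []
    for i in range(n):
        # With probability alpha/(alpha+i), sample fresh from G0;
        # otherwise copy a uniformly chosen earlier draw.
        if rng.random() < alpha / (alpha + i):
            draws.append(base_draw())
        else:
            draws.append(rng.choice(draws))
    return draws
```

With a large `alpha` nearly every draw is fresh; with a tiny `alpha` nearly every draw repeats the first value, which is the clustering behavior the Gibbs samplers exploit.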
Bayesian density regression
 Journal of the Royal Statistical Society B
, 2007
Abstract

Cited by 40 (23 self)
This article considers Bayesian methods for density regression, allowing a random probability distribution to change flexibly with multiple predictors. The conditional response distribution is expressed as a nonparametric mixture of parametric densities, with the mixture distribution changing according to location in the predictor space. A new class of priors for dependent random measures is proposed for the collection of random mixing measures at each location. The conditional prior for the random measure at a given location is expressed as a mixture of a Dirichlet process (DP) distributed innovation measure and neighboring random measures. This specification results in a coherent prior for the joint measure, with the marginal random measure at each location being a finite mixture of DP basis measures. Integrating out the infinite-dimensional collection of mixing measures, we obtain a simple expression for the conditional distribution of the subject-specific random variables, which generalizes the Pólya urn scheme. Properties are considered and a simple Gibbs sampling algorithm is developed for posterior computation. The methods are illustrated using simulated data examples and epidemiologic studies.
Nonparametric Bayesian Data Analysis
Abstract

Cited by 3 (0 self)
We review the current state of nonparametric Bayesian inference. The discussion follows a list of important statistical inference problems, including density estimation, regression, survival analysis, hierarchical models and model validation. For each inference problem we review relevant nonparametric Bayesian models and approaches, including Dirichlet process (DP) models and variations, Pólya trees, wavelet-based models, neural network models, spline regression, CART, dependent DP models, and model validation with DP and Pólya tree extensions of parametric models.
The Block Diagonal Infinite Hidden Markov Model
Abstract

Cited by 3 (2 self)
The Infinite Hidden Markov Model (IHMM) extends hidden Markov models to have a countably infinite number of hidden states (Beal et al., 2002; Teh et al., 2006). We present a generalization of this framework that introduces nearly block-diagonal structure in the transitions between the hidden states, where blocks correspond to “sub-behaviors” exhibited by data sequences. In identifying such structure, the model classifies, or partitions, sequence data according to these sub-behaviors in an unsupervised way. We present an application of this model to artificial data, a video gesture classification task, and a musical theme labeling task, and show that components of the model can also be applied to graph segmentation.
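To make "nearly block-diagonal" concrete, here is a finite-state caricature (not the authors' nonparametric prior): each state transitions within its own block with probability 1-ε and across blocks with probability ε. All names and the construction itself are our illustration:

```python
import numpy as np

def nearly_block_diagonal(block_sizes, eps=0.05, seed=None):
    """Build a row-stochastic transition matrix in which each state
    stays inside its block with probability 1-eps and jumps to another
    block with probability eps. A finite-state sketch of the transition
    structure the block-diagonal IHMM places a prior over.
    """
    rng = np.random.default_rng(seed)
    n = sum(block_sizes)
    P = np.zeros((n, n))
    blocks, start = [], 0
    for size in block_sizes:
        blocks.append(range(start, start + size))
        start += size
    for b in blocks:
        in_mask = np.zeros(n, dtype=bool)
        in_mask[list(b)] = True
        for i in b:
            row = rng.random(n)  # random positive weights
            # Normalize within-block and cross-block mass separately.
            P[i, in_mask] = (1 - eps) * row[in_mask] / row[in_mask].sum()
            if (~in_mask).any():
                P[i, ~in_mask] = eps * row[~in_mask] / row[~in_mask].sum()
            else:
                P[i, in_mask] *= 1 / (1 - eps)  # single block: renormalize
    return P
```

A sequence sampled from such a chain spends long stretches inside one block, which is exactly the "sub-behavior" signature the model is built to detect.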
Nonparametric empirical Bayes for the Dirichlet process mixture model
 Statistics and Computing
, 2004
Abstract

Cited by 2 (0 self)
The Dirichlet process prior allows flexible nonparametric mixture modeling. The number of mixture components is not specified in advance and can grow as new data come in. However, the behavior of the model is sensitive to the choice of the parameters, including an infinite-dimensional distributional parameter G0. Most previous applications have either fixed G0 as a member of a parametric family or treated G0 in a Bayesian fashion, using parametric prior specifications. In contrast, we have developed an adaptive nonparametric method for constructing smooth estimates of G0. We combine this method with a technique for estimating α, the other Dirichlet process parameter, that is inspired by an existing characterization of its maximum-likelihood estimator. Together, these estimation procedures yield a flexible empirical Bayes treatment of Dirichlet process mixtures. Such a treatment is useful in situations where smooth point estimates of G0 are of intrinsic interest, or where the structure of G0 cannot be conveniently modeled with the usual parametric prior families. Analysis of simulated and real-world datasets illustrates the robustness of this approach.
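The concentration parameter α governs how many distinct clusters appear: under the Chinese restaurant representation, E[K_n] = Σ_{i=0}^{n-1} α/(α+i). The paper builds on a maximum-likelihood characterization; the sketch below is a simpler moment-matching stand-in (not the paper's estimator) that inverts this identity by bisection:

```python
def expected_clusters(alpha, n):
    """E[number of distinct values] among n draws from DP(alpha, G0),
    via the Chinese restaurant representation."""
    return sum(alpha / (alpha + i) for i in range(n))

def estimate_alpha(k_obs, n, lo=1e-6, hi=1e6, tol=1e-8):
    """Moment-style estimate of the DP concentration parameter:
    solve E[K_n] = k_obs for alpha by bisection, which works because
    expected_clusters is increasing in alpha. A simple stand-in for
    the maximum-likelihood characterization, not the paper's method.
    """
    if not 1 <= k_obs <= n:
        raise ValueError("need 1 <= k_obs <= n")
    while hi - lo > tol * (1 + hi):
        mid = (lo + hi) / 2
        if expected_clusters(mid, n) < k_obs:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2
```

For example, observing 10 clusters among 100 data points pins α near 2.8, since E[K_n] grows only like α·log(n/α).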
Measuring Expectations in Options Markets: An Application to the S&P500 Index
, 901
Abstract

Cited by 2 (2 self)
Extracting market expectations has always been an important issue when making national policies and investment decisions in financial markets. In option markets, the most popular way has been to extract implied volatilities to assess the future variability of the underlying with the use of the Black-Scholes formula. In this manuscript, we propose a novel way to extract the whole time-varying distribution of the market-implied asset price from option prices. We use a Bayesian nonparametric method that makes use of the Sethuraman representation for Dirichlet processes to take into account the evolution of distributions in time. As an illustration, we present the analysis of options on the S&P500 index.
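The Sethuraman representation mentioned above writes a DP draw as G = Σ_k w_k δ_{θ_k}, with atoms θ_k drawn i.i.d. from the base measure and weights built by stick-breaking: V_k ~ Beta(1, α), w_k = V_k ∏_{j<k}(1-V_j). A truncated sketch (the truncation handling is our convention, not the paper's):

```python
import numpy as np

def stick_breaking_weights(alpha, truncation, seed=None):
    """Sethuraman's stick-breaking construction of DP weights,
    truncated at `truncation` atoms: V_k ~ Beta(1, alpha),
    w_k = V_k * prod_{j<k} (1 - V_j). The last V is set to 1 so the
    truncated weights sum exactly to one.
    """
    rng = np.random.default_rng(seed)
    v = rng.beta(1.0, alpha, size=truncation)
    v[-1] = 1.0  # close the stick at the truncation level
    # remaining[k] = length of stick left before break k.
    remaining = np.concatenate([[1.0], np.cumprod(1.0 - v)[:-1]])
    return v * remaining
```

Pairing these weights with atoms sampled from a base measure gives a (truncated) draw of the random distribution itself, which is what allows the whole implied distribution, rather than a single volatility number, to be tracked over time.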
Bayesian Dynamic Modeling of Latent Trait Distributions
 Biostatistics
, 2006
Stick-Breaking Beta Processes and the Poisson Process
Abstract

Cited by 1 (1 self)
We show that the stick-breaking construction of the beta process due to Paisley et al. (2010) can be obtained from the characterization of the beta process as a Poisson process. Specifically, we show that the mean measure of the underlying Poisson process is equal to that of the beta process. We use this underlying representation to derive error bounds on truncated beta processes that are tighter than those in the literature. We also develop a new MCMC inference algorithm for beta processes, based in part on our new Poisson process construction.
On Simulation Methods for Two Component Normal Mixture Models under Bayesian Approach
, 2009
Abstract
The EM algorithm and the Gibbs sampler are two useful simulation methods for parameter estimation of finite normal mixture models under the Bayesian approach. The EM algorithm is an iterative maximum-likelihood estimation method for incomplete-data problems. The Gibbs sampler is an approach for generating random samples from a multivariate distribution. We introduce and derive Dempster's EM algorithm for the two-component normal mixture model to obtain the iterative estimates, and we also use data augmentation and the general Gibbs sampler to sample from the posterior distribution under a conjugate prior. The estimates from both simulation methods under the two-component normal mixture model with unknown mean parameters are compared, and the connections and differences between the two methods are presented. A data set from astronomy is used for the comparison.
Functional Clustering in Nested Designs
Abstract
Summary. We discuss functional clustering procedures for nested designs, where multiple curves are collected for each subject in the study. We start by considering the application of standard functional clustering tools to this problem, which leads to groupings based on the average profile for each subject. After discussing some of the shortcomings of this approach, we present a mixture model based on a generalization of the nested Dirichlet process that clusters subjects based on the distribution of their curves. By using mixtures of generalized Dirichlet processes, the model induces a much more flexible prior on the partition structure than other popular model-based clustering methods, allowing for different rates of introduction of new clusters as the number of observations increases. The methods are illustrated using hormone profiles from multiple menstrual cycles collected for women in the Early Pregnancy Study.