Results 1 – 5 of 5
Bayesian density regression
 Journal of the Royal Statistical Society, Series B
, 2007
Abstract

Cited by 40 (23 self)
This article considers Bayesian methods for density regression, allowing a random probability distribution to change flexibly with multiple predictors. The conditional response distribution is expressed as a nonparametric mixture of parametric densities, with the mixture distribution changing according to location in the predictor space. A new class of priors for dependent random measures is proposed for the collection of random mixing measures at each location. The conditional prior for the random measure at a given location is expressed as a mixture of a Dirichlet process (DP) distributed innovation measure and neighboring random measures. This specification results in a coherent prior for the joint measure, with the marginal random measure at each location being a finite mixture of DP basis measures. Integrating out the infinite-dimensional collection of mixing measures, we obtain a simple expression for the conditional distribution of the subject-specific random variables, which generalizes the Pólya urn scheme. Properties are considered and a simple Gibbs sampling algorithm is developed for posterior computation. The methods are illustrated using simulated data examples and epidemiologic studies.
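The Pólya urn representation that this article generalizes is simple to simulate. As a minimal sketch of the standard Blackwell–MacQueen urn for a DP(α, G0) prior (not the dependent-measure extension proposed in the paper), the following draws a sequence of values whose ties form clusters; the standard-normal base measure and α = 1 are illustrative choices:

```python
import random

def polya_urn_sample(n, alpha, base_draw, rng=random):
    """Blackwell-MacQueen urn for DP(alpha, G0): the i-th draw is a
    fresh value from G0 with probability alpha/(alpha + i), otherwise
    a uniformly chosen copy of an earlier draw (ties form clusters)."""
    draws = []
    for i in range(n):
        if rng.random() < alpha / (alpha + i):
            draws.append(base_draw())        # innovation from the base measure
        else:
            draws.append(rng.choice(draws))  # tie to a previous draw
    return draws

rng = random.Random(0)
theta = polya_urn_sample(100, alpha=1.0,
                         base_draw=lambda: rng.gauss(0.0, 1.0), rng=rng)
print(len(set(theta)))  # far fewer distinct values than draws
```

Because the innovation probability α/(α + i) shrinks as i grows, the number of distinct values grows only logarithmically in the number of draws, which is why DP mixtures induce clustering.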
A note on the Dirichlet process prior in Bayesian nonparametric inference with partial exchangeability
 Statistics &amp; Probability Letters
, 1997
Abstract

Cited by 6 (1 self)
We consider Bayesian nonparametric inference for continuous-valued partially exchangeable data, when the partition of the observations into groups is unknown. This includes change-point problems and mixture models. As the prior, we consider a mixture of products of Dirichlet processes. We show that the discreteness of the Dirichlet process can have a large effect on inference (posterior distributions and Bayes factors), leading to conclusions that can be different from those that result from a reasonable parametric model. When the observed data are all distinct, the effect of the prior on the posterior is to favor more evenly balanced partitions, and its effect on Bayes factors is to favor more groups. In a hierarchical model with a Dirichlet process as the second-stage prior, the prior can also have a large effect on inference, but in the opposite direction, towards more unbalanced partitions.
Nonparametric Bayesian Data Analysis
Abstract

Cited by 3 (0 self)
We review the current state of nonparametric Bayesian inference. The discussion follows a list of important statistical inference problems, including density estimation, regression, survival analysis, hierarchical models and model validation. For each inference problem we review relevant nonparametric Bayesian models and approaches including Dirichlet process (DP) models and variations, Pólya trees, wavelet-based models, neural network models, spline regression, CART, dependent DP models, and model validation with DP and Pólya tree extensions of parametric models.
BCP²: an environment to run Markov Chains for Bayesian Change Point Problems
Abstract
We study a Bayesian nonparametric model for change point problems and propose a Markov chain method to approximate the posterior distributions of interest. The program developed to run the Gibbs sampler is called BCP² (Bayesian Change Point Problem). Its user-friendly graphical interface enables the user to enter the values of some parameters of interest and immediately obtain the corresponding statistical inference. This allows one to perform sensitivity analysis to changes in the likelihood, to check robustness of the algorithm to misspecification of the prior, and to compare the proposed model to the more standard parametric approach. Three methods to obtain estimates of the distribution function before and after the change point are presented, implemented and compared. The first data set analyzed describes the annual volume of discharge from the Nile river at Aswan; the second one describes how the behavior of a young monkey is affected by the birth of a sibling. KEY WORDS: Gibb...
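The kind of Gibbs sampling BCP² automates can be illustrated on the simplest version of the problem. This is a hedged sketch under standard textbook assumptions (one change point in the mean of a Gaussian sequence, known noise standard deviation, N(0, τ²) priors on the two segment means, uniform prior on the change point), not the nonparametric model of the paper; all parameter values and the synthetic data are illustrative:

```python
import math
import random

def gibbs_changepoint(y, iters=2000, sigma=1.0, tau=10.0, seed=1):
    """Gibbs sampler for one change point k: y[t] ~ N(mu1, sigma^2) for
    t < k and N(mu2, sigma^2) for t >= k, with mu_j ~ N(0, tau^2) priors
    and k uniform on {1, ..., n-1}. Returns the sampled k's."""
    rng = random.Random(seed)
    n = len(y)
    k, mu1, mu2 = n // 2, 0.0, 0.0
    samples = []
    for _ in range(iters):
        # Conjugate normal updates for the two segment means.
        for left in (True, False):
            seg = y[:k] if left else y[k:]
            prec = len(seg) / sigma**2 + 1.0 / tau**2
            mean = (sum(seg) / sigma**2) / prec
            draw = rng.gauss(mean, prec ** -0.5)
            if left:
                mu1 = draw
            else:
                mu2 = draw
        # Discrete full conditional for k: log-likelihood of every split.
        logw = [-sum((v - mu1) ** 2 for v in y[:kk]) / (2 * sigma**2)
                - sum((v - mu2) ** 2 for v in y[kk:]) / (2 * sigma**2)
                for kk in range(1, n)]
        mx = max(logw)
        w = [math.exp(lw - mx) for lw in logw]
        u = rng.random() * sum(w)
        acc = 0.0
        for idx, wi in enumerate(w):
            acc += wi
            if u <= acc:
                k = idx + 1
                break
        samples.append(k)
    return samples

# Synthetic series: mean jumps from 0 to 3 at t = 30.
data_rng = random.Random(42)
y = [data_rng.gauss(0.0, 1.0) for _ in range(30)] + \
    [data_rng.gauss(3.0, 1.0) for _ in range(30)]
ks = gibbs_changepoint(y)
k_hat = max(set(ks[500:]), key=ks[500:].count)  # posterior mode after burn-in
```

Rerunning with different values of `sigma` or `tau` gives a crude version of the sensitivity analysis the abstract describes: the posterior on k is driven by the likelihood of each split, so prior misspecification on the means matters little when the jump is large.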
Sequential change-point detection for time series models: assessing the functional dynamics of neuronal networks
 CRiSM Paper No. 0707v2, www.warwick.ac.uk/go/crism
, 2007
Abstract

Sequential change-point detection for time series models: assessing the functional dynamics of neuronal networks.