Results 1–10 of 12
Dirichlet Prior Sieves in Finite Normal Mixtures
 Statistica Sinica
, 2002
Cited by 40 (1 self)
Abstract: The use of a finite dimensional Dirichlet prior in the finite normal mixture model has the effect of acting like a Bayesian method of sieves. Posterior consistency is directly related to the dimension of the sieve and the choice of the Dirichlet parameters in the prior. We find that naive use of the popular uniform Dirichlet prior leads to an inconsistent posterior. However, a simple adjustment to the parameters in the prior induces a random probability measure that approximates the Dirichlet process and yields a posterior that is strongly consistent for the density and weakly consistent for the unknown mixing distribution. The dimension of the resulting sieve can be selected easily in practice and a simple and efficient Gibbs sampler can be used to sample the posterior of the mixing distribution. Key words and phrases: Bose-Einstein distribution, Dirichlet process, identification, method of sieves, random probability measure, relative entropy, weak convergence.
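One standard way to realise the prior adjustment described in the abstract above is a symmetric Dirichlet(alpha/k, ..., alpha/k) prior on the weights of the k sieve components, which approximates a Dirichlet process with concentration alpha as k grows. A minimal sketch under that assumption (the function name `sieve_weights` and the parameter values are ours, not the paper's):

```python
import numpy as np

rng = np.random.default_rng(0)

def sieve_weights(alpha, k, rng):
    """Draw mixture weights from a symmetric Dirichlet(alpha/k, ..., alpha/k).

    Dividing the concentration alpha by the sieve dimension k is the kind
    of adjustment that lets the finite prior approximate a Dirichlet
    process as k grows; the paper's exact parameterisation may differ.
    """
    return rng.dirichlet(np.full(k, alpha / k))

w = sieve_weights(alpha=1.0, k=50, rng=rng)
```

With alpha/k small, most prior mass concentrates on a few components, mimicking the sparse, stick-breaking-like behaviour of Dirichlet process weights.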
Modelling spatially correlated data via mixtures: a Bayesian approach
 Journal of the Royal Statistical Society, Series B
, 2002
Cited by 31 (2 self)
This paper develops mixture models for spatially indexed data. We confine attention to the case of finite, typically irregular, patterns of points or regions with prescribed spatial relationships, and to problems where it is only the weights in the mixture that vary from one location to another. Our specific focus is on Poisson-distributed data, and applications in disease mapping. We work in a Bayesian framework, with the Poisson parameters drawn from gamma priors, and an unknown number of components. We propose two alternative models for spatially dependent weights, based on transformations of autoregressive Gaussian processes: in one (the logistic normal model), the mixture component labels are exchangeable; in the other (the grouped continuous model), they are ordered. Reversible jump Markov chain Monte Carlo algorithms for posterior inference are developed. Finally, the performance of both formulations is examined on synthetic data and on real data on mortality from a rare disease.
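The logistic normal construction mentioned in the abstract can be sketched as a softmax transform of correlated Gaussian variables at each location. A hedged illustration, not the paper's code (for brevity the Gaussian draws here are i.i.d. rather than autoregressive, and the names are ours):

```python
import numpy as np

rng = np.random.default_rng(1)

def logistic_normal_weights(z):
    """Map Gaussian variables at one location to mixture weights via the
    multinomial-logistic (softmax) transform used in logistic normal models."""
    e = np.exp(z - z.max())  # subtract the max for numerical stability
    return e / e.sum()

# Illustration: k - 1 Gaussian components plus a fixed zero for identifiability.
k = 4
z = np.append(rng.normal(size=k - 1), 0.0)
w = logistic_normal_weights(z)
```

In the spatial setting, the Gaussian vector at each location would come from an autoregressive process over neighbouring locations, inducing spatially smooth mixture weights.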
Approximate Dirichlet Process Computing in Finite Normal Mixtures: Smoothing and Prior Information
 Journal of Computational and Graphical Statistics
, 2000
Bayesian Model Selection in Finite Mixtures by Marginal Density Decompositions
 Journal of the American Statistical Association
, 2001
Penalized Maximum Likelihood Estimator for Normal Mixtures
, 2000
Cited by 11 (3 self)
The estimation of the parameters of a mixture of Gaussian densities is considered within the framework of maximum likelihood. Due to the unboundedness of the likelihood function, the maximum likelihood estimator fails to exist. We adopt a solution to likelihood function degeneracy which consists in penalizing the likelihood function. The resulting penalized likelihood function is then bounded over the parameter space, and the existence of the penalized maximum likelihood estimator is guaranteed. As an original contribution, we provide asymptotic properties, and in particular a consistency proof, for the penalized maximum likelihood estimator. Numerical examples are provided in the finite data case, showing the performance of the penalized estimator compared to the standard one.
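The degeneracy the abstract refers to arises because the likelihood of a normal mixture is unbounded as a component variance shrinks to zero around a data point; a variance penalty removes it. The inverse-variance penalty below is one common choice, used here purely for illustration and not necessarily the paper's:

```python
import numpy as np

def normal_pdf(x, mu, sigma):
    """Density of N(mu, sigma^2) evaluated at x."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

def penalized_loglik(x, weights, mus, sigmas, lam=1.0, s0=1.0):
    """Mixture log-likelihood plus a penalty that tends to -inf as any
    sigma -> 0, keeping the criterion bounded above. This inverse-variance
    form is illustrative; the paper's exact penalty may differ."""
    dens = sum(w * normal_pdf(x, m, s) for w, m, s in zip(weights, mus, sigmas))
    loglik = np.sum(np.log(dens))
    s = np.asarray(sigmas, dtype=float)
    penalty = -lam * np.sum(s0**2 / s**2 + np.log(s**2))
    return loglik + penalty
```

As a component standard deviation collapses toward zero, the unpenalized log-likelihood grows only like -log(sigma), while the penalty falls like -1/sigma^2, so the penalized criterion stays bounded.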
Bayesian Meta-Analysis for Longitudinal Data Models using Multivariate Mixture Priors
, 2002
Bayesian Latent Semantic Analysis of Multimedia Databases
, 2001
Cited by 5 (2 self)
We present a Bayesian mixture model for probabilistic latent semantic analysis of documents with images and text. The Bayesian perspective allows us to perform automatic regularisation to obtain sparser and more coherent clustering models. It also enables us to encode a priori knowledge, such as word and image preferences. The learnt model can be used for browsing digital databases, information retrieval with image and/or text queries, image annotation (adding words to an image) and text illustration (adding images to a text).
Bayesian Sampling for Mixtures of Factor Analysers
 Department of Statistics, University of Glasgow
, 2000
Cited by 2 (1 self)
We consider the Mixture of Factor Analysers model with both the number of components and the number of common factors known and fixed, and we develop an efficient Bayesian Sampling approach to the analysis of the model. We use Data Augmentation to handle the incomplete-data structure of the model, and, from appropriately specified conjugate priors, we construct full conditional conjugate posterior distributions that are all standard and easy to simulate, allowing us to derive an efficient framework combining both the Imputation-Posterior algorithm and the Gibbs Sampler. Taking full advantage of the generated MCMC samples and without any extra computation, we provide estimates and standard errors for both the parameters and the latent variables, and we perform a posterior predictive goodness-of-fit assessment on the derived model using replicates and realised discrepancies. Application of our approach to both artificial and real life data produces very good performance. Keywords: Latent Variable...
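The model sampled in the abstract above, a mixture of factor analysers, can be simulated directly. A sketch in generic notation (the name `mfa_sample` and the parameterisation below are illustrative, not taken from the report):

```python
import numpy as np

rng = np.random.default_rng(2)

def mfa_sample(n, weights, mus, lambdas, psis, rng):
    """Simulate a mixture of factor analysers: within component j,
    x = mu_j + Lambda_j f + e, with factors f ~ N(0, I_q) and noise
    e ~ N(0, diag(psi_j)). Notation is generic, not the report's."""
    k = len(weights)
    comps = rng.choice(k, size=n, p=np.asarray(weights))
    p, q = lambdas[0].shape
    x = np.empty((n, p))
    for i, j in enumerate(comps):
        f = rng.normal(size=q)                     # common factors
        e = rng.normal(size=p) * np.sqrt(psis[j])  # idiosyncratic noise
        x[i] = mus[j] + lambdas[j] @ f + e
    return x

# Two components in 3 dimensions with 2 common factors each.
x = mfa_sample(200,
               weights=[0.5, 0.5],
               mus=[np.zeros(3), 5.0 * np.ones(3)],
               lambdas=[np.ones((3, 2)), np.ones((3, 2))],
               psis=[np.ones(3), np.ones(3)],
               rng=rng)
```

In the report's setting, data augmentation treats the component labels and the factors f as missing data, which is what makes the full conditionals conjugate and easy to sample.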
Clustering Methods based on Variational Analysis in the Space of Measures
 Biometrika
, 2000
Cited by 1 (1 self)
We apply Ward's optimality criterion for a Poisson cluster centre process, conditioned to have fixed total mass. The corresponding variational problem for the intensity measure of the process is discussed both in its asymptotic form, yielding an analytic solution, and in the exact form that calls for numerical algorithms of steepest descent type. The results are illustrated on synthetic examples and on a dataset concerning the positions of redwood seedlings (Strauss, 1975). Primary: 60G55, 60D05; Secondary: 49M10, 49K27. Key words: cluster analysis, optimisation on measures, Poisson point process, steepest descent. Work carried out under project PNA4.3 `Stochastic geometry'. The research of the first two authors was funded by a British Council/NWO Research grant JRP544/BR62477.
Mixture Models in Econometric Duration Analysis
, 2002
Cited by 1 (0 self)
Econometric duration analysis has become an important part of econometric methodology, giving rise to a wealth of applications. The probability distribution of the duration of a time span is modeled through its conditional hazard rate given the covariates. When some of the covariates are unobservable, the duration, given the observable covariates, has a mixture distribution. The paper surveys and discusses...