Results 1–10 of 11
Markov Chain Monte Carlo methods and the label switching problem in Bayesian mixture modelling
 Statistical Science
Abstract

Cited by 51 (4 self)
Abstract. In the past ten years there has been a dramatic increase of interest in the Bayesian analysis of finite mixture models. This is primarily because of the emergence of Markov chain Monte Carlo (MCMC) methods. While MCMC provides a convenient way to draw inference from complicated statistical models, there are many, perhaps underappreciated, problems associated with the MCMC analysis of mixtures. The problems are mainly caused by the nonidentifiability of the components under symmetric priors, which leads to so-called label switching in the MCMC output. This means that ergodic averages of component-specific quantities will be identical and thus useless for inference. We review the solutions to the label switching problem, such as artificial identifiability constraints, relabelling algorithms and label-invariant loss functions. We also review various MCMC sampling schemes that have been suggested for mixture models and discuss posterior sensitivity to prior specification.
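To make the simplest of the reviewed remedies concrete, an artificial identifiability constraint can be applied as a post-processing step on the MCMC output: sort each draw's component means and permute the other component-specific draws accordingly. The NumPy sketch below is only illustrative (function and variable names are ours, not from the paper), and assumes univariate component means:

```python
import numpy as np

def relabel_by_ordering(mu_draws, w_draws):
    """Relabel MCMC draws of a K-component mixture by sorting each
    draw's component means into increasing order (an artificial
    identifiability constraint).  mu_draws, w_draws: (T, K) arrays of
    sampled component means and weights."""
    order = np.argsort(mu_draws, axis=1)           # per-draw permutation
    rows = np.arange(mu_draws.shape[0])[:, None]   # broadcast row index
    return mu_draws[rows, order], w_draws[rows, order]

# Toy chain exhibiting label switching: the two components' labels
# swap halfway through.  Before relabelling, both ergodic averages of
# the means would be 2.5 (useless); after relabelling they separate.
mu = np.array([[0.0, 5.0]] * 50 + [[5.0, 0.0]] * 50)
w = np.array([[0.3, 0.7]] * 50 + [[0.7, 0.3]] * 50)
mu_r, w_r = relabel_by_ordering(mu, w)
print(mu_r.mean(axis=0))   # component-specific means, ~ [0, 5]
print(w_r.mean(axis=0))    # component-specific weights, ~ [0.3, 0.7]
```

As the paper notes, such constraints are only the simplest fix; relabelling algorithms and label-invariant loss functions avoid imposing an arbitrary ordering.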
Bayesian finite mixtures with an unknown number of components: the allocation sampler
 University of Glasgow
, 2005
Abstract

Cited by 10 (1 self)
A new Markov chain Monte Carlo method for the Bayesian analysis of finite mixture distributions with an unknown number of components is presented. The sampler is characterized by a state space consisting only of the number of components and the latent allocation variables. Its main advantage is that it can be used, with minimal changes, for mixtures of components from any parametric family, under the assumption that the component parameters can be integrated out of the model analytically. Artificial and real data sets are used to illustrate the method and mixtures of univariate and of multivariate normals are explicitly considered. The problem of label switching, when parameter inference is of interest, is addressed in a postprocessing stage.
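As a rough illustration of sampling on the allocations with the component parameters integrated out, here is a collapsed Gibbs sweep for a fixed number of components K, using univariate normal components whose means are marginalized under a conjugate normal prior. This is a hedged sketch only: the paper's allocation sampler also moves K and uses Metropolis-Hastings moves, and the fixed-variance simplification and all names below are ours:

```python
import numpy as np

def gibbs_allocation_sweep(x, z, K, alpha=1.0, m0=0.0, tau2=10.0,
                           sigma2=1.0, rng=None):
    """One sweep of a collapsed Gibbs update of the allocation vector z
    for a K-component normal mixture.  Component means are integrated
    out analytically under a N(m0, tau2) prior; the component variance
    sigma2 is held fixed for simplicity.  A symmetric Dirichlet(alpha)
    prior on the weights yields the (n_k + alpha) prior term."""
    rng = np.random.default_rng() if rng is None else rng
    for i in range(len(x)):
        z[i] = -1                        # remove x[i] from its component
        logp = np.empty(K)
        for k in range(K):
            xk = x[z == k]
            nk, sk = len(xk), xk.sum()
            prec = 1.0 / tau2 + nk / sigma2          # posterior precision of mu_k
            mu_post = (m0 / tau2 + sk / sigma2) / prec
            var_pred = 1.0 / prec + sigma2           # predictive variance for x[i]
            logp[k] = (np.log(nk + alpha)
                       - 0.5 * np.log(2 * np.pi * var_pred)
                       - 0.5 * (x[i] - mu_post) ** 2 / var_pred)
        p = np.exp(logp - logp.max())
        z[i] = rng.choice(K, p=p / p.sum())
    return z

rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(-5, 1, 20), rng.normal(5, 1, 20)])
z = rng.integers(0, 2, size=len(x))
for _ in range(20):
    z = gibbs_allocation_sweep(x, z, K=2, rng=rng)
```

Because the means never appear in the state, the same sweep works for any component family with a tractable marginal likelihood, which is the advantage the abstract describes.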
Population-based reversible jump Markov chain Monte Carlo
, 2007
Abstract

Cited by 5 (2 self)
In this paper we present an extension of population-based Markov chain Monte Carlo (MCMC) to the trans-dimensional case. One of the main challenges in MCMC-based inference is that of simulating from high- and trans-dimensional target measures. In such cases, MCMC methods may not adequately traverse the support of the target, and the simulation results will be unreliable. We develop population methods to deal with such problems, and give a result proving the uniform ergodicity of these population algorithms under mild assumptions. This result is used to demonstrate the superiority, in terms of convergence rate, of a population transition kernel over a reversible jump sampler for a Bayesian variable selection problem. We also give an example of a population algorithm for a Bayesian multivariate mixture model with an unknown number of components. This is applied to gene expression data of 1000 data points in six dimensions, and it is demonstrated that our algorithm outperforms some competing Markov chain samplers.
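The core population idea, several chains at different temperatures with exchange moves between them, can be sketched in a fixed-dimension setting (the paper's trans-dimensional machinery is not reproduced here; this parallel-tempering-style sketch and all its names are ours). A flattened, "hot" chain crosses between modes easily, and exchange moves propagate those crossings to the chain targeting the actual posterior:

```python
import numpy as np

def logpi(x):
    """Log-density (up to a constant) of a bimodal 1-D target:
    an equal mixture of N(-4, 1) and N(4, 1)."""
    a = -0.5 * (x + 4.0) ** 2
    b = -0.5 * (x - 4.0) ** 2
    m = max(a, b)
    return m + np.log(np.exp(a - m) + np.exp(b - m))

def population_mcmc(logpi, temps, n_iter=2000, step=0.5, seed=0):
    """Population MCMC sketch: one random-walk Metropolis chain per
    inverse temperature in `temps` (temps[0] = 1 targets pi itself),
    plus an exchange move proposing to swap the states of a randomly
    chosen adjacent pair of chains."""
    rng = np.random.default_rng(seed)
    xs = np.full(len(temps), -4.0)        # start all chains in one mode
    out = np.empty(n_iter)
    for t in range(n_iter):
        for j, beta in enumerate(temps):  # within-chain RWM updates
            prop = xs[j] + step * rng.standard_normal()
            if np.log(rng.random()) < beta * (logpi(prop) - logpi(xs[j])):
                xs[j] = prop
        j = rng.integers(len(temps) - 1)  # exchange move between j, j+1
        dlog = (temps[j] - temps[j + 1]) * (logpi(xs[j + 1]) - logpi(xs[j]))
        if np.log(rng.random()) < dlog:
            xs[j], xs[j + 1] = xs[j + 1], xs[j]
        out[t] = xs[0]                    # record the beta = 1 chain
    return out

samples = population_mcmc(logpi, temps=[1.0, 0.5, 0.2, 0.05])
```

A single Metropolis chain with the same step size would rarely, if ever, cross between the modes at -4 and 4; with the population, the recorded chain visits both.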
Mixture Analysis of the Old Faithful Geyser Data Using the Package mixAK
, 2009
Abstract
This document supplements the paper Komárek (2009) and shows an analysis of the Old Faithful Geyser data introduced in Härdle (1991) using the R package mixAK. The data have been analysed using mixtures by several researchers, e.g., Stephens (2000) and Dellaportas and Papageorgiou (2006).
Mixture Analysis of the Old Faithful Geyser Data Using the Package mixAK
, 2010
Abstract
This document supplements the paper Komárek (2009) and shows an analysis of the Old Faithful Geyser data introduced in Härdle (1991) using the R package mixAK. The data have been analysed using mixtures by several researchers, e.g., Stephens (2000) and Dellaportas and Papageorgiou (2006).
Graph cutbased multiple active contours without initial contours and seed points
Abstract
This paper presents a new graph cut-based multiple active contour algorithm to detect optimal boundaries and regions in images without initial contours. The task of multiple active contours is framed as a partitioning problem by assuming that image data are generated from a finite mixture model with an unknown number of components. The partitioning problem is then solved within a divisive graph cut framework in which multiway minimum cuts for multiple contours are efficiently computed in a top-down way through a swap move of binary labels. A split move is integrated into that framework to estimate the model parameters associated with regions without the use of initial contours and seed points. The number of regions is also estimated as part of the algorithm. Experimental results of boundary and region detection on natural images are presented to demonstrate the effectiveness of the proposed algorithm.
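The building block underlying such swap moves is a binary s-t minimum cut: unary terms tie each pixel to source or sink, pairwise terms encourage neighbours to agree, and the min-cut yields the optimal binary labelling. The self-contained sketch below shows only this binary primitive on a toy 1-D "image" using Edmonds-Karp max-flow (the paper's multiway swap and split moves, and all names here, are not from the paper):

```python
from collections import deque

def max_flow(cap, s, t):
    """Edmonds-Karp max-flow; cap is a dict-of-dicts of residual
    capacities, modified in place.  Returns the max-flow value."""
    flow = 0.0
    while True:
        parent = {s: None}
        q = deque([s])
        while q and t not in parent:         # BFS for a shortest augmenting path
            u = q.popleft()
            for v, c in cap[u].items():
                if c > 0 and v not in parent:
                    parent[v] = u
                    q.append(v)
        if t not in parent:
            return flow
        v, bottleneck = t, float("inf")      # bottleneck capacity on the path
        while parent[v] is not None:
            bottleneck = min(bottleneck, cap[parent[v]][v])
            v = parent[v]
        v = t                                # push flow, update residuals
        while parent[v] is not None:
            u = parent[v]
            cap[u][v] -= bottleneck
            cap[v][u] = cap[v].get(u, 0.0) + bottleneck
            v = u
        flow += bottleneck

# Toy 1-D "image": brightness values; dark pixels should get label 0.
pixels = [0.1, 0.2, 0.9, 0.8]
lam = 0.3                          # smoothness weight between neighbours
cap = {"s": {}, "t": {}}
for i, p in enumerate(pixels):
    cap[i] = {}
    cap["s"][i] = 1.0 - p          # cost of assigning pixel i label 1
    cap[i]["t"] = p                # cost of assigning pixel i label 0
for i in range(len(pixels) - 1):   # pairwise smoothness edges
    cap[i][i + 1] = lam
    cap[i + 1][i] = lam

flow = max_flow(cap, "s", "t")
# Pixels still reachable from "s" in the residual graph take label 0.
reach, q = {"s"}, deque(["s"])
while q:
    u = q.popleft()
    for v, c in cap[u].items():
        if c > 0 and v not in reach:
            reach.add(v)
            q.append(v)
labels = [0 if i in reach else 1 for i in range(len(pixels))]
```

Here the cut separates the dark pair from the bright pair, paying one smoothness edge; practical implementations use specialized max-flow solvers rather than Edmonds-Karp.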
Robot
Abstract
Abstract. In this paper we address the problem of estimating the parameters of a Gaussian mixture model. Although the EM (Expectation-Maximization) algorithm yields the maximum-likelihood solution, it requires a careful initialization of the parameters, and the optimal number of kernels in the mixture may be unknown beforehand. We propose a criterion based on the entropy of the pdf (probability density function) associated with each kernel to measure the quality of a given mixture model. Two different methods for estimating Shannon entropy are proposed, and a modification of the classical EM algorithm to find the optimal number of kernels in the mixture is presented. We test our algorithm on probability density estimation, pattern recognition and color image segmentation. 1 Introduction and problem formulation. Gaussian mixture models have been widely used for density estimation, pattern recognition and function approximation. One of the most common methods for fitting mixtures to data is the EM algorithm [6]. However, this algorithm is prone ...
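To illustrate the setting (not the paper's entropy-estimation methods), the sketch below fits a univariate Gaussian mixture by standard EM and reports the closed-form differential entropy of each fitted Gaussian kernel; comparing such a theoretical entropy against a nonparametric estimate from the data assigned to the kernel is the kind of quality check the abstract describes. All names and the quantile initialization are ours:

```python
import numpy as np

def em_gmm_1d(x, K, n_iter=200):
    """Minimal EM for a univariate K-component Gaussian mixture,
    initialized at the data quantiles.  Returns weights, means,
    variances and the closed-form differential entropy of each
    fitted Gaussian kernel, 0.5 * log(2*pi*e*var)."""
    mu = np.quantile(x, (np.arange(K) + 0.5) / K)   # spread-out init
    var = np.full(K, x.var())
    w = np.full(K, 1.0 / K)
    for _ in range(n_iter):
        # E-step: responsibilities r[i, k], computed in log-space
        logr = (np.log(w) - 0.5 * np.log(2 * np.pi * var)
                - 0.5 * (x[:, None] - mu) ** 2 / var)
        logr -= logr.max(axis=1, keepdims=True)
        r = np.exp(logr)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: weighted moment updates (variance floored for stability)
        nk = r.sum(axis=0)
        w = nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = np.maximum((r * (x[:, None] - mu) ** 2).sum(axis=0) / nk, 1e-9)
    ent = 0.5 * np.log(2 * np.pi * np.e * var)
    return w, mu, var, ent

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0.0, 1.0, 250), rng.normal(10.0, 1.0, 250)])
w, mu, var, ent = em_gmm_1d(x, K=2)
```

On these two well-separated components EM recovers means near 0 and 10, weights near 0.5, and per-kernel entropies near 0.5·log(2πe) ≈ 1.42, the entropy of a unit-variance Gaussian.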