Results 1 – 5 of 5
Estimating the integrated likelihood via posterior simulation using the harmonic mean identity
Bayesian Statistics, 2007
"... The integrated likelihood (also called the marginal likelihood or the normalizing constant) is a central quantity in Bayesian model selection and model averaging. It is defined as the integral over the parameter space of the likelihood times the prior density. The Bayes factor for model comparison a ..."
Abstract

Cited by 24 (2 self)
The integrated likelihood (also called the marginal likelihood or the normalizing constant) is a central quantity in Bayesian model selection and model averaging. It is defined as the integral over the parameter space of the likelihood times the prior density. The Bayes factor for model comparison and Bayesian testing is a ratio of integrated likelihoods, and the model weights in Bayesian model averaging are proportional to the integrated likelihoods. We consider the estimation of the integrated likelihood from posterior simulation output, aiming at a generic method that uses only the likelihoods from the posterior simulation iterations. The key is the harmonic mean identity, which says that the reciprocal of the integrated likelihood is equal to the posterior harmonic mean of the likelihood. The simplest estimator based on the identity is thus the harmonic mean of the likelihoods. While this is an unbiased and simulation-consistent estimator, its reciprocal can have infinite variance and so it is unstable in general. We describe two methods for stabilizing the harmonic mean estimator. In the first one, the parameter space is reduced in such a way that the modified estimator involves a harmonic mean of heavier-tailed densities, thus resulting in a finite-variance estimator. The resulting ...
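The identity can be checked numerically in a conjugate normal model where the integrated likelihood is available in closed form. The sketch below is illustrative only, with invented parameter values; it draws exact posterior samples in place of MCMC output, and the prior is deliberately chosen tighter than the likelihood so that the plain harmonic mean estimator happens to have finite variance (in general, as the abstract notes, it does not).

```python
import numpy as np

rng = np.random.default_rng(0)

# Conjugate normal model (hypothetical values): theta ~ N(0, tau2),
# y | theta ~ N(theta, sigma2), so the integrated likelihood is known exactly.
sigma2, tau2 = 1.0, 0.5
y = 1.5  # a single observed data point

def norm_pdf(x, mean, var):
    return np.exp(-0.5 * (x - mean) ** 2 / var) / np.sqrt(2 * np.pi * var)

# Exact integrated likelihood: marginally, y ~ N(0, sigma2 + tau2).
true_marginal = norm_pdf(y, 0.0, sigma2 + tau2)

# Exact posterior theta | y ~ N(post_mean, post_var); we draw from it
# directly in place of posterior simulation output.
post_var = 1.0 / (1.0 / tau2 + 1.0 / sigma2)
post_mean = post_var * y / sigma2
theta = rng.normal(post_mean, np.sqrt(post_var), size=200_000)

# Harmonic mean identity: 1 / p(y) = E_posterior[ 1 / p(y | theta) ],
# so p(y) is estimated by the harmonic mean of the likelihood values.
lik = norm_pdf(y, theta, sigma2)
hm_estimate = 1.0 / np.mean(1.0 / lik)

print(true_marginal, hm_estimate)  # the two should agree closely
```

With a heavier-tailed likelihood relative to the posterior, the same code exhibits the instability the paper addresses: occasional tiny likelihood values dominate the mean of `1 / lik`.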
Model-Based Clustering With Dissimilarities: A Bayesian Approach
"... A Bayesian modelbased clustering method is proposed for clustering objects on the basis of dissimilarites. This combines two basic ideas. The first is that the objects have latent positions in a Euclidean space, and that the observed dissimilarities are measurements of the Euclidean distances with ..."
Abstract

Cited by 4 (2 self)
A Bayesian model-based clustering method is proposed for clustering objects on the basis of dissimilarities. This combines two basic ideas. The first is that the objects have latent positions in a Euclidean space, and that the observed dissimilarities are measurements of the Euclidean distances with error. The second idea is that the latent positions are generated from a mixture of multivariate normal distributions, each one corresponding to a cluster. We estimate the resulting model in a Bayesian way using Markov chain Monte Carlo. The method carries out multidimensional scaling and model-based clustering simultaneously, and yields good object configurations and good clustering results with reasonable measures of clustering uncertainties. In the examples we study, the clustering results based on low-dimensional configurations were almost as good as those based on high-dimensional ones. Thus, the method can be used as a tool for dimension reduction when clustering high-dimensional objects, which may be useful especially for visual inspection of clusters. We also propose a Bayesian criterion for choosing the dimension of the object configuration and the number of clusters simultaneously. This is easy to compute and works reasonably well in simulations and real examples.
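The generative model described in the abstract — latent positions from a mixture of normals, dissimilarities equal to Euclidean distances measured with error — can be sketched as a forward simulation. All numbers below (cluster centers, noise scales, sizes) are invented for illustration; the paper's MCMC estimation step is not shown.

```python
import numpy as np

rng = np.random.default_rng(1)

# Latent positions from a 2-component mixture of bivariate normals
# (hypothetical centers and spread), one component per cluster.
n, d = 20, 2
centers = np.array([[0.0, 0.0], [4.0, 4.0]])
labels = rng.integers(0, 2, size=n)
X = centers[labels] + rng.normal(0.0, 0.5, size=(n, d))

# Pairwise Euclidean distances between the latent positions.
diff = X[:, None, :] - X[None, :, :]
delta = np.sqrt((diff ** 2).sum(-1))

# Observed dissimilarities: distances plus measurement error,
# truncated at zero so they stay nonnegative, then symmetrized.
D = np.maximum(delta + rng.normal(0.0, 0.1, size=(n, n)), 0.0)
D = (D + D.T) / 2
np.fill_diagonal(D, 0.0)
```

Given only `D`, the method in the paper recovers both a configuration like `X` (multidimensional scaling) and cluster assignments like `labels` (model-based clustering) in a single Bayesian fit.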
Determining the Number of Colors or Gray Levels in an Image Using Approximate Bayes Factors: The Pseudolikelihood Information Criterion (PLIC)
IEEE Transactions on Pattern Analysis and Machine Intelligence 24, 2001
"... We propose a method for choosing the number of colors, or true gray levels, in an image. This is motivated by medical and satellite image segmentation, and may also be useful for color and gray scale image quantization, the display and storage of computergenerated holograms, and the use of cooccurr ..."
Abstract

Cited by 1 (0 self)
We propose a method for choosing the number of colors, or true gray levels, in an image. This is motivated by medical and satellite image segmentation, and may also be useful for color and gray scale image quantization, the display and storage of computer-generated holograms, and the use of co-occurrence matrices for assessing texture in images. Our underlying probability model is a hidden Markov random field. Each number of colors considered is viewed as corresponding to a statistical model for the image, and the resulting models are compared via approximate Bayes factors. The Bayes factors are approximated using BIC, where the required maximized likelihood is approximated by the Qian-Titterington pseudolikelihood. We call the resulting criterion PLIC (Pseudolikelihood Information Criterion). We also discuss a simpler approximation, MMIC (Marginal Mixture Information Criterion), which is based only on the marginal distribution of pixel values. This turns out to be useful for initialization, and also to have moderately good, albeit suboptimal, performance in its own right. We apply PLIC to three examples: a simulated two-band image, a medical segmentation problem, and a satellite image, and in each case it gives good results in practice. Keywords: BIC; Color image quantization; Co-occurrence matrix; Hologram; ICM algorithm; Image segmentation; Markov random field; Medical image; Mixture model; Posterior model probability; Pseudolikelihood; Satellite image.
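The BIC-style comparison underlying PLIC and MMIC can be illustrated on the marginal distribution of pixel values (the MMIC idea): fit mixture models with different numbers of components and penalize the maximized log-likelihood by the number of free parameters. The sketch below uses invented synthetic data and a plain EM fit with an ordinary mixture likelihood; the full PLIC criterion would instead use the pseudolikelihood of the hidden Markov random field.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic "pixel values" (hypothetical): a mixture of two gray levels.
y = np.concatenate([rng.normal(60, 5, 500), rng.normal(180, 8, 500)])
n = y.size

def fit_gmm_1d(y, K, iters=200):
    """Plain EM for a 1-D Gaussian mixture; returns the maximized log-likelihood."""
    mu = np.quantile(y, (np.arange(K) + 0.5) / K)   # spread initial means out
    var = np.full(K, y.var())
    w = np.full(K, 1.0 / K)

    def comp_dens(mu, var, w):
        # n x K matrix of weighted component densities
        return w * np.exp(-0.5 * (y[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)

    for _ in range(iters):
        dens = comp_dens(mu, var, w)
        r = dens / dens.sum(1, keepdims=True)       # E-step: responsibilities
        nk = r.sum(0)                               # M-step: weighted updates
        w = nk / y.size
        mu = (r * y[:, None]).sum(0) / nk
        var = np.maximum((r * (y[:, None] - mu) ** 2).sum(0) / nk, 1e-6)
    return np.log(comp_dens(mu, var, w).sum(1)).sum()

# BIC-style criterion: 2 * loglik - (free parameters) * log n.
def bic(loglik, K, n):
    nu = 3 * K - 1                                  # K means, K variances, K-1 weights
    return 2 * loglik - nu * np.log(n)

scores = {K: bic(fit_gmm_1d(y, K), K, n) for K in (1, 2, 3)}
best_K = max(scores, key=scores.get)
print(best_K)  # expected to select 2 for this two-level image
```

The chosen `K` is the number of colors or gray levels; with the pseudolikelihood in place of the mixture likelihood, the same maximize-then-penalize comparison yields PLIC.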
Model-based clustering with dissimilarities: A Bayesian approach
, 2003
"... www.stat.washington.edu/raftery. Oh’s research was supported by research funds from KOSEF ..."
Abstract

Cited by 1 (1 self)
www.stat.washington.edu/raftery. Oh’s research was supported by research funds from KOSEF