Results 1 – 3 of 3
Bayesian Model Selection in Finite Mixtures by Marginal Density Decompositions
JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION, 2001
Estimating Components in Finite Mixtures and Hidden Markov Models
2003
Abstract

Cited by 3 (1 self)
When the unobservable Markov chain in a hidden Markov model is stationary, the marginal distribution of the observations is a finite mixture with the number of terms equal to the number of states of the Markov chain. This suggests estimating the number of states of the unobservable Markov chain by determining the number of mixture components in the marginal distribution. We therefore present new methods for estimating the number of states in a hidden Markov model, and coincidentally the unknown number of components in a finite mixture, based on penalized quasilikelihood and generalized quasilikelihood ratio methods constructed from the marginal distribution. The procedures advocated are simple to calculate, and results obtained in empirical applications indicate that they are as effective as currently available methods based on the full likelihood. We show that, under fairly general regularity conditions, the proposed methods generate strongly consistent estimates of the unknown number of states or components.
Some key words: finite mixture, hidden Markov process, model selection, number of states, penalized quasilikelihood, generalized quasilikelihood ratio, strong consistency.
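The abstract's core idea — fit the marginal distribution with mixtures of increasing order and pick the order maximising a penalized fit criterion — can be sketched as follows. This is not the authors' code: it assumes univariate Gaussian components, uses the ordinary mixture log-likelihood in place of the paper's quasilikelihood, and substitutes a BIC-type penalty for the paper's penalty term.

```python
import numpy as np

rng = np.random.default_rng(0)

def em_gmm_1d(x, k, n_iter=200):
    """Fit a k-component univariate Gaussian mixture by EM and return
    the maximised log-likelihood of the marginal sample."""
    n = x.size
    mu = np.quantile(x, np.linspace(0.1, 0.9, k))   # crude quantile initialisation
    sigma = np.full(k, x.std())
    w = np.full(k, 1.0 / k)
    for _ in range(n_iter):
        # E-step: component densities and responsibilities
        dens = w * np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2) \
               / (sigma * np.sqrt(2.0 * np.pi))
        r = dens / dens.sum(axis=1, keepdims=True)
        # M-step: reweighted moment updates
        nk = r.sum(axis=0)
        w, mu = nk / n, (r * x[:, None]).sum(axis=0) / nk
        sigma = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / nk)
        sigma = np.maximum(sigma, 1e-2)             # avoid degenerate components
    dens = w * np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2) \
           / (sigma * np.sqrt(2.0 * np.pi))
    return np.log(dens.sum(axis=1)).sum()

def select_order(x, k_max=5):
    """Choose k maximising the penalised likelihood; a BIC-type penalty
    (0.5 * #params * log n) stands in for the paper's penalty term."""
    n = x.size
    scores = [em_gmm_1d(x, k) - 0.5 * (3 * k - 1) * np.log(n)
              for k in range(1, k_max + 1)]
    return int(np.argmax(scores)) + 1

# Marginal sample as it would arise from a stationary two-state HMM
# with Gaussian emissions centred at -2 and +2
x = np.concatenate([rng.normal(-2.0, 1.0, 500), rng.normal(2.0, 1.0, 500)])
print(select_order(x))
```

With well-separated components the penalised criterion recovers the two underlying states; the hidden-state structure itself never enters the calculation, which is what makes the marginal-distribution approach cheap relative to full-likelihood methods.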
MICE – Multiple-peak Identification, Characterization and Estimation
2005
Abstract
– is a general procedure for estimating a lower bound for the number of components, and for estimating their parameters, in an additive regression model. The method consists of a series of steps: a preliminary step separating the signal from the background, identification of local maxima up to a noise-level-dependent threshold, estimation of the component parameters using an iterative algorithm, and detection of mixtures of components within one local maximum using hypothesis testing. The leading example is a nuclear magnetic resonance (NMR) experiment for protein structure determination. After applying a Fourier transform to the NMR signals, the NMR frequency data are multiple-peak data, where each peak corresponds to one component in the additive regression model. In this example, the primary objective is accurate estimation of the location parameters.
Key words and phrases: mixture regression model, tensor-product wavelet decomposition, noise-level-dependent threshold, backfitting, mixture detection, nuclear magnetic resonance, protein structure determination.
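The identification step — local maxima above a noise-level-dependent threshold — can be sketched as below. This is an illustrative stand-in, not the MICE implementation: the noise level is estimated from the median absolute deviation of first differences rather than the paper's tensor-product wavelet decomposition, and the `min_dist` merging rule is a simple assumed heuristic for reporting each noisy peak once.

```python
import numpy as np

rng = np.random.default_rng(1)

def find_peaks(y, k_sigma=5.0, min_dist=30):
    """Indices of local maxima exceeding a noise-level-dependent threshold."""
    # Robust noise estimate: MAD of first differences, scaled to a
    # Gaussian standard deviation (diff of white noise has sd sigma*sqrt(2))
    noise = np.median(np.abs(np.diff(y))) / (0.6745 * np.sqrt(2.0))
    thresh = k_sigma * noise
    # Interior points that beat both neighbours and the threshold
    interior = (y[1:-1] > y[:-2]) & (y[1:-1] > y[2:]) & (y[1:-1] > thresh)
    cand = np.flatnonzero(interior) + 1
    # Merge candidates closer than min_dist, keeping the highest,
    # so one noisy peak is reported only once
    peaks = []
    for i in cand:
        if peaks and i - peaks[-1] < min_dist:
            if y[i] > y[peaks[-1]]:
                peaks[-1] = i
        else:
            peaks.append(i)
    return np.array(peaks, dtype=int)

# Synthetic "spectrum": two narrow Gaussian peaks plus white noise,
# mimicking Fourier-transformed NMR frequency data
t = np.linspace(0.0, 10.0, 1000)
signal = np.exp(-0.5 * ((t - 3.0) / 0.1) ** 2) \
         + 0.7 * np.exp(-0.5 * ((t - 7.0) / 0.1) ** 2)
y = signal + rng.normal(0.0, 0.02, t.size)
peaks = find_peaks(y)
print(t[peaks])
```

In the full procedure these detected maxima only seed the later steps: component parameters are then refined iteratively, and a hypothesis test decides whether one maximum in fact hides a mixture of overlapping components.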