Results 1–7 of 7
Noise benefits in the expectation–maximization algorithm: NEM theorems and models
in The Int. Joint Conf. on Neural Networks (IJCNN) (IEEE)
"... ar ..."
(Show Context)
The noisy expectation–maximization algorithm (in review), 2012
"... We present a noiseinjected version of the expectation–maximization (EM) algorithm: the noisy expectation–maximization (NEM) algorithm. The NEM algorithm uses noise to speed up the convergence of the EM algorithm. The NEM theorem shows that additive noise speeds up the average convergence of the EM ..."
Abstract

Cited by 2 (2 self)
We present a noise-injected version of the expectation–maximization (EM) algorithm: the noisy expectation–maximization (NEM) algorithm. The NEM algorithm uses noise to speed up the convergence of the EM algorithm. The NEM theorem shows that additive noise speeds up the average convergence of the EM algorithm to a local maximum of the likelihood surface if a positivity condition holds. Corollary results give special cases when noise improves the EM algorithm. We demonstrate these noise benefits on EM algorithms for three data models: the Gaussian mixture model (GMM), the Cauchy mixture model (CMM), and the censored log-convex gamma model. The NEM positivity condition simplifies to a quadratic inequality in the GMM and CMM cases. A final theorem shows that the noise benefit for independent identically distributed additive noise decreases with sample size in mixture models. This theorem implies that the noise benefit is most pronounced if the data is sparse.
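The quadratic NEM condition mentioned in this abstract can be sketched in a few lines. This is a minimal illustration under stated assumptions, not the authors' implementation: it assumes a two-component GMM with known, equal variances and known weights, uses the quadratic screening test n(n − 2(μj − y)) ≤ 0 for every component j as the NEM positivity condition, and cools the injected noise variance as the iteration count grows.

```python
import math
import random


def gmm_nem_means(data, mus, sigma=1.0, weights=(0.5, 0.5),
                  iters=30, noise_scale=0.3, decay=2.0, seed=0):
    """NEM sketch: EM mean estimation for a 2-component GMM with
    screened additive noise in the E-step.  The screening test
    n*(n - 2*(mu_j - y)) <= 0 for every component j is an assumed
    form of the quadratic NEM positivity condition."""
    rng = random.Random(seed)
    mus = list(mus)
    for k in range(1, iters + 1):
        scale = noise_scale * k ** (-decay)      # cooled noise level
        num = [0.0, 0.0]
        den = [0.0, 0.0]
        for y in data:
            n = rng.gauss(0.0, scale)
            # keep the noise sample only if it passes the NEM condition
            if not all(n * (n - 2.0 * (m - y)) <= 0.0 for m in mus):
                n = 0.0                          # fall back to plain EM
            z = y + n
            # E-step: responsibilities under the current means
            p = [w * math.exp(-(z - m) ** 2 / (2 * sigma ** 2))
                 for w, m in zip(weights, mus)]
            s = sum(p)
            r = [pi / s for pi in p]
            # M-step accumulators for the mean updates
            for j in range(2):
                num[j] += r[j] * z
                den[j] += r[j]
        mus = [num[j] / den[j] for j in range(2)]
    return mus
```

With data drawn from an equal mixture of N(−2, 1) and N(2, 1), the estimated means converge near ±2; most noise samples are screened out for points between the component means, which is consistent with the intuition that the noise benefit acts mainly on tail samples.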
Gaussian Mixture Noise Improving Nonlinear Multiple Signal Detection*
"... Abstract Based on a nonlinear signal detector, stochastic resonance (SR) of noiseimproved multiple signal detection is discussed for Gaussian mixture noise with different parameters. Signal transits from suprathreshold to subthreshold when the threshold of the nonlinear signal detector is raised ..."
Abstract
Based on a nonlinear signal detector, stochastic resonance (SR) in noise-improved multiple signal detection is discussed for Gaussian mixture noise with different parameters. The signal transits from suprathreshold to subthreshold as the threshold of the nonlinear signal detector is raised, and the noise transits from unimodal to bimodal as the parameter of the Gaussian mixture noise increases; the occurrence of SR is studied in detail under both transitions. The evolution of SR is a gradual process that is not determined solely by whether the signal is suprathreshold or subthreshold, or by whether the noise is unimodal or bimodal. Computer simulations show that, for different fixed noise parameters and detector thresholds, SR appears under a variety of circumstances and the detection efficacy improves rapidly as the sample number increases. These results show that the occurrence of SR depends strongly on the parameter of the Gaussian mixture noise, the threshold of the nonlinear multiple signal detector, and the sample number, and on their complex synergy. Index Terms: multiple signal detection, probability of detection error, Gaussian mixture noise, stochastic resonance.
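The threshold-crossing picture behind this abstract can be illustrated with a short Monte-Carlo sketch. Everything here is an illustrative assumption (a single threshold detector, a symmetric two-Gaussian mixture noise, equal priors for signal present/absent), not the paper's detector: for a subthreshold signal, the error probability at a moderate noise level drops below its value at a near-zero noise level, which is the SR effect the abstract describes.

```python
import random


def error_prob(amplitude, threshold, noise_sigma, mix_mean,
               trials=20000, seed=0):
    """Monte-Carlo sketch of stochastic resonance in a single
    threshold detector.  Noise is a symmetric two-component
    Gaussian mixture 0.5*N(+mix_mean, sigma^2) + 0.5*N(-mix_mean,
    sigma^2), which is bimodal when mix_mean exceeds sigma.
    Returns the average of miss and false-alarm probabilities."""
    rng = random.Random(seed)
    miss = fa = 0
    for _ in range(trials):
        # signal-present trial: missed if the sum stays below threshold
        sign = 1.0 if rng.random() < 0.5 else -1.0
        n = rng.gauss(sign * mix_mean, noise_sigma)
        if amplitude + n <= threshold:
            miss += 1
        # signal-absent trial: false alarm if noise alone crosses
        sign = 1.0 if rng.random() < 0.5 else -1.0
        n = rng.gauss(sign * mix_mean, noise_sigma)
        if n > threshold:
            fa += 1
    return 0.5 * (miss + fa) / trials
```

With amplitude 0.5 and threshold 1.0, near-zero noise gives an error probability of exactly 0.5 (the subthreshold signal is always missed, and there are no false alarms), while a moderate unimodal noise level lowers it; sweeping `mix_mean` would show the unimodal-to-bimodal transition discussed in the abstract.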
Weak signal detection: Condition for noise-induced enhancement, in Digital Signal Processing, 2013
"... For the detection of a weak known signal in additive white noise, a generalized correlation detector is considered. In the case of a large number of measurements, an asymptotic efficacy is analytically computed as a general measure of detection performance. The derivative of the efficacy with respe ..."
Abstract
For the detection of a weak known signal in additive white noise, a generalized correlation detector is considered. In the case of a large number of measurements, an asymptotic efficacy is analytically computed as a general measure of detection performance. The derivative of the efficacy with respect to the noise level is also analytically computed. Positivity of this derivative is the condition for enhancement of the detection performance by increasing the level of noise. The behavior of this derivative is analyzed in various important situations, especially showing when noise-enhanced detection is feasible and when it is not.
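The positivity condition on the efficacy derivative can be checked numerically for one concrete case. As an illustrative assumption (not the paper's general detector), take the sign correlator g(x) = sgn(x): its asymptotic efficacy for a weak known signal is proportional to (2 f(0))², where f is the noise density. For a bimodal Gaussian-mixture noise, the density at the origin, and hence the efficacy, first rises as the noise level grows, so the derivative is positive there and noise-enhanced detection is feasible; at large noise levels the efficacy falls again.

```python
import math


def mixture_pdf0(mode, sigma):
    """Density at the origin of the symmetric two-Gaussian mixture
    0.5*N(+mode, sigma^2) + 0.5*N(-mode, sigma^2); by symmetry the
    two half-weights collapse into a single Gaussian term."""
    return math.exp(-mode ** 2 / (2 * sigma ** 2)) / (
        sigma * math.sqrt(2 * math.pi))


def sign_detector_efficacy(mode, sigma):
    """Asymptotic efficacy of the sign correlator g(x) = sgn(x) for
    a weak known signal: (2 f(0))^2 / E[g^2], with E[g^2] = 1."""
    return (2.0 * mixture_pdf0(mode, sigma)) ** 2
```

For modes at ±1, raising sigma from 0.3 to 0.6 increases the efficacy (the noise-enhanced regime), while at sigma = 3 it has fallen back below its value near sigma = 1, matching the "feasible, then not" behavior the abstract analyzes.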
The Noisy Expectation–Maximization Algorithm, to appear in Fluctuation and Noise Letters
"... We present a noiseinjected version of the ExpectationMaximization (EM) algorithm: the Noisy Expectation Maximization (NEM) algorithm. The NEM algorithm uses noise to speed up the convergence of the EM algorithm. The NEM theorem shows that additive noise speeds up the average convergence of the EM ..."
Abstract
We present a noise-injected version of the Expectation–Maximization (EM) algorithm: the Noisy Expectation–Maximization (NEM) algorithm. The NEM algorithm uses noise to speed up the convergence of the EM algorithm. The NEM theorem shows that additive noise speeds up the average convergence of the EM algorithm to a local maximum of the likelihood surface if a positivity condition holds. Corollary results give special cases when noise improves the EM algorithm. We demonstrate these noise benefits on EM algorithms for three data models: the Gaussian mixture model (GMM), the Cauchy mixture model (CMM), and the censored log-convex gamma model. The NEM positivity condition simplifies to a quadratic inequality in the GMM and CMM cases. A final theorem shows that the noise benefit for independent identically distributed additive noise decreases with sample size in mixture models. This theorem implies that the noise benefit is most pronounced if the data is sparse.
Neural Networks
"... journal homepage: www.elsevier.com/locate/neunet ..."