## Training Mixture Density HMMs with SOM and LVQ (1997)

Citations: 4 (2 self)

### BibTeX

```bibtex
@MISC{Kurimo97trainingmixture,
  author = {Mikko Kurimo},
  title  = {Training Mixture Density HMMs with SOM and LVQ},
  year   = {1997}
}
```

### Abstract

The objective of this paper is to present experiments and discussion of how certain neural network algorithms can help phoneme recognition with mixture density hidden Markov models (MDHMMs). In MDHMMs, the stochastic observation process associated with each state is modeled by estimating the probability density function of the short-time observations in that state as a mixture of Gaussian densities. Learning Vector Quantization (LVQ) is used to increase the discrimination between different phoneme models, both during the initialization of the Gaussian codebooks and during the actual MDHMM training. The Self-Organizing Map (SOM) is applied to provide a suitably smoothed mapping of the training vectors, which accelerates the convergence of the actual training. The obtained codebook topology can also be exploited in the recognition phase to speed up the approximation of the observation probabilities. The experiments with LVQ and SOMs show reductions both...
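The two core ingredients the abstract mentions can be illustrated with a short sketch: a mixture-of-Gaussians observation density for one HMM state, and a single LVQ1 codebook update. This is not the paper's implementation; the function names, the diagonal-covariance simplification, and the learning rate are illustrative assumptions.

```python
import math


def gaussian_diag(o, mean, var):
    """Diagonal-covariance Gaussian density N(o; mean, diag(var))."""
    log_p = -0.5 * len(o) * math.log(2 * math.pi)
    for x, m, v in zip(o, mean, var):
        log_p -= 0.5 * (math.log(v) + (x - m) ** 2 / v)
    return math.exp(log_p)


def mixture_density(o, weights, means, variances):
    """Observation probability b_j(o) for one MDHMM state:
    a weighted sum of Gaussian component densities."""
    return sum(w * gaussian_diag(o, m, v)
               for w, m, v in zip(weights, means, variances))


def lvq1_step(codebook, labels, o, o_label, lr=0.05):
    """One basic LVQ1 update (illustrative learning rate): move the
    nearest codebook vector toward o if its class label matches,
    away from o otherwise. Returns the index of the updated vector."""
    i = min(range(len(codebook)),
            key=lambda k: sum((a - b) ** 2
                              for a, b in zip(codebook[k], o)))
    sign = 1.0 if labels[i] == o_label else -1.0
    codebook[i] = [m + sign * lr * (x - m)
                   for m, x in zip(codebook[i], o)]
    return i
```

In a full system, `lvq1_step` would be run over labeled training frames to sharpen class boundaries between phoneme codebooks, after which the codebook vectors serve as the Gaussian means used by `mixture_density`.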