Results 1-4 of 4
SOM based density function approximation for mixture density HMMs
In Workshop on Self-Organizing Maps, 1997
Abstract

Cited by 1 (1 self)
This paper explains how some properties of the Self-Organizing Maps (SOMs) can be exploited in the density models used in continuous-density hidden Markov models (HMMs). The three main ideas are the suitable initialization of the centroids for the Gaussian mixtures, the smoothing of the HMM parameters, and the use of topology for fast density approximations. The methods are tested here in the automatic speech recognition framework, where the task is to decode the phonetic transcription of spoken words by speaker-dependent, but vocabulary-independent phoneme models. The results show that the average number of final recognition errors is over 15% smaller if the traditional K-means-based initialization is substituted by SOM. The method described for fast SOM density approximation reduces the total recognition time by over 40% for the current online system compared to the default, which uses independent complete searches for the best-matching units. 1 About the application The auto...
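The SOM-based centroid initialization mentioned in this abstract can be sketched as follows. This is a minimal illustration, not the paper's actual procedure: the grid size, learning-rate and neighbourhood schedules, and the function name `som_init_centroids` are all illustrative assumptions. The idea is that the trained SOM codebook vectors serve as initial centroids for the Gaussian mixtures in place of K-means cluster centres.

```python
import numpy as np

def som_init_centroids(data, grid_w=4, grid_h=4, epochs=10, lr0=0.5, seed=0):
    """Train a small 2-D SOM on feature vectors and return its codebook,
    usable as initial centroids for a Gaussian mixture.
    Grid size and schedules are illustrative choices."""
    rng = np.random.default_rng(seed)
    n, d = data.shape
    # grid coordinates of each unit, used for the topological neighbourhood
    coords = np.array([(i, j) for i in range(grid_h) for j in range(grid_w)], float)
    # initialize the codebook from random training samples
    codebook = data[rng.choice(n, grid_w * grid_h, replace=False)].astype(float)
    sigma0 = max(grid_w, grid_h) / 2.0
    steps = epochs * n
    t = 0
    for _ in range(epochs):
        for x in data[rng.permutation(n)]:
            frac = t / steps
            lr = lr0 * (1.0 - frac)              # decaying learning rate
            sigma = sigma0 * (1.0 - frac) + 0.5  # shrinking neighbourhood
            # best-matching unit for this sample
            bmu = np.argmin(((codebook - x) ** 2).sum(axis=1))
            # Gaussian neighbourhood on the grid pulls nearby units toward x
            g = np.exp(-((coords - coords[bmu]) ** 2).sum(axis=1) / (2 * sigma ** 2))
            codebook += lr * g[:, None] * (x - codebook)
            t += 1
    return codebook
```

One codebook vector per SOM unit is returned, so a 4x4 grid yields 16 candidate mixture centroids; the grid topology is also what enables the fast neighbourhood-restricted density search the abstract refers to.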
Comparison Results for Segmental Training Algorithms for Mixture Density HMMs
Abstract
This work presents experiments on four segmental training algorithms for mixture density HMMs. The segmental versions of SOM and LVQ3 suggested by the author are compared against the conventional segmental K-means and the segmental GPD. The recognition task used as a test bench is speaker-dependent, but vocabulary-independent automatic speech recognition. The output density function of each state in each model is a mixture of multivariate Gaussian densities. The neural network methods SOM and LVQ are applied to learn the parameters of the density models from the mel-cepstrum features of the training samples. Segmental training improves the segmentation and the model parameters in turns to obtain the best possible result, because the segmentation and the segment classification depend on each other. It suffices to start the training process by dividing the training samples approximately into phoneme samples. 1. INTRODUCTION The recognition task used as a test bench for the trainin...
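The alternation the abstract describes, improving the segmentation and the model parameters in turns, is the core of segmental K-means training. A toy sketch under strong simplifying assumptions (one mean per state instead of a full Gaussian mixture, squared-Euclidean frame costs, a single left-to-right utterance; the name `segmental_kmeans` is illustrative):

```python
import numpy as np

def segmental_kmeans(frames, n_states, iters=10):
    """Alternate a Viterbi-style left-to-right alignment with mean
    re-estimation. Real mixture-density HMM training would re-estimate
    full mixtures and transition probabilities instead of plain means."""
    T, d = frames.shape
    # start from an even split of the utterance into n_states segments
    bounds = np.linspace(0, T, n_states + 1).astype(int)
    means = np.array([frames[bounds[s]:bounds[s + 1]].mean(axis=0)
                      for s in range(n_states)])
    for _ in range(iters):
        # local cost of assigning each frame to each state
        cost = ((frames[:, None, :] - means[None, :, :]) ** 2).sum(axis=2)
        # DP over monotonic left-to-right alignments (stay or advance one state)
        acc = np.full((T, n_states), np.inf)
        back = np.zeros((T, n_states), dtype=int)
        acc[0, 0] = cost[0, 0]
        for t in range(1, T):
            for s in range(n_states):
                stay = acc[t - 1, s]
                move = acc[t - 1, s - 1] if s > 0 else np.inf
                if stay <= move:
                    acc[t, s], back[t, s] = stay + cost[t, s], s
                else:
                    acc[t, s], back[t, s] = move + cost[t, s], s - 1
        # backtrack the best segmentation
        path = np.empty(T, dtype=int)
        path[-1] = n_states - 1
        for t in range(T - 1, 0, -1):
            path[t - 1] = back[t, path[t]]
        # re-estimate each state's mean from its assigned frames
        for s in range(n_states):
            sel = frames[path == s]
            if len(sel):
                means[s] = sel.mean(axis=0)
    return means, path
```

This mirrors the dependency the abstract notes: the alignment determines which frames train which state, and the re-estimated states in turn change the best alignment, so the two are refined alternately from a rough initial split.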
A Survey of Discriminative and Connectionist Methods for Speech Processing, 2002
Abstract
Discriminative speech processing techniques attempt to compute the maximum a posteriori probability of some speech event, such as a particular phoneme being spoken, given the observed data. Non-discriminative techniques compute the likelihood of the observed data assuming an event. Non-discriminative methods such as simple HMMs (hidden Markov models) have achieved success despite their lack of discriminative modelling. This survey looks at enhancements to the HMM model which have improved its discrimination ability and hence its overall performance. It also reviews alternative discriminative methods, namely connectionist methods such as ANNs (artificial neural networks). We also draw comparisons between discriminative HMMs and connectionist models, showing that connectionist models can be viewed as a generalisation of discriminative HMMs.
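The distinction drawn in this abstract, posterior p(event | data) versus likelihood p(data | event), reduces to Bayes' rule when class priors are available. A minimal sketch (the function name `posteriors` is illustrative) showing that the two criteria can pick different classes when the priors are unequal:

```python
import numpy as np

def posteriors(log_likelihoods, log_priors):
    """Turn per-class log-likelihoods log p(x|c) into posteriors p(c|x)
    via Bayes' rule. A discriminative decision compares these posteriors;
    a non-discriminative one compares the likelihoods alone."""
    log_joint = log_likelihoods + log_priors   # log p(x|c) + log p(c)
    log_joint -= log_joint.max()               # for numerical stability
    p = np.exp(log_joint)
    return p / p.sum()                         # normalize over classes

# likelihoods favour class 0, but a strong prior on class 1 flips the posterior
ll = np.log(np.array([0.6, 0.4]))   # p(x|c)
lp = np.log(np.array([0.2, 0.8]))   # p(c)
p = posteriors(ll, lp)
```

Here the likelihood argmax is class 0 while the posterior argmax is class 1, which is the sense in which discriminative criteria optimise the decision boundary directly rather than the data model alone.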