Results 1–3 of 3
An Inequality for Rational Functions with Applications to Some Statistical Estimation Problems
 IEEE Trans. on Information Theory
, 1991
Abstract

Cited by 105 (4 self)
Abstract: The well-known Baum–Eagon inequality [3] provides an effective iterative scheme for finding a local maximum for homogeneous polynomials with positive coefficients over a domain of probability values. However, in many applications we are interested in maximizing a general rational function. We extend the Baum–Eagon inequality to rational functions. We briefly describe some of the applications of this inequality to statistical estimation problems. Index Terms: Nonlinear optimization, statistical estimation, hidden Markov models, speech recognition.
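The iterative scheme the abstract refers to can be sketched as follows. This is a minimal, hedged illustration of the classical Baum–Eagon growth transformation for a polynomial with nonnegative coefficients over the probability simplex, not the paper's rational-function extension; the example polynomial and function names are assumptions for illustration.

```python
import numpy as np

def baum_eagon_step(x, grad_P):
    """One growth-transformation step over the probability simplex.

    Updates x_i <- x_i * dP/dx_i / sum_j(x_j * dP/dx_j); for a polynomial
    P with nonnegative coefficients this never decreases P.
    """
    w = x * grad_P(x)       # elementwise x_i * partial derivative
    return w / w.sum()      # renormalize back onto the simplex

# Example polynomial: P(x) = x0^2 * x1 on the 2-simplex.
# Its constrained maximum is at x = (2/3, 1/3).
def grad(x):
    return np.array([2 * x[0] * x[1], x[0] ** 2])

x = np.array([0.5, 0.5])
for _ in range(200):
    x = baum_eagon_step(x, grad)
# x converges to approximately (2/3, 1/3)
```

For this particular polynomial the iteration lands on the maximizer immediately, since the ratio of the two weighted gradient components is constant; in general the scheme converges monotonically to a local maximum.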
Speech Recognition And The Frequency Of Recently Used Words: A Modified Markov Model For Natural Language
, 1988
Abstract

Cited by 26 (0 self)
Speech recognition systems incorporate a language model which, at each stage of the recognition task, assigns a probability of occurrence to each word in the vocabulary. A class of Markov language models identified by Jelinek has achieved considerable success in this domain. A modification of the Markov approach, which assigns higher probabilities to recently used words, is proposed and tested against a pure Markov model. Parameter calculation and comparison of the two models both involve use of the LOB Corpus of tagged modern English.
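The "recently used words" modification can be sketched as a cache interpolation: the static Markov probability is mixed with the relative frequency of the word in a window of recent words. This is a hedged sketch of the general idea only; the interpolation weight `lam`, the cache size, and the function names are assumptions, not the paper's exact formulation.

```python
from collections import deque

def cache_prob(word, p_static, cache, lam=0.9):
    """P(word) = lam * static model + (1 - lam) * recency-cache frequency."""
    p_cache = cache.count(word) / len(cache) if cache else 0.0
    return lam * p_static(word) + (1 - lam) * p_cache

# Usage: a fixed-size cache holding the 100 most recent words.
cache = deque(maxlen=100)
for w in ["the", "model", "the"]:
    cache.append(w)

# Assumed static unigram probability of 0.05 for "the":
p = cache_prob("the", lambda w: 0.05, cache)
# p = 0.9 * 0.05 + 0.1 * (2/3), so the recent repetitions of "the"
# raise its probability above the static estimate.
```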
Language Modeling With Stochastic Automata
, 1996
Abstract

Cited by 3 (0 self)
It is well known that language models are effective for increasing accuracy of speech and handwriting recognizers, but large language models are often required to achieve low model perplexity (or entropy) and still have adequate language coverage. We study three efficient methods for stochastic language modeling in the context of the stochastic pattern recognition problem and give results of a comparative performance analysis. In addition we show that a method which combines two of these language modeling techniques yields even better performance than the best of the single techniques tested.
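The perplexity measure the abstract uses to quantify the size/coverage trade-off can be computed directly from per-word log-probabilities. A minimal sketch, with the uniform-model example as an assumed illustration (a uniform model over V words has perplexity exactly V):

```python
import math

def perplexity(log_probs):
    """Per-word perplexity: exp of the negative mean natural-log probability."""
    return math.exp(-sum(log_probs) / len(log_probs))

# Sanity check: a uniform model over a 1000-word vocabulary,
# evaluated on any 50-word text, has perplexity 1000.
V = 1000
lp = [math.log(1.0 / V)] * 50
pp = perplexity(lp)
```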