Results 1 – 3 of 3
Speech Recognition by Composition of Weighted Finite Automata
Finite-State Language Processing, 1996
Abstract

Cited by 124 (12 self)
We present a general framework based on weighted finite automata and weighted finite-state transducers for describing and implementing speech recognizers. The framework allows us to represent uniformly the information sources and data structures used in recognition, including context-dependent units, pronunciation dictionaries, language models and lattices. Furthermore, general but efficient algorithms can be used for combining information sources in actual recognizers and for optimizing their application. In particular, a single composition algorithm is used both to combine in advance information sources such as language models and dictionaries, and to combine acoustic observations and information sources dynamically during recognition.
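The abstract's central operation, composition, can be illustrated for the simple case of two weighted acceptors over the tropical semiring (weights are costs added along a path, with alternatives resolved by minimum). This is a minimal sketch, not the paper's implementation; the data layout and names are assumptions made for the example.

```python
def compose(a, b):
    """Intersect two weighted acceptors in the tropical semiring.

    Each acceptor is a triple (start, finals, arcs), where arcs maps
    (state, symbol) -> list of (next_state, weight).  The result pairs
    states of `a` and `b`, matching arcs on shared symbols and adding
    their weights, as in transducer composition restricted to acceptors.
    """
    start = (a[0], b[0])
    finals = set()
    arcs = {}
    stack, seen = [start], {start}
    while stack:
        p, q = stack.pop()
        if p in a[1] and q in b[1]:
            finals.add((p, q))
        for (s1, sym), outs1 in a[2].items():
            if s1 != p or (q, sym) not in b[2]:
                continue  # no matching arc in b on this symbol
            for n1, w1 in outs1:
                for n2, w2 in b[2][(q, sym)]:
                    # Combined arc: paired states, summed costs.
                    arcs.setdefault(((p, q), sym), []).append(((n1, n2), w1 + w2))
                    if (n1, n2) not in seen:
                        seen.add((n1, n2))
                        stack.append((n1, n2))
    return start, finals, arcs
```

For example, composing a one-arc acceptor of cost 1.0 with a one-arc acceptor of cost 2.0 on the same symbol yields a single arc of cost 3.0, reflecting the additive combination of scores that, in a recognizer, merges (say) pronunciation-dictionary and language-model costs on a shared label.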
On the Determinization of Weighted Finite Automata
SIAM J. Comput., 1998
Abstract

Cited by 16 (0 self)
We study determinization of weighted finite-state automata (WFAs), which has important applications in automatic speech recognition (ASR). We provide the first polynomial-time algorithm to test for the twins property, which determines whether a WFA admits a deterministic equivalent. We also provide a rigorous analysis of a determinization algorithm of Mohri, with tight bounds for acyclic WFAs. Given that WFAs can expand exponentially when determinized, we explore why those used in ASR tend to shrink. The folklore explanation is that ASR WFAs have an acyclic, multipartite structure. We show, however, that there exist such WFAs that always incur exponential expansion when determinized. We then introduce a class of WFAs, also with this structure, whose expansion depends on the weights: some weightings cause them to shrink, while others, including random weightings, cause them to expand exponentially. We provide experimental evidence that ASR WFAs exhibit this weight dependence. ...
On the Determinization of Weighted Finite Automata
To appear in SIAM Journal on Computing, © SIAM 2000
Abstract
We study the problem of constructing the deterministic equivalent of a nondeterministic weighted finite-state automaton (WFA). Determinization of WFAs has important applications in automatic speech recognition (ASR). We provide the first polynomial-time algorithm to test for the twins property, which determines whether a WFA admits a deterministic equivalent. We also give upper bounds on the size of the deterministic equivalent; the bound is tight in the case of acyclic WFAs. Previously, Mohri presented a superpolynomial-time algorithm to test for the twins property, and he also gave an algorithm to determinize WFAs. He showed that the latter runs in time linear in the size of the output when a deterministic equivalent exists; otherwise, it does not terminate. Our bounds imply an upper bound on the running time of this algorithm. Given that WFAs can expand exponentially in size when determinized, we explore why those that occur in ASR tend to shrink when determinized. According to ASR folklore, this phenomenon is attributable solely to the fact that ASR WFAs have simple topology, in particular, that they are acyclic and layered. We introduce a very simple class of WFAs with this structure, but we show that the expansion under determinization depends on the transition weights: some weightings cause them to shrink, while others, including random weightings, cause them to expand exponentially. We provide experimental evidence that ASR WFAs exhibit this weight dependence. That they shrink when determinized, therefore, is a result of favorable weightings in addition to special topology. These analyses and observations have been used to design a new, approximate WFA determinization algorithm, reported in a separate paper along with experimental results showing that it achieves significant WFA size reduction with negligible impact on ASR performance.
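The determinization discussed in these abstracts can be sketched for weighted acceptors over the tropical semiring, in the spirit of Mohri's algorithm: determinized states are sets of (state, residual-weight) pairs, the minimum cost is emitted on each arc, and residuals carry the remainder forward. This is an illustrative sketch under those assumptions, not the paper's algorithm verbatim; as the abstracts note, such a procedure terminates only when the input has the twins property.

```python
from collections import defaultdict

def determinize(start, finals, arcs):
    """Weighted-subset determinization in the tropical semiring.

    arcs maps (state, symbol) -> list of (next_state, weight).
    Returns (init, det_finals, det_arcs), where det_arcs maps
    (det_state, symbol) -> (next_det_state, emitted_weight) and each
    det_state is a frozenset of (state, residual_weight) pairs.
    """
    init = frozenset({(start, 0.0)})
    det_arcs = {}
    det_finals = set()
    queue, seen = [init], {init}
    while queue:
        S = queue.pop()
        if any(p in finals for p, _ in S):
            det_finals.add(S)
        # Collect reachable (target, cost) pairs per symbol.
        by_sym = defaultdict(list)
        for p, r in S:
            for (s, sym), outs in arcs.items():
                if s != p:
                    continue
                for q, w in outs:
                    by_sym[sym].append((q, r + w))
        for sym, pairs in by_sym.items():
            w_out = min(c for _, c in pairs)  # weight emitted on the arc
            residual = {}
            for q, c in pairs:
                # Leftover cost carried into the new subset state.
                residual[q] = min(residual.get(q, float("inf")), c - w_out)
            T = frozenset(residual.items())
            det_arcs[(S, sym)] = (T, w_out)
            if T not in seen:
                seen.add(T)
                queue.append(T)
    return init, det_finals, det_arcs
```

On a small nondeterministic WFA whose two 'a'-arcs cost 1.0 and 3.0 and then rejoin on 'b', the algorithm emits 1.0 on 'a', keeps a residual of 2.0 on the costlier branch, and produces a single deterministic path; whether such subsets shrink or blow up in general is exactly the weight dependence the abstracts analyze.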