## Arc Minimization in Finite State Decoding Graphs with Cross-Word Acoustic Context (2002)

Venue: Proc. ICSLP’02

Citations: 6 (2 self)

### BibTeX

@INPROCEEDINGS{Zweig02arcminimization,
  author    = {G. Zweig and G. Saon and F. Yvon},
  title     = {Arc Minimization in Finite State Decoding Graphs with Cross-Word Acoustic Context},
  booktitle = {Proc. ICSLP’02},
  year      = {2002}
}


### Abstract

Recent approaches to large vocabulary decoding with finite state graphs have focused on the use of state minimization algorithms to produce relatively compact graphs. This paper extends the finite state approach by developing complementary arc-minimization techniques. The use of these techniques in concert with state minimization allows us to statically compile decoding graphs in which the acoustic models utilize a full word of cross-word context. This is in significant contrast to typical systems, which use only a single phone. We show that the particular arc-minimization problem that arises is in fact an NP-complete combinatorial optimization problem, and describe the reduction from 3-SAT. We present experimental results that illustrate the moderate sizes and runtimes of graphs for the Switchboard task.
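The arc savings the abstract alludes to can be illustrated with a toy count (our own illustration, not the paper's algorithm): when m word-end states each connect to all of n word-start states, the complete bipartite connection needs m·n direct arcs, but routing through a single auxiliary state needs only m + n.

```python
# Toy illustration (not the paper's algorithm) of why factoring a complete
# bipartite connection (a biclique) through an auxiliary state reduces arcs:
# m*n direct arcs collapse to m+n arcs via one intermediate node.
def arc_counts(m: int, n: int) -> tuple[int, int]:
    direct = m * n       # one arc per (left state, right state) pair
    factored = m + n     # m arcs into the auxiliary state, n arcs out of it
    return direct, factored

# A 100x100 biclique: 10000 direct arcs versus 200 factored arcs.
direct, factored = arc_counts(100, 100)
```

Choosing which bicliques to factor so as to minimize the total arc count is the combinatorial problem the paper shows to be NP-complete.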

### Citations

1262 | Error bounds for convolutional codes and an asymptotically optimal decoding algorithm
- Viterbi
- 1967
Citation Context ...ompact grammar is appropriate, it is common to pre-compile a static state-graph. Given such a graph, a simple and efficient implementation of the Viterbi algorithm can be used for subsequent decoding [1]. For large vocabulary tasks with n-gram language models, however, it has traditionally been common to avoid a static search space, and to instead dynamically expand the language model as needed [2, 3,...
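Decoding over a precompiled state graph, as in the context above, can be sketched with a minimal Viterbi pass. The graph encoding (a dict of labeled arcs) and the toy emission scorer below are our own illustrative assumptions, not the paper's data structures.

```python
# Minimal Viterbi decoding over a static state graph: keep the single best
# log-score and path per state while consuming the observation sequence.
import math

def viterbi(graph, start, finals, emit, observations):
    """graph: state -> list of (next_state, label) arcs
    emit:  (label, observation) -> log-probability
    Returns (best log-score, best path) over final states, or None."""
    best = {start: (0.0, [start])}        # state -> (score, path so far)
    for obs in observations:
        nxt = {}
        for state, (score, path) in best.items():
            for succ, label in graph.get(state, []):
                cand = score + emit(label, obs)
                if succ not in nxt or cand > nxt[succ][0]:
                    nxt[succ] = (cand, path + [succ])   # keep best hypothesis
        best = nxt
    return max((v for s, v in best.items() if s in finals),
               key=lambda v: v[0], default=None)

# Toy 4-state graph with two paths: 0->1->3 (labels 'a','a') and 0->2->3 ('b','b').
graph = {0: [(1, "a"), (2, "b")], 1: [(3, "a")], 2: [(3, "b")]}
emit = lambda label, obs: 0.0 if label == obs else math.log(0.1)
score, path = viterbi(graph, 0, {3}, emit, ["a", "a"])   # best path is [0, 1, 3]
```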

700 | Estimation of probabilities from sparse data for the language model component of a speech recognizer
- Katz
- 1987
Citation Context ...aph structure induced by an n-gram language model with backoff smoothing. At first, we do not consider any cross-word acoustic context. When LM probabilities are smoothed according to a back-off scheme [8], this automaton can be efficiently factored [9] as follows. Each history h appearing in the LM corresponds to a history state in the FSM; each word w such that (h, w) occurs in the LM corresponds to a transitio...
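The back-off factoring described above can be sketched as follows: each history gets a state, each seen n-gram gets a word arc to its successor history, and an epsilon back-off arc links each history to its suffix history. The table formats and toy probabilities are illustrative assumptions; real toolkits also work in the log domain.

```python
# Sketch of back-off LM factoring into an FSM: word arcs for seen n-grams,
# epsilon (None-labeled) back-off arcs from each history to its suffix history.
def factor_backoff_lm(ngrams, backoff):
    """ngrams:  {(history, word): prob}, history is a tuple of words
    backoff: {history: backoff_weight}, keys are the known histories
    Returns arcs as (src_history, dst_history, label, weight) tuples."""
    arcs = []
    for (hist, word), prob in ngrams.items():
        # Successor state: longest suffix of hist+(word,) that is a known history.
        succ = hist + (word,)
        while succ not in backoff and len(succ) > 0:
            succ = succ[1:]
        arcs.append((hist, succ, word, prob))
    for hist, weight in backoff.items():
        if len(hist) > 0:  # epsilon back-off arc to the shorter history
            arcs.append((hist, hist[1:], None, weight))
    return arcs

# Toy LM: unigram state () and bigram history ('a',).
ngrams = {((), "a"): 0.5, (("a",), "b"): 0.4}
backoff = {(): 1.0, ("a",): 0.6}
arcs = factor_backoff_lm(ngrams, backoff)
```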

155 | Weighted finite-state transducers in speech recognition
- Mohri, Pereira, et al.
- 2002
Citation Context ...ast several years, algorithmic and computational advances have made it possible to handle large vocabulary recognition in essentially the same way as grammar-based tasks. In a recent series of papers [5, 6], it has been shown that it is in fact possible to statically compile a state graph that encodes the constraints of both a state-of-the-art language model, and cross-word acoustic context. One of the ...

130 | The use of context in large vocabulary speech recognition
- Odell
- 1995
Citation Context ...ng [1]. For large vocabulary tasks with n-gram language models, however, it has traditionally been common to avoid a static search space, and to instead dynamically expand the language model as needed [2, 3, 4]. While the latter approach has the advantage of never touching potentially large portions of the search space, it has the important disadvantage that dynamic expansion is significantly more complex, ...

64 | The design of a linguistic statistical decoder for the recognition of continuous speech
- Jelinek, Bahl, et al.
- 1975
Citation Context ...ng [1]. For large vocabulary tasks with n-gram language models, however, it has traditionally been common to avoid a static search space, and to instead dynamically expand the language model as needed [2, 3, 4]. While the latter approach has the advantage of never touching potentially large portions of the search space, it has the important disadvantage that dynamic expansion is significantly more complex, ...

53 | Dynamic programming search for continuous speech recognition
- Ney, Ortmanns
- 1999
Citation Context ...ng [1]. For large vocabulary tasks with n-gram language models, however, it has traditionally been common to avoid a static search space, and to instead dynamically expand the language model as needed [2, 3, 4]. While the latter approach has the advantage of never touching potentially large portions of the search space, it has the important disadvantage that dynamic expansion is significantly more complex, ...

46 | Stochastic Automata for Language Modeling
- Riccardi, Pieraccini, et al.
- 1996
Citation Context ...with backoff smoothing. At first, we do not consider any cross-word acoustic context. When LM probabilities are smoothed according to a back-off scheme [8], this automaton can be efficiently factored [9] as follows. Each history h appearing in the LM corresponds to a history state in the FSM; each word w such that (h, w) occurs in the LM corresponds to a transition weighted with P(w|h), labeled with w, between h and hw, w...

41 | Full expansion of context dependent networks in large vocabulary speech recognition
- Mohri, Riley, et al.
- 1998
Citation Context ...ast several years, algorithmic and computational advances have made it possible to handle large vocabulary recognition in essentially the same way as grammar-based tasks. In a recent series of papers [5, 6], it has been shown that it is in fact possible to statically compile a state graph that encodes the constraints of both a state-of-the-art language model, and cross-word acoustic context. One of the ...

11 | Clique partitions, graph compression and speeding-up algorithms
- Feder, Motwani
- 1995
Citation Context ...for state minimization, it is reasonable to ask if similar procedures exist for arc minimization. However, through a reduction from the known NP-complete optimization problem of Clique Bipartitioning [7], we demonstrate that in fact the problem we are faced with is NP-complete. In this respect it is similar to state minimization, which in the worst case requires exponential time in the determinizati...

1 | The NP-completeness of some edge partition problems, SIAM J. Comput.
- Holyer
- 1981
Citation Context ...m a result of [7], which proves that finding the minimum order partition into bicliques for any given graph is NP-hard. This proof relies on a variation of a reduction of 3-SAT originally proposed in [10], and proceeds along the following lines. They first exhibit a polynomial transformation of any formula φ into a graph G(φ) such that G(φ) only contains 'simple' bicliques: the only bicliques incl...