Results 1–10 of 12
Homogeneity vs. adjacency: generalising some graph decomposition algorithms
 In 32nd International Workshop on Graph-Theoretic Concepts in Computer Science (WG), volume 4271 of LNCS, 2006
Abstract

Cited by 6 (2 self)
In this paper, a new general decomposition theory inspired by modular graph decomposition is presented. Our main result shows that, within this general theory, most of the algorithmic tools developed for modular decomposition remain efficient. This theory not only unifies the usual generalisations of modular decomposition, such as modular decomposition of directed graphs and of 2-structures, but also covers decomposition by star cutsets.
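For readers new to the setting, a vertex set M of a graph is a module when every vertex outside M is adjacent either to all of M or to none of it. A minimal illustrative check (the graph encoding and function name are assumptions, not taken from the paper):

```python
# Sketch: test whether a vertex set M is a module of an undirected graph.
# M is a module if no outside vertex distinguishes the vertices of M,
# i.e. each outside vertex sees either all of M or none of M.
# Graph representation (illustrative): vertex -> set of neighbours.

def is_module(graph, M):
    M = set(M)
    for v in graph:
        if v in M:
            continue
        seen = graph[v] & M          # members of M adjacent to v
        if seen and seen != M:       # v sees some of M but not all of it
            return False
    return True

# Example: path a - b - c - d. {b, c} is not a module (a sees b but not c);
# singletons and the full vertex set are trivially modules.
g = {"a": {"b"}, "b": {"a", "c"}, "c": {"b", "d"}, "d": {"c"}}
```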
Finitely Subsequential Transducers
 2003
Abstract

Cited by 5 (2 self)
Finitely subsequential transducers are efficient finite-state transducers with a finite number of final outputs and are used in a variety of applications. Not all transducers admit equivalent finitely subsequential transducers, however. We briefly describe an existing generalized determinization algorithm for finitely subsequential transducers and give the first characterization of finitely subsequentiable transducers, that is, transducers that admit equivalent finitely subsequential transducers. Our characterization shows the existence of an efficient algorithm for testing finite subsequentiability. We have fully implemented the generalized determinization algorithm and the algorithm for testing finite subsequentiability. We report
General algorithms for testing the ambiguity of finite automata
 In Developments in Language Theory, 12th International Conference, volume 5257 of Lecture Notes in Computer Science, 2008
Abstract

Cited by 5 (3 self)
This paper presents efficient algorithms for testing the finite, polynomial, and exponential ambiguity of finite automata with ε-transitions. It gives an algorithm for testing the exponential ambiguity of an automaton A in time O(|A|_E²), and finite or polynomial ambiguity in time O(|A|_E³), where |A|_E denotes the number of transitions of A. These complexities significantly improve over the previous best complexities given for the same problem. Furthermore, the algorithms presented are simple and based on a general algorithm for the composition or intersection of automata. We also give an algorithm to determine in time O(|A|_E³) the degree of polynomial ambiguity of a polynomially ambiguous automaton A. Finally, we present an application of our algorithms to an approximate computation of the entropy of a probabilistic automaton.
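The exponential-ambiguity test rests on a classical criterion: an automaton is exponentially ambiguous iff some strongly connected component of its self-intersection contains both a diagonal state (p, p) and a non-diagonal state (q, r) with q ≠ r. A rough Python sketch of that criterion (assuming a trim, ε-free automaton with no parallel same-label transitions; the encoding is illustrative, not the paper's):

```python
# Sketch: exponential-ambiguity test via the self-intersection criterion.
# transitions: set of (src, label, dst) triples.

def sccs(adj):
    # Kosaraju's algorithm; returns a component id for every node.
    order, seen = [], set()
    def dfs1(u):
        seen.add(u)
        for v in adj[u]:
            if v not in seen:
                dfs1(v)
        order.append(u)
    for u in adj:
        if u not in seen:
            dfs1(u)
    radj = {u: [] for u in adj}
    for u in adj:
        for v in adj[u]:
            radj[v].append(u)
    comp, cid = {}, 0
    def dfs2(u):
        comp[u] = cid
        for v in radj[u]:
            if v not in comp:
                dfs2(v)
    for u in reversed(order):
        if u not in comp:
            dfs2(u)
            cid += 1
    return comp

def exponentially_ambiguous(states, transitions):
    # Self-intersection: synchronise pairs of transitions on equal labels.
    adj = {(p, q): [] for p in states for q in states}
    for (p, a, q) in transitions:
        for (p2, a2, q2) in transitions:
            if a == a2:
                adj[(p, p2)].append((q, q2))
    comp = sccs(adj)
    diag = {c for (p, q), c in comp.items() if p == q}
    off = {c for (p, q), c in comp.items() if p != q}
    # Some SCC contains a diagonal and a non-diagonal state: two distinct
    # cycles at the same state with the same label exist (EDA holds).
    return bool(diag & off)
```

Building the self-intersection takes quadratic time in the number of transitions, which matches the O(|A|_E²) bound quoted for the ε-free case.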
Deciding unambiguity and sequentiality from a finitely ambiguous max-plus automaton
 Theoret. Comput. Sci., 2004
Abstract

Cited by 5 (2 self)
Finite automata with weights in the max-plus semiring are considered. The main result is that it is decidable whether a series recognized by a finitely ambiguous max-plus automaton is unambiguous, or is sequential. Furthermore, the proof is constructive. A collection of examples is given to illustrate the hierarchy of max-plus series with respect to ambiguity.
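For concreteness, the weight a max-plus automaton assigns to a word is the maximum, over all successful paths, of the sum of the traversed weights. A small illustrative evaluator (the encoding is an assumption for this sketch, not the paper's notation):

```python
# Sketch: evaluate a word in a max-plus automaton.
# transitions: (state, symbol) -> list of (next_state, weight);
# initial/final: dicts of initial/final weights. The value of a word is
# the max over successful paths of the sum of weights along the path.

NEG_INF = float("-inf")

def maxplus_value(word, transitions, initial, final):
    # current: best weight reaching each state after the prefix read so far
    current = dict(initial)
    for symbol in word:
        nxt = {}
        for state, w in current.items():
            for q, tw in transitions.get((state, symbol), []):
                nxt[q] = max(nxt.get(q, NEG_INF), w + tw)
        current = nxt
    return max((w + final[q] for q, w in current.items() if q in final),
               default=NEG_INF)

# Two successful paths for "ab": 0 -a/1-> 1 -b/0-> 2 (weight 1) and
# 0 -a/0-> 2 -b/3-> 2 (weight 3); the automaton is ambiguous on "ab".
trans = {(0, "a"): [(1, 1), (2, 0)], (1, "b"): [(2, 0)], (2, "b"): [(2, 3)]}
```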
Minimum Exact Word Error Training
 In Proc. ASRU, 2005
Abstract

Cited by 4 (0 self)
In this paper we present the Minimum Exact Word Error (exactMWE) training criterion to optimise the parameters of large-scale speech recognition systems. The exactMWE criterion is similar to the Minimum Word Error (MWE) criterion but uses the exact word error instead of an approximation based on time alignments, as used in the MWE criterion. It is shown that the exact word error for all word sequence hypotheses can be represented on a word lattice. This can be accomplished using transducer-based methods. The result is a word lattice of slightly refined topology. The accumulated weights of each path through such a lattice then represent the exact number of word errors for the corresponding word sequence hypothesis. Using this compressed representation of the word error of all word sequences represented in the original lattice, exactMWE can be performed using the same lattice-based re-estimation process as for MWE training. First experiments on the Wall Street Journal dictation task do not show significant differences in recognition performance between exactMWE and MWE at comparable computational complexity and convergence behaviour of the training.
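The "exact word error" here is the Levenshtein (edit) distance between word sequences, counting substitutions, insertions and deletions. The paper's contribution is representing this quantity for all hypotheses of a lattice at once; the single-pair special case is the standard dynamic program:

```python
# Sketch: exact word error between one hypothesis and one reference,
# i.e. the Levenshtein distance over words. The lattice construction in
# the paper computes this for every path of a word lattice; this is the
# single-pair base case only.

def word_errors(hyp, ref):
    m, n = len(hyp), len(ref)
    d = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        d[i][0] = i                           # delete all hypothesis words
    for j in range(n + 1):
        d[0][j] = j                           # insert all reference words
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if hyp[i - 1] == ref[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,         # deletion
                          d[i][j - 1] + 1,         # insertion
                          d[i - 1][j - 1] + cost)  # match / substitution
    return d[m][n]
```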
Generalized Optimization Algorithm for Speech Recognition Transducers
 In Proceedings of the 2003 IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP 2003)
 , 2003
Abstract

Cited by 3 (2 self)
Weighted transducers provide a common representation for the components of a speech recognition system. In previous work, we showed that these components can be combined off-line into a single compact recognition transducer that maps HMM state sequences directly to word sequences [11]. The construction of that recognition transducer and its efficiency of use critically depend on a general optimization algorithm, determinization. However, not all weighted automata and transducers used in large-vocabulary speech recognition are determinizable. We present a general algorithm that can make an arbitrary weighted transducer determinizable, and generalize our previous optimization technique for building an integrated recognition transducer to deal with arbitrary weighted transducers used in speech recognition. We report experimental results in a large-vocabulary speech recognition task, How May I Help You (HMIHY), showing that our generalized technique leads to a recognition transducer that performs as well as our original solution in the case of classical n-gram models while inserting fewer special symbols, and that it leads to a substantial improvement in recognition speed, by a factor of 2.6, in the same task when using a class-based language model.
Determinization of weighted tree automata using factorizations
 Presentation at 8th Int. Workshop on Finite-State Methods and Natural Language Processing, 2009
Abstract

Cited by 3 (2 self)
We present a determinization construction for weighted tree automata using factorizations. Among other results, this construction subsumes a previous result on determinization of weighted string automata using factorizations (Kirsten and Mäurer, 2005) and two previous results for weighted tree automata, one of them not using factorizations (Borchardt, 2004) and one of them restricted to non-recursive automata over the non-negative reals (May and Knight, 2006).
Rigorous Approximated Determinization of Weighted Automata
Abstract

Cited by 2 (1 self)
A nondeterministic weighted finite automaton (WFA) maps an input word to a numerical value. Applications of weighted automata include formal verification of quantitative properties, as well as text, speech, and image processing. Many of these applications require the WFAs to be deterministic, or work substantially better when the WFAs are deterministic. Unlike NFAs, which can always be determinized, not all WFAs have an equivalent deterministic weighted automaton (DWFA). In [1], Mohri describes a determinization construction for a subclass of WFAs. He also describes a property of WFAs (the twins property) such that all WFAs that satisfy the twins property are determinizable and the algorithm terminates on them. Unfortunately, many natural WFAs cannot be determinized. In this paper we study approximated determinization of WFAs. We describe an algorithm that, given a WFA A and an approximation factor t ≥ 1, constructs a DWFA A′ that t-determinizes A. Formally, for all words w ∈ Σ*, the value of w in A′ is at least its value in A and at most t times its value in A. Our construction involves two new ideas: attributing states in the subset construction by both upper and lower residues, and collapsing attributed subsets whose residues can be tightened. The larger the approximation factor is, the more attributed subsets we can collapse. Thus, t-determinization is helpful not only for WFAs that cannot be determinized, but also in cases where determinization is possible but results in automata that are too big to handle. In addition, t-determinization is useful for reasoning about the competitive ratio of online algorithms. We also describe a property (the t-twins property) and use it to characterize t-determinizable WFAs. Finally, we describe a polynomial algorithm for deciding whether a given WFA has the t-twins property.
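The residue idea underlying this construction can be seen in the classical weighted subset construction over the tropical (min, +) semiring: a deterministic state is a set of (state, residual weight) pairs, with residues normalised by subtracting their minimum. A sketch for an acyclic WFA (the encoding and names are illustrative; the paper's algorithm additionally attributes subsets with upper and lower residues and collapses them):

```python
# Sketch: Mohri-style weighted subset construction over (min, +).
# transitions: (state, symbol) -> list of (next_state, weight).
# Deterministic states are frozensets of (nfa_state, residue) pairs.
# Assumes an acyclic input automaton, so the construction terminates.
from collections import deque

def determinize_tropical(init_state, transitions, alphabet):
    start = frozenset({(init_state, 0)})
    det, queue, seen = {}, deque([start]), {start}
    while queue:
        subset = queue.popleft()
        for a in alphabet:
            reached = {}
            for (p, r) in subset:
                for (q, w) in transitions.get((p, a), []):
                    reached[q] = min(reached.get(q, float("inf")), r + w)
            if not reached:
                continue
            w_min = min(reached.values())       # weight emitted on this arc
            nxt = frozenset((q, w - w_min) for q, w in reached.items())
            det[(subset, a)] = (nxt, w_min)
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return start, det

# Two paths on "a" with weights 1 and 3 merge into one deterministic
# state carrying residues 0 and 2.
trans = {(0, "a"): [(1, 1), (2, 3)], (1, "b"): [(3, 0)], (2, "b"): [(3, 0)]}
```

On cyclic inputs this loop may generate infinitely many subsets, which is exactly where the twins property, and the paper's t-relaxation of it, comes in.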
An Efficient Pre-Determinization Algorithm
 In CIAA 2003, LNCS, 2003
Abstract

Cited by 1 (0 self)
We present a general algorithm, pre-determinization, that makes an arbitrary weighted transducer over the tropical semiring, or an arbitrary unambiguous weighted transducer over a cancellative commutative semiring, determinizable by inserting transitions labeled with special symbols. After determinization, the special symbols can be removed or replaced with ε-transitions. The resulting transducer can be significantly more efficient to use. We report empirical results showing that our algorithm leads to a substantial speed-up in large-vocabulary speech recognition. Our pre-determinization algorithm makes use of an efficient algorithm for testing a general twins property, a sufficient condition for the determinizability of all weighted transducers over the tropical semiring and of unambiguous weighted transducers over cancellative commutative semirings. It inserts new transitions only when needed to guarantee that the resulting transducer has the twins property and is thus determinizable. It also uses a single-source shortest-paths algorithm over the min-max semiring to carefully select the positions for insertion of new transitions so as to benefit from the subsequent application of determinization. These positions are proved to be optimal in a sense that we describe.
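A single-source shortest-paths computation over the min-max semiring amounts to finding, for each state, the path that minimises the maximum edge weight (a bottleneck shortest path). A Dijkstra-style sketch under that reading (the graph encoding is an illustrative assumption; this is not the paper's insertion-point selection itself):

```python
# Sketch: bottleneck shortest paths, i.e. shortest paths in the min-max
# semiring where path "length" is the max of its edge weights and we
# take the min over paths. adj: node -> list of (neighbour, weight).
import heapq

def minmax_distances(source, adj):
    # The semiring "one" (empty path) is -inf: the max over no edges.
    dist = {source: float("-inf")}
    heap = [(float("-inf"), source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue                     # stale heap entry
        for v, w in adj.get(u, []):
            nd = max(d, w)               # semiring "product" is max
            if nd < dist.get(v, float("inf")):  # semiring "sum" is min
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist
```

Dijkstra's greedy argument carries over because max is monotone, so settled distances never need revisiting.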