Results 1–10 of 31
Statistical syntax-directed translation with extended domain of locality
In Proc. AMTA 2006, 2006
Cited by 122 (14 self)
A syntax-directed translator first parses the source-language input into a parse tree, and then recursively converts the tree into a string in the target language. We model this conversion by an extended tree-to-string transducer that has multi-level trees on the source side, which gives our system more expressive power and flexibility. We also define a direct probability model and use a linear-time dynamic programming algorithm to search for the best derivation. The model is then extended to the general log-linear framework in order to rescore with other features like n-gram language models. We devise a simple yet effective algorithm to generate non-duplicate k-best translations for n-gram rescoring. Initial experimental results on English-to-Chinese translation are presented.
Stochastic Inference of Regular Tree Languages
Machine Learning, 1999
Cited by 47 (11 self)
In this paper, we introduce a modification of the last algorithm that can be trained with positive samples generated according to a probabilistic production scheme. The construction follows the same guidelines as the algorithm for string languages in Carrasco and Oncina (1998). A different approach (Sakakibara et al., 1994) generalizes the …
Towards robustness in parsing: fuzzifying context-free language recognition
In Developments in Language Theory II: At the Crossroad of Mathematics, Computer Science and Biology, 1996
Cited by 14 (4 self)
We discuss the concept of robustness with respect to parsing or recognizing a context-free language. Our approach is based on the notions of fuzzy language, (generalized) fuzzy context-free grammar, and parser/recognizer for fuzzy languages. As concrete examples we consider a robust version of Cocke–Younger–Kasami’s algorithm and a robust kind of recursive descent recognizer.
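A fuzzy recognizer of this kind can be sketched by attaching a membership degree to each production. The sketch below assumes the common max-min semantics (the paper's robust variant is more general): the degree of a string is the best derivation's worst rule degree, so an "error" production with degree below 1 lets damaged input through at a reduced degree. The grammar is an illustrative toy, not one from the paper.

```python
# Fuzzy CYK sketch under max-min semantics (an assumption; illustrative
# grammar). Each CNF production carries a degree in [0, 1]; degrees < 1
# mark error-tolerating productions.
from collections import defaultdict

BINARY = [("S", ("A", "B"), 1.0)]          # S -> A B
UNARY = [("A", "a", 1.0),
         ("A", "e", 0.5),                   # 'e' accepted as a damaged 'a'
         ("B", "b", 1.0)]

def fuzzy_cyk(word, start="S"):
    n = len(word)
    # table[i][l] maps nonterminal -> best degree for the span word[i:i+l]
    table = [[defaultdict(float) for _ in range(n + 1)] for _ in range(n)]
    for i, ch in enumerate(word):
        for a, t, d in UNARY:
            if t == ch:
                table[i][1][a] = max(table[i][1][a], d)
    for l in range(2, n + 1):            # span length
        for i in range(n - l + 1):       # span start
            for k in range(1, l):        # split point
                for a, (b, c), d in BINARY:
                    db, dc = table[i][k][b], table[i + k][l - k][c]
                    if db > 0 and dc > 0:
                        # degree of a derivation = min of its rule degrees;
                        # keep the max over competing derivations
                        table[i][l][a] = max(table[i][l][a], min(d, db, dc))
    return table[0][n][start]

print(fuzzy_cyk("ab"))  # 1.0  (fully grammatical)
print(fuzzy_cyk("eb"))  # 0.5  (recognized, with reduced degree)
print(fuzzy_cyk("bb"))  # 0.0  (rejected)
```

Setting all degrees to 1.0 recovers ordinary CYK recognition, which is the sense in which the fuzzy version is a robust generalization.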
A fuzzy approach to erroneous inputs in context-free language recognition
Dept. of Comp. Sci., Twente University of Technology, 1995
Cited by 10 (6 self)
Using fuzzy context-free grammars one can easily describe a finite number of ways to derive incorrect strings together with their degree of correctness. However, in general there is an infinite number of ways to perform a certain task wrongly. In this paper we introduce a generalization of fuzzy context-free grammars, the so-called fuzzy context-free K-grammars, to model the situation of making a finite choice out of an infinity of possible grammatical errors during each context-free derivation step. Under minor assumptions on the parameter K this model happens to be a very general framework to describe correctly as well as erroneously derived sentences by a single generating mechanism. Our first result characterizes the generating capacity of these fuzzy context-free K-grammars. As consequences we obtain: (i) bounds on modeling grammatical errors within the framework of fuzzy context-free grammars, and (ii) the fact that the family of languages generated by fuzzy context-free K-grammars shares closure properties very similar to those of the family of ordinary context-free languages. The second part of the paper is devoted to a few algorithms to recognize fuzzy context-free languages: viz. a variant of a functional version of Cocke–Younger–Kasami’s algorithm and some recursive descent algorithms. These algorithms turn out to be robust in some very elementary sense and they can easily be extended to corresponding parsing algorithms.
Open source graph transducer interpreter and grammar development environment
In Proc. LREC, 2010
Cited by 7 (1 self)
Graph and tree transducers have been applied in many NLP areas—among them, machine translation, summarization, parsing, and text generation. In particular, the successful use of tree rewriting transducers for the introduction of syntactic structures in statistical machine translation contributed to their popularity. However, the potential of such transducers is limited because they do not handle graphs and because they “consume” the source structure in that they rewrite it instead of leaving it intact for intermediate consultations. In this paper, we describe an open source tree and graph transducer interpreter, which combines the advantages of graph transducers and two-tape finite state transducers and surpasses the limitations of state-of-the-art tree rewriting transducers. Along with the transducer, we present a graph grammar development environment that supports the compilation and maintenance of graph transducer grammatical and lexical resources. Such an environment is indispensable for any effort to create consistent large-coverage NLP resources by human experts.
Inferring Finite Transducers
, 1999
Cited by 5 (0 self)
We consider the inference problem for finite transducers using different kinds of samples (positive and negative samples, positive samples only, and structural samples). Given pairs of input and output words, our task is to infer the finite transducer consistent with the given pairs. We show that this problem can be solved in certain special cases by using known results on the inference problem for linear languages.
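A common starting point for inference from positive samples of input/output pairs is a prefix-tree transducer that exactly reproduces the sample; later stages would merge states to generalize. The sketch below shows only that first step and is an assumption for illustration, not the paper's construction (which reduces the problem to linear-language inference).

```python
# Sketch: build a prefix-tree transducer consistent with a finite sample of
# (input, output) pairs. States are input prefixes; each complete input has
# its full output attached as a final emission. Illustrative only.
def build_prefix_tree(pairs):
    trans = {}   # (state, input symbol) -> next state
    final = {}   # state -> output word emitted on acceptance
    for inp, out in pairs:
        state = ""
        for ch in inp:
            nxt = state + ch
            trans[(state, ch)] = nxt
            state = nxt
        final[state] = out
    return trans, final

def run(trans, final, inp):
    """Transduce inp, or return None if the sample does not cover it."""
    state = ""
    for ch in inp:
        if (state, ch) not in trans:
            return None
        state = trans[(state, ch)]
    return final.get(state)

trans, final = build_prefix_tree([("ab", "xy"), ("ac", "xz")])
print(run(trans, final, "ab"))   # xy
print(run(trans, final, "ad"))   # None
```

By construction this transducer is consistent with the given pairs; the interesting (and hard) part of the inference problem is generalizing beyond them.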
CHR as grammar formalism, a first report
In Proc. Sixth Annual Workshop of the ERCIM Working Group on Constraints
Time-bounded controlled bidirectional grammars
International Journal of Computer Mathematics, 1990
Cited by 4 (3 self)
We investigate context-free grammars the rules of which can be used in a productive and in a reductive fashion, while the application of these rules is controlled by a regular language. We distinguish several modes of derivation for this kind of grammar. The resulting language families (properly) extend the family of context-free languages. We establish some closure properties of these language families and some grammatical transformations which yield a few normal forms for this type of grammar. Finally, we consider some special cases (viz. the context-free grammar is linear or left-linear), and generalizations, in particular, the use of arbitrary rather than regular control languages.
Keywords: controlled context-free grammar, production and reduction (i.e. reversed production), mode of derivation, normal form, closure
Derivation of a typed functional LR parser
, 2003
Cited by 4 (3 self)
This paper describes a purely functional implementation of LR parsing. We formally derive our parsers in a series of steps starting from the inverse of printing. In contrast to traditional implementations of LR parsing, the resulting parsers are fully typed, stackless and table-free. The parsing functions pursue alternatives in parallel, with each alternative represented by a continuation argument. The direct implementation presents many opportunities for optimization, and initial measurements show excellent performance in comparison with conventional table-driven parsers.
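The "continuation per alternative" idea can be illustrated independently of the paper's LR derivation. The sketch below is a generic continuation-passing recognizer (an assumption for illustration; the paper's parsers are typed and derived formally, and the grammar here is a toy): each parsing function takes the remaining input and a continuation, and an alternation simply lets both branches call the continuation, so there is no explicit stack or parse table.

```python
# Continuation-passing recognizer sketch (illustrative, not the paper's
# derivation). A parser is a function parse(input, k) that calls the
# continuation k with whatever input remains after a successful match.
def term(ch):
    """Parser for a single terminal character."""
    def parse(inp, k):
        if inp and inp[0] == ch:
            k(inp[1:])
    return parse

def seq(p, q):
    """p followed by q: p's continuation runs q."""
    return lambda inp, k: p(inp, lambda rest: q(rest, k))

def alt(p, q):
    """Both alternatives are pursued; each may invoke the continuation."""
    def parse(inp, k):
        p(inp, k)
        q(inp, k)
    return parse

def accepts(parser, s):
    ok = []
    parser(s, lambda rest: ok.append(rest == ""))  # success = input consumed
    return any(ok)

# S -> "a" "b"  |  "a" S "b"   (the language a^n b^n, n >= 1)
def S(inp, k):
    alt(seq(term("a"), term("b")),
        seq(term("a"), seq(S, term("b"))))(inp, k)

print(accepts(S, "aabb"))  # True
print(accepts(S, "aab"))   # False
```

Because every branch of `S` consumes a terminal before recursing, the recursion terminates; a stackless LR formulation as in the paper avoids this naive exploration by construction.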