Results 11–20 of 370
Learning Synchronous Grammars for Semantic Parsing with Lambda Calculus
, 2007
Abstract

Cited by 66 (5 self)
This paper presents the first empirical results to our knowledge on learning synchronous grammars that generate logical forms. Using statistical machine translation techniques, a semantic parser based on a synchronous context-free grammar augmented with λ-operators is learned given a set of training sentences and their correct logical forms. The resulting parser is shown to be the best-performing system so far in a database query domain.
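To make the formalism concrete, here is a toy sketch of a synchronous context-free grammar that pairs natural-language strings with logical forms, in the spirit of the grammars this paper learns. The grammar, rules, sentence, and predicate names (`answer`, `state`, `borders`) are invented for illustration; the paper's grammars are induced automatically and are far richer.

```python
# Toy synchronous CFG: each rule maps a nonterminal to a (source RHS,
# logical-form template) pair. Nonterminals in the source RHS are expanded
# recursively; the template substitutes subderivations' forms via {0}, {1}, ...
RULES = {
    "QUERY": [(["what", "states", "border", "STATE"],
               "answer(x, (state(x), borders(x, {0})))")],
    "STATE": [(["texas"], "texas"),
              (["ohio"], "ohio")],
}

def parse(symbol, words):
    """Greedy left-to-right derivation of `words` from `symbol`.
    Returns (logical_form, remaining_words) or None on failure."""
    for rhs, template in RULES[symbol]:
        rest, subforms = words, []
        ok = True
        for tok in rhs:
            if tok in RULES:                    # nonterminal: recurse
                sub = parse(tok, rest)
                if sub is None:
                    ok = False
                    break
                form, rest = sub
                subforms.append(form)
            elif rest and rest[0] == tok:       # terminal: consume one word
                rest = rest[1:]
            else:
                ok = False
                break
        if ok:
            return template.format(*subforms), rest
    return None

lf, leftover = parse("QUERY", "what states border texas".split())
print(lf)  # answer(x, (state(x), borders(x, texas)))
```

The point of the synchronous derivation is that parsing the source string and building the logical form are a single recursion; a learned grammar replaces the hand-written `RULES` table.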
Learning Dependency Translation Models as Collections of Finite State Head Transducers
 Computational Linguistics
, 2000
Abstract

Cited by 66 (3 self)
The paper defines weighted head transducers, finite-state machines that perform middle-out string transduction. These transducers are strictly more expressive than the special case of standard left-to-right finite-state transducers. Dependency transduction models are then defined as collections of weighted head transducers that are applied hierarchically. A dynamic programming search algorithm is described for finding the optimal transduction of an input string with respect to a dependency transduction model. A method for automatically training a dependency transduction model from a set of input-output example strings is presented. The method first searches for hierarchical alignments of the training examples guided by correlation statistics, and then constructs the transitions of head transducers that are consistent with these alignments. Experimental results are given for applying the training method to translation from English to Spanish and Japanese.
Dependency parsing by belief propagation
 In Proceedings of EMNLP
, 2008
Abstract

Cited by 65 (7 self)
We formulate dependency parsing as a graphical model with the novel ingredient of global constraints. We show how to apply loopy belief propagation (BP), a simple and effective tool for approximate learning and inference. As a parsing algorithm, BP is both asymptotically and empirically efficient. Even with second-order features or latent variables, which would make exact parsing considerably slower or NP-hard, BP needs only O(n³) time with a small constant factor. Furthermore, such features significantly improve parse accuracy over exact first-order methods. Incorporating additional features would increase the runtime additively rather than multiplicatively.
Forest-based translation
 In Proceedings of ACL08: HLT
, 2008
Abstract

Cited by 64 (20 self)
Among syntax-based translation models, the tree-based approach, which takes as input a parse tree of the source sentence, is a promising direction, being faster and simpler than its string-based counterpart. However, current tree-based systems suffer from a major drawback: they only use the 1-best parse to direct the translation, which potentially introduces translation mistakes due to parsing errors. We propose a forest-based approach that translates a packed forest of exponentially many parses, which encodes many more alternatives than standard n-best lists. Large-scale experiments show an absolute improvement of 1.7 BLEU points over the 1-best baseline. This result is also 0.8 points higher than decoding with 30-best parses, and takes even less time.
Statistical Machine Translation by Parsing
, 2004
Abstract

Cited by 64 (6 self)
In an ordinary syntactic parser, the input is a string, and the grammar ranges over strings. This paper explores generalizations of ordinary parsing algorithms that allow the input to consist of string tuples and/or the grammar to range over string tuples. Such algorithms can infer the synchronous structures hidden in parallel texts. It turns out that these generalized parsers can do most of the work required to train and apply a syntax-aware statistical machine translation system.
Machine Translation Using Probabilistic Synchronous Dependency Insertion Grammars
, 2005
Abstract

Cited by 62 (0 self)
Syntax-based statistical machine translation (MT) aims at applying statistical models to structured data. In this paper, we present a syntax-based statistical machine translation system based on a probabilistic synchronous dependency insertion grammar. Synchronous dependency insertion grammars are a version of synchronous grammars defined on dependency trees. We first introduce our approach to inducing such a grammar from parallel corpora. Second, we describe the graphical model for the machine translation task, which can also be viewed as a stochastic tree-to-tree transducer. We introduce a polynomial-time decoding algorithm for the model. We evaluate the outputs of our MT system using the NIST and BLEU automatic MT evaluation software. The results show that our system outperforms the baseline system based on the IBM models in both translation speed and quality.
Discriminative reranking for machine translation
 In HLT-NAACL 2004
, 2004
Abstract

Cited by 62 (1 self)
This paper describes the application of discriminative reranking techniques to the problem of machine translation. For each sentence in the source language, we obtain, from a baseline statistical machine translation system, a ranked n-best list of candidate translations in the target language. We introduce two novel perceptron-inspired reranking algorithms that improve on the quality of machine translation over the baseline system, based on evaluation using the BLEU metric. We provide experimental results on the NIST 2003 Chinese-English large data track evaluation. We also provide theoretical analysis of our algorithms and experiments verifying that our algorithms provide state-of-the-art performance in machine translation.
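The core idea of perceptron-style n-best reranking can be sketched as follows. This is a generic structured-perceptron variant, not the paper's exact algorithms; the feature names (`lm`, `len`), candidate lists, and oracle labels are invented for illustration.

```python
# Generic perceptron reranker over n-best lists of feature dicts.

def dot(w, feats):
    """Sparse dot product between a weight dict and a feature dict."""
    return sum(w.get(f, 0.0) * v for f, v in feats.items())

def rerank_train(nbest_lists, oracle_idx, epochs=5):
    """nbest_lists: one n-best list per sentence, each a list of feature dicts.
    oracle_idx: index of the best candidate (e.g. highest BLEU) per list.
    Returns weights that push the model toward the oracle candidates."""
    w = {}
    for _ in range(epochs):
        for cands, gold in zip(nbest_lists, oracle_idx):
            # candidate currently ranked first by the model
            pred = max(range(len(cands)), key=lambda i: dot(w, cands[i]))
            if pred != gold:
                # perceptron update: reward oracle features, penalize prediction
                for f, v in cands[gold].items():
                    w[f] = w.get(f, 0.0) + v
                for f, v in cands[pred].items():
                    w[f] = w.get(f, 0.0) - v
    return w

# Two toy 2-best lists; candidate index 1 is the oracle in both.
lists = [
    [{"lm": 1.0, "len": 0.2}, {"lm": 0.5, "len": 0.9}],
    [{"lm": 0.9, "len": 0.1}, {"lm": 0.4, "len": 0.8}],
]
w = rerank_train(lists, oracle_idx=[1, 1])
best = max(range(2), key=lambda i: dot(w, lists[0][i]))
print(best)  # 1 — the reranker now prefers the oracle candidate
```

In practice the feature vectors would include the baseline model score plus additional discriminative features, and averaged weights are typically used for stability.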
Maximum Entropy Based Phrase Reordering Model for Statistical Machine Translation
 In Proc. of COLING-ACL
, 2006
Abstract

Cited by 58 (13 self)
We propose a novel reordering model for phrase-based statistical machine translation (SMT) that uses a maximum entropy (MaxEnt) model to predict reorderings of neighbor blocks (phrase pairs). The model provides content-dependent, hierarchical phrasal reordering with generalization, based on features automatically learned from a real-world bitext. We present an algorithm to extract all reordering events of neighbor blocks from bilingual data. In our experiments on Chinese-to-English translation, this MaxEnt-based reordering model obtains significant improvements in BLEU score on the NIST MT-05 and IWSLT-04 tasks.
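A MaxEnt reordering model of this kind reduces to a log-linear classifier over block-orientation events. The sketch below trains a binary logistic-regression model (equivalent to binary MaxEnt) by gradient ascent; the boundary-word features, training events, and learning settings are invented and much cruder than the paper's feature set.

```python
import math

def predict(w, feats):
    """P(inverted order | feats) under a binary log-linear model."""
    z = sum(w.get(f, 0.0) * v for f, v in feats.items())
    return 1.0 / (1.0 + math.exp(-z))

def train(data, lr=0.5, epochs=200):
    """data: list of (feature dict, label), label 1 = neighbor blocks swapped.
    Stochastic gradient ascent on the conditional log-likelihood."""
    w = {}
    for _ in range(epochs):
        for feats, y in data:
            p = predict(w, feats)
            for f, v in feats.items():
                w[f] = w.get(f, 0.0) + lr * (y - p) * v
    return w

# Toy reordering events keyed on a boundary word of each block
# (a crude stand-in for the paper's lexical boundary features).
data = [
    ({"left=of": 1.0, "right=china": 1.0}, 1),    # blocks were inverted
    ({"left=the": 1.0, "right=book": 1.0}, 0),    # blocks kept in order
    ({"left=of": 1.0, "right=beijing": 1.0}, 1),
    ({"left=a": 1.0, "right=report": 1.0}, 0),
]
w = train(data)
p_inv = predict(w, {"left=of": 1.0, "right=shanghai": 1.0})
print(p_inv > 0.5)  # True: "left=of" generalizes to an unseen right block
```

The decoder would consult such probabilities when deciding whether to combine two neighboring blocks in straight or inverted order.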
A generalized CYK algorithm for parsing stochastic CFG
, 1998
Abstract

Cited by 57 (11 self)
We present CYK+, a bottom-up parsing algorithm for stochastic context-free grammars that is able (1) to deal with multiple interpretations of sentences containing compound words; (2) to extract the N most probable parses in O(n³) time and compute at the same time all possible parses of any portion of the input sequence with their probabilities; and (3) to deal with "out of vocabulary" words. Explicitly extracting all the parse trees associated with a given input sentence depends on the complexity of the grammar, but even in the case where this number is exponential in n, the chart used by the algorithm for the representation has O(n²) space complexity.
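For reference, the classical probabilistic CKY recursion that CYK+ builds on looks like the sketch below. This is plain Viterbi CKY for a grammar in Chomsky normal form, not CYK+ itself (which additionally handles general rule forms, compound words, and out-of-vocabulary words); the toy grammar, probabilities, and sentence are invented.

```python
import math
from collections import defaultdict

lexical = {  # A -> word rules with probabilities (invented toy grammar)
    ("N", "time"): 0.5, ("N", "flies"): 0.3, ("N", "arrow"): 0.2,
    ("V", "flies"): 0.6, ("V", "like"): 0.4,
    ("P", "like"): 1.0, ("Det", "an"): 1.0,
}
binary = {   # A -> B C rules
    ("S", "NP", "VP"): 1.0,
    ("VP", "V", "PP"): 0.7, ("VP", "V", "NP"): 0.3,
    ("PP", "P", "NP"): 1.0,
    ("NP", "Det", "N"): 0.6, ("NP", "N", "N"): 0.4,
}

def cky(words):
    """Return the log-probability of the best S parse of `words`."""
    n = len(words)
    # chart[i][j][A]: best log-probability of A spanning words[i:j]
    chart = [[defaultdict(lambda: -math.inf) for _ in range(n + 1)]
             for _ in range(n + 1)]
    for i, w in enumerate(words):
        for (A, word), p in lexical.items():
            if word == w:
                chart[i][i + 1][A] = math.log(p)
    for span in range(2, n + 1):            # O(n^3) chart recursion
        for i in range(n - span + 1):
            j = i + span
            for k in range(i + 1, j):       # split point
                for (A, B, C), p in binary.items():
                    score = math.log(p) + chart[i][k][B] + chart[k][j][C]
                    if score > chart[i][j][A]:
                        chart[i][j][A] = score
    return chart[0][n]["S"]

words = "time flies like an arrow".split()
print(math.exp(cky(words)))  # probability of the single S parse
```

The chart itself is O(n²) cells, matching the space bound quoted in the abstract; N-best extraction and packed-forest traversal operate over the same chart.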
A survey of statistical machine translation
, 2007
Abstract

Cited by 52 (4 self)
Statistical machine translation (SMT) treats the translation of natural language as a machine learning problem. By examining many samples of human-produced translation, SMT algorithms automatically learn how to translate. SMT has made tremendous strides in less than two decades, and many popular techniques have only emerged within the last few years. This survey presents a tutorial overview of state-of-the-art SMT at the beginning of 2007. We begin with the context of the current research, and then move to a formal problem description and an overview of the four main subproblems: translational equivalence modeling, mathematical modeling, parameter estimation, and decoding. Along the way, we present a taxonomy of some different approaches within these areas. We conclude with an overview of evaluation and notes on future directions.