Results 1–10 of 140
Generation and Synchronous Tree-Adjoining Grammars
, 1990
"... Treeadjoining grammars (TAG) have been proposed as a formalism for generation based on the intuition that the extended domain of syntactic locality that TAGs provide should aid in localizing semantic dependencies as well, in turn serving as an aid to generation from semantic representations. We dem ..."
Abstract

Cited by 638 (39 self)
 Add to MetaCart
Tree-adjoining grammars (TAG) have been proposed as a formalism for generation based on the intuition that the extended domain of syntactic locality that TAGs provide should aid in localizing semantic dependencies as well, in turn serving as an aid to generation from semantic representations. We demonstrate that this intuition can be made concrete by using the formalism of synchronous tree-adjoining grammars. The use of synchronous TAGs for generation provides solutions to several problems with previous approaches to TAG generation. Furthermore, the semantic monotonicity requirement previously advocated for generation grammars as a computational aid is seen to be an inherent property of synchronous TAGs.
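The idea of synchronized syntactic/semantic derivation can be illustrated with a toy grammar. The sketch below uses plain synchronous context-free rules (substitution only, none of TAG's adjunction operation), and the rule names, lexical items, and semantic constants are all made up for illustration; it shows synchronized rewriting in general, not the paper's formalism.

```python
import re

# Toy synchronous grammar: each rule simultaneously rewrites a linked pair,
# one side yielding an English string, the other a semantic term.
# All symbols here are hypothetical.
RULES = {
    "S":  ("{NP} {VP}", "{VP}({NP})"),
    "NP": ("john", "john'"),
    "VP": ("sleeps", "sleep'"),
}

def derive(sym):
    """Expand one linked nonterminal, rewriting both sides in lockstep."""
    syn, sem = RULES[sym]
    for child in re.findall(r"\{(\w+)\}", syn):
        c_syn, c_sem = derive(child)
        syn = syn.replace("{" + child + "}", c_syn)
        sem = sem.replace("{" + child + "}", c_sem)
    return syn, sem
```

Because both sides of each rule are expanded together, the surface string and its semantic representation are built in a single derivation, which is the property the abstract appeals to for generation.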
The induction of dynamical recognizers
 Machine Learning
, 1991
"... A higher order recurrent neural network architecture learns to recognize and generate languages after being "trained " on categorized exemplars. Studying these networks from the perspective of dynamical systems yields two interesting discoveries: First, a longitudinal examination of the le ..."
Abstract

Cited by 210 (14 self)
 Add to MetaCart
A higher order recurrent neural network architecture learns to recognize and generate languages after being "trained" on categorized exemplars. Studying these networks from the perspective of dynamical systems yields two interesting discoveries: First, a longitudinal examination of the learning process illustrates a new form of mechanical inference: induction by phase transition. A small weight adjustment causes a "bifurcation" in the limit behavior of the network. This phase transition corresponds to the onset of the network’s capacity for generalizing to arbitrary-length strings. Second, a study of the automata resulting from the acquisition of previously published training sets indicates that while the architecture is not guaranteed to find a minimal finite automaton consistent with the given exemplars, which is an NP-hard problem, the architecture does appear capable of generating non-regular languages by exploiting fractal and chaotic dynamics. I end the paper with a hypothesis relating linguistic generative capacity to the behavioral regimes of nonlinear dynamical systems.
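The "bifurcation" phenomenon, where a small parameter change flips the limit behavior of an iterated map, can be illustrated with the standard logistic map. This is a textbook dynamical-systems analogy, not the paper's recurrent network: below the first period-doubling (r < 3) iteration settles on a fixed point, while just above it (r > 3) the same map settles on a two-point cycle.

```python
def limit_points(r, x0=0.4, burn=1000, keep=8):
    """Iterate the logistic map x <- r*x*(1-x) past its transient,
    then return the (rounded) set of values it settles onto."""
    x = x0
    for _ in range(burn):
        x = r * x * (1 - x)
    pts = set()
    for _ in range(keep):
        x = r * x * (1 - x)
        pts.add(round(x, 4))
    return sorted(pts)
```

Comparing `limit_points(2.8)` (a single fixed point) with `limit_points(3.2)` (a period-2 cycle) shows how a modest parameter shift qualitatively changes the long-run behavior, the same flavor of transition the abstract describes for a small weight adjustment.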
The Computational Analysis of the Syntax and Interpretation of "Free" Word Order in Turkish
, 1995
"... ..."
The language of RNA: A formal grammar that includes pseudoknots
 Bioinformatics
"... Motivation: In a previous paper, we presented a polynomial time dynamic programming algorithm for predicting optimal RNA secondary structure including pseudoknots. However a formal grammatical representation for RNA secondary structure with pseudoknots was still lacking. Results: Here we show a one ..."
Abstract

Cited by 61 (1 self)
 Add to MetaCart
Motivation: In a previous paper, we presented a polynomial time dynamic programming algorithm for predicting optimal RNA secondary structure including pseudoknots. However, a formal grammatical representation for RNA secondary structure with pseudoknots was still lacking. Results: Here we show a one-to-one correspondence between that algorithm and a formal transformational grammar. This grammar class encompasses the context-free grammars and goes beyond to generate pseudoknotted structures. The pseudoknot grammar avoids the use of general context-sensitive rules by introducing a small number of auxiliary symbols used to reorder the strings generated by an otherwise context-free grammar. This formal representation of the residue correlations in RNA structure is important because it means we can build full probabilistic models of RNA secondary structure, including pseudoknots, and use them to optimally parse sequences in polynomial time. Contact: eddy@genetics.wustl.edu
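The distinction the abstract draws, context-free (nested) base pairing versus pseudoknots, comes down to crossing index pairs: two base pairs (i, j) and (k, l) with i < k form a pseudoknot exactly when i < k < j < l. A small generic check (not the paper's grammar) makes this concrete:

```python
def has_pseudoknot(pairs):
    """Return True if any two base pairs cross (i < k < j < l).
    Crossing pairs cannot be generated by a context-free grammar,
    which can only produce nested or disjoint pairings.
    Each pair is a tuple (i, j) of sequence positions with i < j."""
    for a in range(len(pairs)):
        for b in range(a + 1, len(pairs)):
            (i, j), (k, l) = sorted([pairs[a], pairs[b]])
            if i < k < j < l:
                return True
    return False
```

For example, the nested structure [(0, 5), (1, 4)] is context-free-representable, while [(0, 3), (2, 5)] crosses and therefore needs the reordering machinery the abstract describes.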
A survey of statistical machine translation
, 2007
"... Statistical machine translation (SMT) treats the translation of natural language as a machine learning problem. By examining many samples of humanproduced translation, SMT algorithms automatically learn how to translate. SMT has made tremendous strides in less than two decades, and many popular tec ..."
Abstract

Cited by 55 (4 self)
 Add to MetaCart
Statistical machine translation (SMT) treats the translation of natural language as a machine learning problem. By examining many samples of human-produced translation, SMT algorithms automatically learn how to translate. SMT has made tremendous strides in less than two decades, and many popular techniques have only emerged within the last few years. This survey presents a tutorial overview of state-of-the-art SMT at the beginning of 2007. We begin with the context of the current research, and then move to a formal problem description and an overview of the four main subproblems: translational equivalence modeling, mathematical modeling, parameter estimation, and decoding. Along the way, we present a taxonomy of some different approaches within these areas. We conclude with an overview of evaluation and notes on future directions.
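The "mathematical modeling" and "decoding" subproblems listed above are conventionally tied together by the noisy-channel formulation, which is standard across the SMT literature rather than specific to this survey:

```latex
\hat{e} \;=\; \arg\max_{e} P(e \mid f) \;=\; \arg\max_{e} P(f \mid e)\, P(e)
```

Here $f$ is the source-language sentence and $e$ ranges over candidate translations; $P(e)$ is the language model, $P(f \mid e)$ the translation model (parameter estimation fits both from data), and the $\arg\max$ search itself is the decoding problem.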
Bilexical Grammars and Their Cubic-Time Parsing Algorithms
 In: New Developments in Natural Language Parsing
, 2000
"... This chapter introduces weighted bilexical grammars, a formalism in which individual lexical items, such as verbs and their arguments, can have idiosyncratic selectional influences on each other. Such ‘bilexicalism ’ has been a theme of much current work in parsing. The new formalism can be used t ..."
Abstract

Cited by 51 (1 self)
 Add to MetaCart
This chapter introduces weighted bilexical grammars, a formalism in which individual lexical items, such as verbs and their arguments, can have idiosyncratic selectional influences on each other. Such ‘bilexicalism’ has been a theme of much current work in parsing. The new formalism can be used to describe bilexical approaches to both dependency and phrase-structure grammars, and a slight modification yields link grammars. Its scoring approach is compatible with a wide variety of probability models. The obvious parsing algorithm for bilexical grammars (used by most previous authors) takes time O(n^5). A more efficient O(n^3) method is exhibited. The new algorithm has been implemented and used in a large parsing experiment (Eisner, 1996b). We also give a useful extension to the case where the parser must undo a stochastic transduction that has altered the input.
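The O(n^3) method referred to is Eisner's span-based dynamic program for projective dependency parsing. A minimal score-only sketch is below; the chart indexing follows the common textbook presentation of the algorithm, and the arc-score matrix is an illustrative input, not the chapter's full weighted-grammar machinery.

```python
def eisner(score):
    """O(n^3) projective dependency parsing in the style of Eisner (1996).
    score[h][m] is the weight of arc h -> m; token 0 plays the root.
    Returns the best total arc score over projective analyses."""
    n = len(score)
    NEG = float("-inf")
    # C[s][t][d][c]: d=0 head on the right (t), d=1 head on the left (s);
    # c=0 incomplete span (arc placed, dependents pending), c=1 complete span.
    C = [[[[NEG, NEG], [NEG, NEG]] for _ in range(n)] for _ in range(n)]
    for s in range(n):
        C[s][s] = [[0.0, 0.0], [0.0, 0.0]]
    for width in range(1, n):
        for s in range(n - width):
            t = s + width
            # Place an arc between s and t over two adjacent complete spans.
            best = max(C[s][r][1][1] + C[r + 1][t][0][1] for r in range(s, t))
            C[s][t][0][0] = best + score[t][s]  # arc t -> s
            C[s][t][1][0] = best + score[s][t]  # arc s -> t
            # Extend incomplete spans into complete ones.
            C[s][t][0][1] = max(C[s][r][0][1] + C[r][t][0][0] for r in range(s, t))
            C[s][t][1][1] = max(C[s][r][1][0] + C[r][t][1][1] for r in range(s + 1, t + 1))
    return C[0][n - 1][1][1]
```

The saving over the naive O(n^5) algorithm comes from splitting each head-to-head span at a single point r, so no chart item ever carries two head indices at once.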
Derivational Minimalism is Mildly Context-Sensitive
 In Proceedings, Logical Aspects of Computational Linguistics, LACL’98
, 1998
"... this paper we address the issue by showing that each MG as deøned in [3] falls into the class of mildly contextsensitive grammars (MCSGs) as described in e.g. [1]. The proof of our claim is essentially done by converting a given MG into a linear contextfree rewriting system (LCFRS) which derives the ..."
Abstract

Cited by 48 (12 self)
 Add to MetaCart
In this paper we address the issue by showing that each MG as defined in [3] falls into the class of mildly context-sensitive grammars (MCSGs) as described in e.g. [1]. The proof of our claim is essentially done by converting a given MG into a linear context-free rewriting system (LCFRS) which derives the same (string) language.
String Variable Grammar: A Logic Grammar Formalism For The Biological Language Of DNA
, 1993
"... this paper, we present a generalized form of SVG, which supports additional biologicallyrelevant operations by going beyond homomorphisms, instead uniformly applying substitutions in either a forward or reverse direction (see Definition 2.1) to bindings of logic variables. We give a constructive pr ..."
Abstract

Cited by 44 (2 self)
 Add to MetaCart
In this paper, we present a generalized form of SVG, which supports additional biologically relevant operations by going beyond homomorphisms, instead uniformly applying substitutions in either a forward or reverse direction (see Definition 2.1) to bindings of logic variables. We give a constructive proof of our conjecture [26] that the languages describable by SVG are contained in the indexed languages, and furthermore show that the containment is proper, thus refining the position of an important class of biological sequences in the hierarchy of languages. We also describe a simple grammar translator, give a number of examples of mathematical and biological languages, discuss the distinctions between SVG, DG, TAG, and RPDAs, and suggest extensions well-suited to the overlapping languages of genes. Finally, we describe a large-scale implementation of a domain-specific parser called GenLang which incorporates a practical version of these ideas, and which has been successful in parsing several types of genes from DNA sequence data [9, 30], in a form of pattern-matching search termed syntactic pattern recognition [10].
Stochastic Lexicalized Context-Free Grammar
, 1993
"... Stochastic lexicalized contextfree grammar (SLCFG) is an attractive compromise between the parsing efficiency of stochastic contextfree grammar (SCFG) and the lexical sensitivity of stochastic lexicalized treeadjoining grammar (SLTAG). SLCFG is a restricted form of SLTAG that can only generate ..."
Abstract

Cited by 41 (6 self)
 Add to MetaCart
Stochastic lexicalized context-free grammar (SLCFG) is an attractive compromise between the parsing efficiency of stochastic context-free grammar (SCFG) and the lexical sensitivity of stochastic lexicalized tree-adjoining grammar (SLTAG). SLCFG is a restricted form of SLTAG that can only generate context-free languages and can be parsed in cubic time. However, SLCFG retains the lexical sensitivity of SLTAG and is therefore a much better basis for capturing distributional information about words than SCFG.
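Cubic-time recognition of a context-free language is classically achieved with the CKY algorithm. The sketch below is a generic CKY recognizer over a grammar in Chomsky normal form, a stand-in for the cubic-time claim rather than the SLCFG-specific parser; the grammar symbols in the usage example are invented.

```python
def cky(words, lexicon, rules):
    """CKY recognition in O(n^3 * |G|) time for a CNF grammar.
    lexicon: nonterminal -> set of words it covers (A -> w rules).
    rules:   list of binary rules (A, B, C) meaning A -> B C.
    chart[i][j] holds every nonterminal deriving words[i:j]."""
    n = len(words)
    chart = [[set() for _ in range(n + 1)] for _ in range(n + 1)]
    for i, w in enumerate(words):
        chart[i][i + 1] = {A for A, ws in lexicon.items() if w in ws}
    for span in range(2, n + 1):              # widths, smallest first
        for i in range(n - span + 1):
            j = i + span
            for k in range(i + 1, j):         # split point: the cubic loop
                for A, B, C in rules:
                    if B in chart[i][k] and C in chart[k][j]:
                        chart[i][j].add(A)
    return "S" in chart[0][n]
```

With a toy grammar such as `lexicon = {"NP": {"she"}, "VP": {"runs"}}` and `rules = [("S", "NP", "VP")]`, the recognizer accepts "she runs" and rejects "runs she"; the three nested position loops (i, j via span, and k) are where the cubic bound comes from.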