Results 1-10 of 32
Parsing Some Constrained Grammar Formalisms
Computational Linguistics, 1994
Cited by 56 (6 self)
Abstract: In this paper we present a scheme to extend a recognition algorithm for Context-Free Grammars (CFG) that can be used to derive polynomial-time recognition algorithms for a set of formalisms that generate a superset of the languages generated by CFG. We describe the scheme by developing a Cocke-Kasami-Younger (CKY)-like pure bottom-up recognition algorithm for Linear Indexed Grammars and show how it can be adapted to give algorithms for Tree Adjoining Grammars and Combinatory Categorial Grammars. This is the only polynomial-time recognition algorithm for Combinatory Categorial Grammars that we are aware of.
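The CKY-style bottom-up recognition the abstract starts from can be sketched as follows for a plain CFG in Chomsky normal form; the toy grammar (for a^n b^n) and all helper names are illustrative assumptions, not taken from the paper.

```python
from itertools import product

def cky_recognize(tokens, unary, binary, start="S"):
    """unary: {terminal: set of A for rules A -> terminal};
    binary: {(B, C): set of A for rules A -> B C}."""
    n = len(tokens)
    # chart[i][j] holds the nonterminals deriving tokens[i:j]
    chart = [[set() for _ in range(n + 1)] for _ in range(n + 1)]
    for i, w in enumerate(tokens):
        chart[i][i + 1] |= unary.get(w, set())
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            j = i + span
            for k in range(i + 1, j):          # try every split point
                for B, C in product(chart[i][k], chart[k][j]):
                    chart[i][j] |= binary.get((B, C), set())
    return start in chart[0][n]

# CNF grammar for { a^n b^n : n >= 1 }:
# S -> A T | A B,  T -> S B,  A -> a,  B -> b
unary = {"a": {"A"}, "b": {"B"}}
binary = {("A", "T"): {"S"}, ("A", "B"): {"S"}, ("S", "B"): {"T"}}
```

Here `cky_recognize(list("aabb"), unary, binary)` accepts while `list("aab")` is rejected; the paper's scheme extends exactly this kind of chart to Linear Indexed Grammar items.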
Edge-Based Best-First Chart Parsing
In Proceedings of the Sixth Workshop on Very Large Corpora, 1998
Cited by 53 (4 self)
Abstract: Best-first probabilistic chart parsing attempts to parse efficiently by working on edges that are judged "best" by some probabilistic figure of merit (FOM). Recent work has used probabilistic context-free grammars (PCFGs) to assign probabilities to constituents, and to use these probabilities as the starting point for the FOM. This paper extends this approach to using a probabilistic FOM to judge edges (incomplete constituents), thereby giving much finer-grained control over parsing effort. We show how this can be accomplished in a particularly simple way using the common idea of binarizing the PCFG. The results obtained are about a factor of twenty improvement over the best prior results; that is, our parser achieves equivalent results using one twentieth the number of edges. Furthermore, we show that this improvement is obtained with parsing precision and recall levels superior to those achieved by exhaustive parsing.
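The agenda-driven baseline the paper builds on can be sketched roughly as follows: a priority queue of complete constituents ordered by an FOM (here simply the inside probability, an assumption; the paper's contribution is scoring incomplete edges as well). The toy PCFG is illustrative, not from the paper.

```python
import heapq
from collections import defaultdict

def best_first_parse(tokens, lex, rules, start="S"):
    """lex: {word: [(A, p), ...]} for lexical rules A -> word;
    rules: {(B, C): [(A, p), ...]} for binary rules A -> B C.
    Returns the Viterbi probability of `start` spanning the input, or None."""
    n = len(tokens)
    best = {}                      # (A, i, j) -> probability, fixed once popped
    ends_at = defaultdict(set)     # j -> {(A, i)}: finished constituents ending at j
    starts_at = defaultdict(set)   # i -> {(A, j)}: finished constituents starting at i
    agenda = []                    # max-heap via negated FOM
    for i, w in enumerate(tokens):
        for A, p in lex.get(w, ()):
            heapq.heappush(agenda, (-p, A, i, i + 1))
    while agenda:
        neg_p, A, i, j = heapq.heappop(agenda)
        p = -neg_p
        if (A, i, j) in best:
            continue               # a better-scored copy was popped earlier
        best[(A, i, j)] = p
        if (A, i, j) == (start, 0, n):
            return p               # first pop of the goal is the best parse
        ends_at[j].add((A, i))
        starts_at[i].add((A, j))
        for B, h in ends_at[i]:    # use A as right child: X -> B A over (h, j)
            for X, q in rules.get((B, A), ()):
                heapq.heappush(agenda, (-(best[(B, h, i)] * p * q), X, h, j))
        for C, k in starts_at[j]:  # use A as left child: X -> A C over (i, k)
            for X, q in rules.get((A, C), ()):
                heapq.heappush(agenda, (-(p * best[(C, j, k)] * q), X, i, k))
    return None

# Toy binarized PCFG for a^n b^n (probabilities are assumptions):
lex = {"a": [("A", 1.0)], "b": [("B", 1.0)]}
rules = {("A", "B"): [("S", 0.5)], ("A", "T"): [("S", 0.5)], ("S", "B"): [("T", 1.0)]}
```

Because rule probabilities are at most 1, each combination scores no better than its children, so the first pop of the goal item is optimal; effort is spent only on promising constituents rather than on the exhaustive chart.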
Recognition can be Harder than Parsing
Computational Intelligence, 1992
Cited by 39 (0 self)
Abstract: The aim of this paper is to discuss the scope and limitations of the parsing-as-intersection approach, and to examine the suitability of several syntactic formalisms on the criterion of their ability to handle it.
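The parsing-as-intersection view can be illustrated by the classical Bar-Hillel construction: intersecting a CFG (here in CNF) with the trivial automaton for an input string yields a grammar over triples (i, A, j), and the input is grammatical iff (0, S, n) is derivable. The following fixpoint sketch, with an assumed toy grammar, is illustrative only.

```python
def bar_hillel_derivable(tokens, term_rules, pair_rules, start="S"):
    """term_rules: {terminal: set of A for A -> terminal};
    pair_rules: {(B, C): set of A for A -> B C}.
    Checks derivability of (0, start, n) in the intersection grammar."""
    n = len(tokens)
    triples = set()                    # derivable items (i, A, j)
    for i, w in enumerate(tokens):     # (i, A, i+1) for each lexical rule
        for A in term_rules.get(w, ()):
            triples.add((i, A, i + 1))
    changed = True
    while changed:                     # least fixpoint of (i,A,j) <- (i,B,k)(k,C,j)
        changed = False
        for (i, B, k) in list(triples):
            for (k2, C, j) in list(triples):
                if k2 != k:
                    continue
                for A in pair_rules.get((B, C), ()):
                    if (i, A, j) not in triples:
                        triples.add((i, A, j))
                        changed = True
    return (0, start, n) in triples

# Toy CNF grammar for { a^n b^n : n >= 1 }, as an assumption:
term_rules = {"a": {"A"}, "b": {"B"}}
pair_rules = {("A", "T"): {"S"}, ("A", "B"): {"S"}, ("S", "B"): {"T"}}
```

The closure computes the same items as a CKY chart; the intersection framing makes explicit that recognition is an emptiness test on the resulting grammar.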
Generalized Left-Corner Parsing
In Sixth Conference of the European Chapter of the Association for Computational Linguistics, Proceedings of the Conference, 1993
Cited by 23 (7 self)
Abstract: We show how techniques known from generalized LR parsing can be applied to left-corner parsing. The resulting parsing algorithm for context-free grammars has some advantages over generalized LR parsing: the sizes and generation times of the parsers are smaller, the produced output is more compact, and the basic parsing technique can more easily be adapted to arbitrary context-free grammars.
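Left-corner parsers are driven by the left-corner relation: X is a direct left corner of A if some production A -> X ... exists, and the parser consults the reflexive-transitive closure of this relation. A minimal sketch of computing that closure (the toy grammar is an assumption, not from the paper):

```python
def left_corner_closure(productions):
    """productions: {A: [rhs, ...]} with each rhs a list of symbols.
    Returns {A: symbols reachable as a left corner of A}, including A itself."""
    lc = {A: {A} for A in productions}
    changed = True
    while changed:                      # iterate to a fixpoint
        changed = False
        for A, rhss in productions.items():
            for rhs in rhss:
                if not rhs:
                    continue            # empty productions contribute nothing
                X = rhs[0]              # direct left corner of A
                new = {X} | lc.get(X, {X})
                if not new <= lc[A]:
                    lc[A] |= new
                    changed = True
    return lc

# Toy grammar, as an assumption:
grammar = {"S": [["NP", "VP"]], "NP": [["Det", "N"], ["Pro"]], "VP": [["V", "NP"]]}
```

For this grammar, `left_corner_closure(grammar)["S"]` contains Det and Pro: on seeing a determiner or pronoun, a left-corner parser may announce work toward an S.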
Variations on Incremental Interpretation
Journal of Psycholinguistic Research, 1993
Cited by 20 (0 self)
Abstract: The strict competence hypothesis has sparked a small dialogue among several researchers attempting to understand its ramifications for human sentence processing, and for incremental interpretation in particular. In this paper, we review the dialogue, reconstructing the arguments in an attempt to make them more uniform and crisper, and provide our own analyses of certain of the issues that arise. We argue that strict competence, because it requires a synchronous computation mechanism, may actually lead to more complex, rather than simpler, models of incremental interpretation. Asynchronous computation, which is arguably both psychologically more plausible and conceptually more basic, allows incremental interpretation to fall out naturally, without additional machinery for interpreting partial constituents. We show that this is true regardless of whether the presumed interpretation mechanism is top-down or bottom-up, contra previous conclusions in the literature, and propose a particular i...
Synchronous Models of Language
In Proceedings of the 34th Annual Meeting of the Association for Computational Linguistics (ACL), 1996
Cited by 19 (4 self)
Abstract: In synchronous rewriting, the productions of two rewriting systems are paired and applied synchronously in the derivation of a pair of strings. We present a new synchronous rewriting system and argue that it can handle certain phenomena that are not covered by existing synchronous systems.
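The core idea of synchronous rewriting can be sketched concretely: each production pairs a source and a target right-hand side, linked nonterminals carry matching indices and are expanded in lockstep, and a derivation yields a pair of strings. The rule encoding and toy grammar below are illustrative assumptions.

```python
def derive_pair(rules, script, sym="S"):
    """Expand `sym` using rule choices popped from `script` (leftmost order).
    rules: {A: [(src_rhs, tgt_rhs), ...]}; in each rhs, a linked nonterminal
    is a tuple (A, link_index) and a terminal is a plain string."""
    src_rhs, tgt_rhs = rules[sym][script.pop(0)]
    sub = {}                           # link index -> derived (src, tgt) pair
    for item in src_rhs:               # expand each linked nonterminal once
        if isinstance(item, tuple):
            sub[item[1]] = derive_pair(rules, script, item[0])
    def side(rhs, k):                  # assemble one side, reusing sub-derivations
        return " ".join(sub[it[1]][k] if isinstance(it, tuple) else it
                        for it in rhs)
    return side(src_rhs, 0), side(tgt_rhs, 1)

# Toy synchronous CFG: the paired sides rewrite the shared S in mirrored
# contexts, so the two derived strings are systematically related.
rules = {"S": [
    (["a", ("S", 1), "b"], ["b", ("S", 1), "a"]),   # rule 0: paired recursion
    (["c"], ["c"]),                                  # rule 1: paired base case
]}
```

For example, `derive_pair(rules, [0, 0, 1])` derives the pair ("a a c b b", "b b c a a"): one synchronized derivation, two coupled strings.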
Memoization in Top-Down Parsing
Computational Linguistics, 1995
Cited by 13 (0 self)
Abstract: The aim of this paper is to discover why this is the case, and to present a functional formalization of memoized top-down parsing for which this is not so. Specifically, I show how to formulate top-down parsers in a "continuation-passing style," which incrementally enumerates the right string positions of a category rather than returning a set of such positions as a single value. This permits a type of memoization not described, to my knowledge, in the context of functional programming before. This kind of memoization is akin to that used in logic programming, and yields terminating parsers even in the face of left recursion.
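The continuation-passing memoization idea can be sketched in Python (rather than the paper's functional setting): the memo entry for each start position records the end positions found so far and the continuations waiting on them, so a left-recursive parser terminates instead of looping. The combinators and toy grammar are illustrative assumptions.

```python
def memo_cps(parse):
    """Memoize a CPS recognizer `parse(pos, cont)` that calls `cont(end)`
    once for each end position of a constituent starting at `pos`."""
    table = {}
    def memoized(pos, cont):
        if pos not in table:
            entry = table[pos] = {"ends": set(), "conts": [cont]}
            def emit(end):             # report a newly found end position
                if end not in entry["ends"]:
                    entry["ends"].add(end)
                    for k in list(entry["conts"]):
                        k(end)
            parse(pos, emit)
        else:
            entry = table[pos]
            entry["conts"].append(cont)
            for end in list(entry["ends"]):
                cont(end)              # replay results already found
    return memoized

def term(word, tokens):                # recognize a single token
    def p(pos, cont):
        if pos < len(tokens) and tokens[pos] == word:
            cont(pos + 1)
    return p

def seq(p, q):                         # concatenation, threading positions
    return lambda pos, cont: p(pos, lambda mid: q(mid, cont))

def alt(p, q):                         # alternation: try both branches
    def r(pos, cont):
        p(pos, cont)
        q(pos, cont)
    return r

# Left-recursive grammar S -> S 'a' | 'a' over "aaa": without the memoized
# continuations, the S -> S ... branch would recurse forever.
tokens = ["a", "a", "a"]
S = memo_cps(lambda pos, cont:
             alt(seq(lambda p2, c: S(p2, c), term("a", tokens)),
                 term("a", tokens))(pos, cont))
ends = set()
S(0, ends.add)
```

The second call to S at position 0 merely registers a continuation instead of re-entering the parser; when new end positions are emitted, all waiting continuations are resumed, so `ends` accumulates {1, 2, 3}.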
An LALR Extension for DCGs in Dynamic Programming
Mathematical Linguistics, 1997
Cited by 9 (1 self)
Abstract: We propose a parsing model for natural languages based on the concept of definite clause grammar. Our work embodies in a common frame a dynamic programming construction developed for logical push-down automata, and techniques, inspired by LALR parsing, that restrict the computation to a useful part of the search space. Unlike preceding approaches, our proposal avoids backtracking in all cases, providing computational sharing and operational completeness for definite clause grammars without function symbols.
Introduction: The popularity of definite clause grammars (DCGs) is often related to natural language processing. In comparison with other formalisms, they seem to be particularly well-suited to control the perspicuity with which linguistic phenomena may be understood and expressed in actual language descriptions. However, descriptive adequacy does not guarantee operational efficiency, and computational tractability is required if...
Another Facet of LIG Parsing
1996
Cited by 6 (1 self)
Abstract: In this paper we present a new parsing algorithm for linear indexed grammars (LIGs), in the same spirit as the one described in (Vijay-Shanker and Weir, 1993) for tree adjoining grammars. For a LIG L and an input string x of length n, we build an unambiguous context-free grammar whose sentences are all (and exclusively) the valid derivation sequences in L which lead to x. We show that this grammar can be built in O(n^6) time and that individual parses can be extracted in time linear in the size of the extracted parse tree. Though this O(n^6) upper bound does not improve over previous results, the average case behaves much better. Moreover, practical parsing times can be decreased by some statically performed computations.
Keywords: mildly context-sensitive parsing, ambiguity, parse tree, shared parse forest.
This report is an extended version of the ACL'96 paper (Boullier, 1996).
LPDA: Another Look at Tabulation in Logic Programming
Proceedings of the International Conference on Logic Programming, 1994
Cited by 4 (0 self)
Abstract: The Logic Push-Down Automaton (LPDA) is introduced as an abstract operational model for the evaluation of logic programs. The LPDA can be used to describe a significant number of evaluation strategies, ranging from the top-down OLD strategy to bottom-up strategies, with or without prediction. Two types of dynamic programming (i.e., tabular) interpretation are defined, one being more efficient but restricted to a subclass of LPDAs. We propose to evaluate a logic program by first compiling it into an LPDA according to some chosen evaluation strategy, and then applying a tabular interpreter to this LPDA. This approach offers great flexibility and generalizes Magic Set transformations. It explains in a more intuitive way some known Magic Set variants and their limits, and also suggests new developments.
Keywords: logic programs, tabulation, memoing, magic sets, dynamic programming, pushdown automata.
Introduction: Recent years have seen the popularity of (at least) two approaches to i...