Results 11–20 of 91
Relating complexity to practical performance in parsing with wide-coverage unification grammars
1994
"... The paper demonstrates that exponential complexities with respect to grammar size and input length have little impact on the performance of three unificationbased parsing algorithms, using a widecoverage grammar. The results imply that tile study and optimisation of unificationbased parsing must ..."
Abstract

Cited by 32 (6 self)
 Add to MetaCart
The paper demonstrates that exponential complexities with respect to grammar size and input length have little impact on the performance of three unification-based parsing algorithms, using a wide-coverage grammar. The results imply that the study and optimisation of unification-based parsing must rely on empirical data until complexity theory can more accurately predict the practical behaviour of such parsers.
The information conveyed by words in sentences
 Journal of Psycholinguistic Research
2003
"... A method is presented for calculating the amount of information conveyed to a hearer by a speaker emitting a sentence generated by a probabilistic grammar known to both parties. The method applies the work of Grenander (1967) to the intermediate states of a topdown parser. This allows the uncertain ..."
Abstract

Cited by 30 (1 self)
 Add to MetaCart
(Show Context)
A method is presented for calculating the amount of information conveyed to a hearer by a speaker emitting a sentence generated by a probabilistic grammar known to both parties. The method applies the work of Grenander (1967) to the intermediate states of a top-down parser. This allows the uncertainty about structural ambiguity to be calculated at each point in a sentence. Subtracting these values at successive points gives the information conveyed by a word in a sentence. Word-by-word information conveyed is calculated for several small probabilistic grammars, and it is suggested that the number of bits conveyed per word is a determinant of reading times and other measures of cognitive load. KEY WORDS: computational psycholinguistics; entropy reduction.
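The entropy-reduction idea in this abstract can be illustrated with a toy example. The sketch below (the sentences and their probabilities are invented for illustration and are not from the paper) treats a tiny probabilistic language as a finite set of complete sentences, and computes the information conveyed by each word as the drop in entropy over the sentences still consistent with the prefix:

```python
import math

# Toy probabilistic language for illustration (not from the paper): each
# complete sentence is listed with its probability; together they sum to 1.
SENTENCES = {
    ("the", "dog", "barks"): 0.4,
    ("the", "dog", "runs"):  0.2,
    ("the", "cat", "runs"):  0.3,
    ("a",   "cat", "runs"):  0.1,
}

def entropy(dist):
    """Shannon entropy (bits) of an unnormalized distribution {outcome: p}."""
    total = sum(dist.values())
    return -sum((p / total) * math.log2(p / total) for p in dist.values() if p)

def information_per_word(sentence):
    """Entropy reduction contributed by each word: H(prefix) - H(prefix+word),
    where H is taken over the complete sentences consistent with the prefix."""
    info = []
    for i in range(len(sentence)):
        before = {s: p for s, p in SENTENCES.items() if s[:i] == sentence[:i]}
        after = {s: p for s, p in SENTENCES.items()
                 if s[:i + 1] == sentence[:i + 1]}
        info.append(entropy(before) - entropy(after))
    return info
```

Because the entropies telescope, the per-word values sum to the entropy of the whole language; the paper's actual method derives these quantities from the intermediate states of a top-down parser rather than by enumerating sentences.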
Parsing Incomplete Sentences
1988
"... An efficient contextfree parsing algorithln is preseuted that can parse sentences with unknown parts of unknown length. It produc in finite form all possible parses (often infinite in number) that could account for the missing parts. The algorithm is a variation on the construction due to Earl ..."
Abstract

Cited by 30 (3 self)
 Add to MetaCart
An efficient context-free parsing algorithm is presented that can parse sentences with unknown parts of unknown length. It produces in finite form all possible parses (often infinite in number) that could account for the missing parts. The algorithm is a variation on the construction due to Earley. However, its presentation is such that it can readily be adapted to any chart parsing schema (top-down, bottom-up, etc.).
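As a rough illustration of parsing with unknown material (much simpler than the paper's Earley variant, which handles unknown stretches of unknown length), the sketch below is a CKY recognizer in which the token "?" marks a single unknown word at a known position and is allowed to match any terminal category. The grammar and lexicon are invented for illustration:

```python
# Toy CNF grammar: lexical rules A -> word, and binary rules A -> B C.
UNARY = {"the": {"Det"}, "dog": {"N"}, "cat": {"N"},
         "barks": {"VP"}, "runs": {"VP"}}
BINARY = {("Det", "N"): {"NP"}, ("NP", "VP"): {"S"}}

def recognize(tokens, start="S"):
    """CKY recognition; the token "?" matches any lexical category."""
    n = len(tokens)
    chart = [[set() for _ in range(n + 1)] for _ in range(n + 1)]
    for i, tok in enumerate(tokens):
        if tok == "?":                       # unknown word: allow every category
            for cats in UNARY.values():
                chart[i][i + 1] |= cats
        else:
            chart[i][i + 1] |= UNARY.get(tok, set())
    for span in range(2, n + 1):             # combine adjacent constituents
        for i in range(n - span + 1):
            j = i + span
            for k in range(i + 1, j):
                for a in chart[i][k]:
                    for b in chart[k][j]:
                        chart[i][j] |= BINARY.get((a, b), set())
    return start in chart[0][n]
```

Unlike the paper's algorithm, this sketch cannot cope with gaps whose length is unknown; that is what the Earley-style construction is for.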
Using Filters for the Disambiguation of Context-free Grammars
 Proc. ASMICS Workshop on Parsing Theory
1994
"... An ambiguous contextfree grammar defines a language in which some sentences have multiple interpretations. For conciseness, ambiguous contextfree grammars are frequently used to define even completely unambiguous languages and numerous disambiguation methods exist for specifying which interpretatio ..."
Abstract

Cited by 30 (10 self)
 Add to MetaCart
An ambiguous context-free grammar defines a language in which some sentences have multiple interpretations. For conciseness, ambiguous context-free grammars are frequently used to define even completely unambiguous languages, and numerous disambiguation methods exist for specifying which interpretation is the intended one for each sentence. The existing methods can be divided into `parser-specific' methods, which describe how some parsing technique deals with ambiguous sentences, and `logical' methods, which describe the intended interpretation without reference to a specific parsing technique. We propose a framework of filters to describe and compare a wide range of disambiguation problems in a parser-independent way. A filter is a function that selects from a set of parse trees (the canonical representation of the interpretations of a sentence) the intended trees. The framework enables us to define several general properties of disambiguation methods. The expressive power of filters is illustrated ...
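The paper's central notion, a filter as a function from a set of parse trees to the subset of intended trees, is easy to state concretely. The sketch below (the tree encoding and grammar are invented for illustration) implements an associativity-style filter that, among the parses of "1 - 2 - 3" under the ambiguous rule E ::= E "-" E | num, keeps only the left-associative tree:

```python
# Parse trees as nested tuples (label, child, ...); leaves are strings.

def is_minus(t):
    """True for a subtraction node of shape ("E", left, "-", right)."""
    return isinstance(t, tuple) and len(t) == 4 and t[2] == "-"

def left_assoc(tree):
    """True if no "-" node has another "-" node as its right operand."""
    if isinstance(tree, str):
        return True
    if is_minus(tree) and is_minus(tree[3]):
        return False
    return all(left_assoc(c) for c in tree[1:])

def left_assoc_filter(trees):
    """A filter in the paper's sense: keep only the left-associative parses."""
    return {t for t in trees if left_assoc(t)}

# The two parses of "1 - 2 - 3" under E ::= E "-" E | num:
left_tree = ("E", ("E", ("E", "1"), "-", ("E", "2")), "-", ("E", "3"))
right_tree = ("E", ("E", "1"), "-", ("E", ("E", "2"), "-", ("E", "3")))
```

Because the filter never inspects how the trees were produced, it is parser-independent in exactly the sense the abstract describes.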
Polynomial Time and Space Shift-Reduce Parsing of Arbitrary Context-free Grammars
1991
"... We introduce an algorithm for designing a predictive left to right shiftreduce nondeterministic pushdown machine corresponding to an arbitrary unrestricted contextfree grammar and an algorithm for efficiently driving this machine in pseudoparallel. The performance of the resulting parser is for ..."
Abstract

Cited by 27 (0 self)
 Add to MetaCart
We introduce an algorithm for designing a predictive left-to-right shift-reduce nondeterministic pushdown machine corresponding to an arbitrary unrestricted context-free grammar, and an algorithm for efficiently driving this machine in pseudo-parallel. The performance of the resulting parser is formally proven to be superior to that of Earley's parser (1970). The technique consists of constructing, before run time, a parsing table that encodes a nondeterministic machine in which the predictive behavior has been compiled out. At run time, the machine is driven in pseudo-parallel with the help of a chart.
Generalized Left-Corner Parsing
 In Sixth Conference of the European Chapter of the Association for Computational Linguistics, Proceedings of the Conference
1993
"... We show how techniques known from generalized LR parsing can be applied to leftcorner parsing. The esulting parsing algorithm for contextfree grammars has some advantages over generalized LR parsing: the sizes and generation times of the parsers are smaller, the produced output is more compa ..."
Abstract

Cited by 25 (8 self)
 Add to MetaCart
We show how techniques known from generalized LR parsing can be applied to left-corner parsing. The resulting parsing algorithm for context-free grammars has some advantages over generalized LR parsing: the sizes and generation times of the parsers are smaller, the produced output is more compact, and the basic parsing technique can more easily be adapted to arbitrary context-free grammars.
How Understanding and Restructuring differ from Compiling: a Rewriting Perspective
2003
"... Syntactic and semantic analysis are established topics in the area of compiler construction. Their application to the understanding and restructuring of large software systems reveals, however, that they have various shortcomings that need to be addressed. In this paper, we study these shortcomings ..."
Abstract

Cited by 24 (5 self)
 Add to MetaCart
Syntactic and semantic analysis are established topics in the area of compiler construction. Their application to the understanding and restructuring of large software systems reveals, however, that they have various shortcomings that need to be addressed. In this paper, we study these shortcomings and propose several solutions. First, grammar recovery and grammar composition are discussed, as well as the symbiosis of lexical syntax and context-free syntax. Next, it is shown how a relational calculus can be defined by way of term rewriting, and how a fusion of term rewriting and this relational calculus can be obtained to provide semantics-directed querying and restructuring. Finally, we discuss how the distance between concrete syntax and abstract syntax can be minimized for the benefit of restructuring. In particular, we pay attention to origin tracking, a systematic technique for maintaining a mapping between the output and the input of the rewriting process. Along the way, opportunities for further research will be indicated.
How to Cover a Grammar
1989
"... this paper. On the one hand, nondeterministic LRparsing comes down to the use of certain covers for the grammar at hand, just like the Earley algorithm. Reversely, we showed that the Earley cover can, with minor modifications, be obtained from the LL/LItautomaton, which also uses precom puted se ..."
Abstract

Cited by 23 (0 self)
 Add to MetaCart
this paper. On the one hand, nondeterministic LR parsing comes down to the use of certain covers for the grammar at hand, just like the Earley algorithm. Conversely, we showed that the Earley cover can, with minor modifications, be obtained from the LL/LR automaton, which also uses precomputed sets of items.
The intersection of Finite State Automata and Definite Clause Grammars
1995
"... Bernard Lang defines parsing as the calculation of the intersection of a FSA (the input) and a CFG. Viewing the input for parsing as a FSA rather than as a string combines well with some approaches in speech understanding systems, in which parsing takes a word lattice as input (rather than a word st ..."
Abstract

Cited by 21 (6 self)
 Add to MetaCart
(Show Context)
Bernard Lang defines parsing as the calculation of the intersection of an FSA (the input) and a CFG. Viewing the input for parsing as an FSA rather than as a string combines well with some approaches in speech understanding systems, in which parsing takes a word lattice as input (rather than a word string). Furthermore, certain techniques for robust parsing can be modelled as finite state transducers.
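Lang's intersection view can be sketched directly: the classical Bar-Hillel construction yields items (p, A, q) meaning that nonterminal A derives some path from automaton state p to state q. The sketch below computes these items by fixpoint iteration over a toy word lattice and CNF grammar (both invented for illustration, with "dog"/"dug" as parallel lattice edges):

```python
# Word lattice as an NFA: transitions (state, word, state).
TRANS = {(0, "the", 1), (1, "dog", 2), (1, "dug", 2), (2, "barks", 3)}
START, FINAL = 0, {3}

# Toy CNF grammar; "dug" has no lexical category, so only the "dog" path parses.
UNARY = {"the": {"Det"}, "dog": {"N"}, "barks": {"VP"}}
BINARY = {("Det", "N"): {"NP"}, ("NP", "VP"): {"S"}}

def intersect_recognize(start_sym="S"):
    """Bar-Hillel-style recognition: close the lexical items (p, A, q)
    under the binary rules until no new item can be derived."""
    items = {(p, a, q) for (p, w, q) in TRANS for a in UNARY.get(w, ())}
    changed = True
    while changed:
        changed = False
        for (p, a, q) in list(items):
            for (q2, b, r) in list(items):
                if q2 != q:
                    continue
                for lhs in BINARY.get((a, b), ()):
                    if (p, lhs, r) not in items:
                        items.add((p, lhs, r))
                        changed = True
    return any((START, start_sym, f) in items for f in FINAL)
```

The same fixpoint accepts any lattice path the grammar can analyse, which is why the lattice input "comes for free" under the intersection view.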
Term Rewriting for Sale
1998
"... Term rewriting has a large potential for industrial applications, but these applications are always larger than one could ever dream of: huge sets of rewrite rules and gigantic terms to rewrite pose interesting challenges for implementors and theoreticians alike. We give a brief overview of the gene ..."
Abstract

Cited by 21 (15 self)
 Add to MetaCart
Term rewriting has a large potential for industrial applications, but these applications are always larger than one could ever dream of: huge sets of rewrite rules and gigantic terms to rewrite pose interesting challenges for implementors and theoreticians alike. We give a brief overview of the generation of term-rewriting-based tools as done in the Asf+Sdf Meta-Environment, and then we sketch two major applications of term rewriting: transformation of legacy COBOL systems and compilation of Asf+Sdf to C. Based on these experiences, we suggest the study of topics that could further advance the use of term rewriting in industrial applications: persistent term databases, generalized LR parsing versus parallel term rewriting, and coordination languages versus strategy languages. It will turn out that we have an "alien" view on research in term rewriting: properties like confluence and termination are of very limited use when selling term rewriting to industry.
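For readers unfamiliar with the basic machinery being scaled up here, a first-order rewrite engine fits in a few lines. The sketch below (rules and term encoding are illustrative, and nothing like the compiled implementations the paper describes) performs innermost rewriting to normal form with Peano-arithmetic rules:

```python
# Terms are nested tuples ("f", arg, ...); constants are one-element tuples;
# pattern variables are strings starting with "?".
RULES = [
    (("add", ("0",), "?y"), "?y"),                             # add(0, Y) -> Y
    (("add", ("s", "?x"), "?y"), ("s", ("add", "?x", "?y"))),  # add(s(X), Y) -> s(add(X, Y))
]

def match(pat, term, env):
    """Extend env to a substitution making pat equal term, or return None."""
    if isinstance(pat, str) and pat.startswith("?"):
        if pat in env:
            return env if env[pat] == term else None
        return {**env, pat: term}
    if (isinstance(pat, tuple) and isinstance(term, tuple)
            and len(pat) == len(term) and pat[0] == term[0]):
        for p, t in zip(pat[1:], term[1:]):
            env = match(p, t, env)
            if env is None:
                return None
        return env
    return env if pat == term else None

def subst(pat, env):
    """Instantiate a rule right-hand side under a substitution."""
    if isinstance(pat, str):
        return env.get(pat, pat)
    return (pat[0],) + tuple(subst(p, env) for p in pat[1:])

def rewrite(term):
    """Innermost rewriting: normalize subterms first, then try each rule."""
    if isinstance(term, tuple):
        term = (term[0],) + tuple(rewrite(t) for t in term[1:])
    for lhs, rhs in RULES:
        env = match(lhs, term, {})
        if env is not None:
            return rewrite(subst(rhs, env))
    return term
```

The industrial-scale problems the paper discusses (rule-set size, term size, strategies) start exactly where a naive engine like this one stops being adequate.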