Results 1–10 of 11
Recognizing head movement
 In de Groote et al.
Abstract

Cited by 25 (5 self)
Abstract. Previous studies have provided logical representations and efficient recognition algorithms for a simple kind of “minimalist grammars.” This paper extends these grammars with head movement (“incorporation”) and affix hopping. The recognition algorithms are elaborated for these grammars, and logical perspectives are briefly considered. Michaelis (1998) showed how the derivations of a simple kind of minimalist grammar (MG) (Stabler, 1997) correspond to derivations of exactly the same strings in a multiple context free grammar (MCFG) (Seki et al., 1991). MGs build structures by merging pairs of expressions, and simplifying single expressions by moving a subexpression in them. The basic idea behind the correspondence with MCFGs can be traced back to Pollard (1984), who noticed, in effect, that when a constituent is going to move, we should not regard its yield as included in the yield of the expression that contains it. Instead, the expression from which something will move is better regarded as having multiple yields, multiple components: the “moving” components have not yet reached their final positions.
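The multiple-yields idea can be sketched in a few lines of code; the tuple representation and the `merge`/`land` names below are illustrative inventions, not the paper's formalism:

```python
# Illustrative sketch of the "multiple yields" idea: an expression whose
# subexpression will move keeps that subexpression's yield as a separate
# string component, instead of folding it into the containing yield.

def merge(head, mover):
    """Combine two expressions, keeping each yield component separate
    until the mover reaches its final (landing) position."""
    return head + mover  # tuple concatenation: no string concatenation yet

def land(components):
    """Once movement is complete, the components form a single yield."""
    return " ".join(components)

# 'which book' will move, so its yield stays a separate component:
expr = merge(("read",), ("which book",))
print(expr)  # ('read', 'which book'): one expression, two components
print(land(("which book", "did you read")))
```

The point of the tuple is exactly Pollard's observation: concatenation is deferred until each moving component has reached its final position.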
Proof nets and the complexity of processing center-embedded constructions
 Journal of Logic, Language and Information
, 1998
Abstract

Cited by 19 (0 self)
Abstract. This paper shows how proof nets can be used to formalize the notion of “incomplete dependency” used in psycholinguistic theories of the unacceptability of center-embedded constructions. Such theories of human language processing can usually be restated in terms of geometrical constraints on proof nets. The paper ends with a discussion of the relationship between these constraints and incremental semantic interpretation.
Features and Agreement in Lambek Categorial Grammar
 In Proceedings of the Formal Grammar Workshop
, 1995
Abstract

Cited by 12 (1 self)
This paper contrasts two different linguistic models of agreement in natural language. The first is an account of agreement commonly given in complex-feature, "unification-based" theories of Grammar (UBG), where agreeing categories impose constraints on some shared agreement value, and the construction is well-formed only if these constraints are satisfiable or mutually consistent. The second is a simple theory of agreement inspired by Lambek Categorial Grammar (LCG) presented in Bayer and Johnson (1995). In this theory agreement phenomena are modelled in terms of the requirement that arguments must be subsumed by, or logically imply, the corresponding argument specification of a predicate or functor category. We will see that this model, when embedded in the logic of LCG, makes a number of interesting and linguistically correct predictions that are not made by the UBG account.
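The contrast between the two models can be sketched with sets of agreement values; the lexical items and value sets below are invented for illustration and are not from the paper:

```python
# Toy contrast between unification-based and subsumption-based agreement.
# Agreement values are modelled as sets of possible values.

def ubg_agree(a, b):
    """UBG-style agreement: well-formed iff the shared constraints are
    mutually consistent, i.e. the value sets intersect."""
    return bool(a & b)

def lcg_agree(arg, spec):
    """LCG-style agreement: the argument must be subsumed by (at least
    as specific as) the functor's argument specification."""
    return arg <= spec

sheep = {"sg", "pl"}   # hypothetical entry underspecified for number
wants_sg = {"sg"}      # e.g. the subject slot of a singular verb

print(ubg_agree(sheep, wants_sg))   # True: consistent, so well-formed
print(lcg_agree(sheep, wants_sg))   # False: argument not specific enough
print(lcg_agree({"sg"}, wants_sg))  # True
```

The divergence in the middle case (consistent but not subsumed) is the kind of configuration where the two models make different predictions.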
Smoothing a Probabilistic Lexicon via Syntactic Transformations
, 2001
Abstract

Cited by 11 (2 self)
Probabilistic parsing requires a lexicon that specifies each word’s syntactic preferences in terms of probabilities. To estimate these probabilities for words that were poorly observed during training, this thesis assumes the existence of arbitrarily powerful transformations (also known to linguists as lexical redundancy rules or metarules) that can add, delete, retype, or reorder the argument and adjunct positions specified by a lexical entry. In a given language, some transformations apply frequently and others rarely. We describe how to estimate the rates of the transformations from a sample of lexical entries. More deeply, we learn which properties of a transformation increase or decrease its rate in the language. As a result, we can smooth the probabilities of lexical entries. Given enough direct evidence about a lexical entry’s probability, our Bayesian approach trusts the evidence; but when less evidence or no evidence is available, it relies more on the transformations’ rates to guess how often the entry will be derived from related entries. Abstractly, the proposed “transformation models” are probability distributions that arise from graph random walks with a log-linear parameterization. A domain expert constructs the parameterized graph, and a vertex is likely according to whether random walks ...
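A random walk with log-linearly parameterized edge weights can be sketched as follows; the graph of lexical entries, the transformation features, and the weights are all invented for illustration and are not the thesis's actual model:

```python
# Toy "transformation model": one step of a random walk over a graph of
# lexical entries, with log-linear edge weights.
import math

# (source entry, target entry) -> feature vector of the transformation
edges = {
    ("V:NP", "V:NP,PP"): [1.0, 0.0],  # e.g. "adds a PP argument"
    ("V:NP", "V:"):      [0.0, 1.0],  # e.g. "drops an argument"
    ("V:NP,PP", "V:NP"): [0.0, 1.0],
}
theta = [0.5, -1.0]  # weights over transformation properties (learned)

def step_probs(vertex):
    """Log-linear distribution over one-step transformations from vertex:
    each outgoing edge scores exp(theta . features), then normalize."""
    scores = {t: math.exp(sum(w * f for w, f in zip(theta, fv)))
              for (s, t), fv in edges.items() if s == vertex}
    z = sum(scores.values())
    return {t: v / z for t, v in scores.items()}

probs = step_probs("V:NP")
print(probs)  # transformations with high-weight properties dominate
```

Iterating such steps gives the walk distributions the abstract alludes to; learning amounts to fitting `theta` so that frequently attested transformations receive high rates.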
Features as Resources in RLFG
 Proceedings of the LFG97 Conference
, 1997
Abstract

Cited by 8 (1 self)
This paper describes a new formalization of Lexical-Functional Grammar ...
Resource logics and minimalist grammars
 In Proceedings of the ESSLLI’99 Workshop (special issue of Language and Computation)
, 2002
Abstract

Cited by 5 (0 self)
This ESSLLI workshop is devoted to connecting the linguistic use of resource logics and categorial grammar to minimalist grammars and related generative grammars. Minimalist grammars are relatively recent, and although they stem from a long tradition of work in transformational grammar, they are largely informal apart from a few research papers. The study of resource logics, on the other hand, is formal and stems naturally from a long logical tradition. So although there appear to be promising connections between these traditions, there is at this point a rather thin intersection between them. The papers in this workshop are consequently rather diverse, some addressing general similarities between the two traditions, and others concentrating on a thorough study of a particular point. Nevertheless they succeed in convincing us of the continuing interest of studying and developing the relationship between the minimalist program and resource logics. This introduction reviews some of the basic issues and prior literature.
Type-driven semantic interpretation and feature dependencies in RLFG
, 1998
Abstract

Cited by 5 (0 self)
This paper describes a new formalization of Lexical-Functional Grammar called RLFG (where the "R" stands for "Resource-based"). The formal details of RLFG are presented in Johnson (1997); the present work concentrates on motivating RLFG and explaining to linguists how it differs from the "classical" LFG framework presented in Kaplan and Bresnan (1982). This work is largely a reaction to the linear logic semantics for LFG developed by Dalrymple and colleagues (Dalrymple et al., 1995, 1996a,b,c). As explained below, it seems to me that their "glue language" approach bears a partial resemblance to those versions of Categorial Grammar which exploit the Curry-Howard correspondence to obtain semantic interpretation (van Benthem, 1995), such as Lambek Categorial Grammar and its descendants. A primary goal of this work is to develop a version of LFG in which this connection is made explicit, and in which semantic interpretation falls out as a byproduct of the Curry-Howard correspondence rather than needing to be stipulated via semantic interpretation rules. Once one has enriched LFG's formal machinery with the linear logic mechanisms needed for semantic interpretation, it is natural to ask whether these make any existing components of LFG redundant. As Dalrymple and her colleagues note, LFG's f-structure completeness and coherence constraints fall out as a byproduct of the linear logic machinery they propose for semantic interpretation, thus making those f-structure mechanisms redundant. Given that linear logic machinery or something like it is independently needed for semantic interpretation, it seems reasonable to explore the extent to which it is capable of handling feature structure con...
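How completeness and coherence fall out of resource accounting can be sketched abstractly; the multiset representation below is a toy illustration of the resource-sensitivity idea, not the glue-language machinery itself:

```python
# Toy sketch of linear-logic-style resource accounting: if every lexical
# contribution must be consumed exactly once, completeness (nothing
# missing) and coherence (nothing left over) need no separate stipulation.
from collections import Counter

def saturated(contributed, required):
    """True iff the required grammatical functions consume exactly the
    contributed ones: no deficit and no surplus."""
    return Counter(contributed) == Counter(required)

print(saturated(["SUBJ", "OBJ"], ["SUBJ", "OBJ"]))          # True
print(saturated(["SUBJ"], ["SUBJ", "OBJ"]))                 # False: incomplete
print(saturated(["SUBJ", "OBJ", "OBJ2"], ["SUBJ", "OBJ"]))  # False: incoherent
```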
A Resource-Sensitive Interpretation of Lexical Functional Grammar
 The Journal of Logic, Language and Information
, 1999
Abstract

Cited by 2 (0 self)
This paper investigates whether the fundamental linguistic insights and intuitions of Lexical Functional Grammar (LFG), which is usually presented as a "constraint-based" linguistic theory, can be reformulated in a "resource-sensitive" framework using a substructural modal logic. In the approach investigated here, LFG's f-descriptions are replaced with expressions from a multimodal propositional logic (with permutation and possibly limited contraction). In effect, the feature structure "unification" basis of LFG's f-structures is replaced with a very different resource-based mechanism. It turns out that some linguistic analyses that required nonmonotonic devices in LFG (such as the "constraint equations" in the Andrews (1982) analysis of Icelandic) can be straightforwardly expressed in the framework presented here. Moreover, a Curry-Howard correspondence between proofs in this logic and terms provides a semantic interpretation as a byproduct of the process of showing syntactic we...
Features as Resources in RLFG
 Parallel Processing in Computational Stochastic
, 2002
Abstract
This paper describes a new formalization of Lexical-Functional Grammar called RLFG (where the "R" stands for "Resource-based"). The formal details of RLFG are presented in Johnson (1997); the present work concentrates on motivating RLFG and explaining to linguists how it differs from ...