Results 1 - 10 of 40
Transforming linear context-free rewriting systems into minimalist grammars
"... Abstract. The type of a minimalist grammar (MG) as introduced by Stabler [11, 12] provides an attempt of a rigorous algebraic formalization of the new perspectives adopted within the linguistic framework of transformational grammar due to the change from GB–theory to minimalism. Michaelis [6] has sh ..."
Cited by 25 (6 self)
Abstract. The type of a minimalist grammar (MG) introduced by Stabler [11, 12] is an attempt at a rigorous algebraic formalization of the new perspectives adopted within the linguistic framework of transformational grammar due to the change from GB theory to minimalism. Michaelis [6] has shown that MGs constitute a subclass of mildly context-sensitive grammars in the sense that for each MG there is a weakly equivalent linear context-free rewriting system (LCFRS). However, it was left open in [6] whether the respective classes of string languages derivable by MGs and LCFRSs coincide. This paper completes the picture by showing that MGs in the sense of [11] and LCFRSs in fact determine the same class of derivable string languages.
Probabilistic Models of Word Order and Syntactic Discontinuity
, 2005
"... Copyright by Roger Levy 2005 ii ..."
Observations on Strict Derivational Minimalism
- Electronic Notes in Theoretical Computer Science
, 2001
"... Deviating from the denition originally presented in [12], Stabler [13] introduced inspired by some recent proposals in terms of a minimalist approach to transformational syntaxa (revised) type of a minimalist grammar (MG) as well as a certain type of a strict minimalist grammar (SMG). These two type ..."
Cited by 15 (5 self)
Deviating from the definition originally presented in [12], Stabler [13] introduced, inspired by some recent proposals in terms of a minimalist approach to transformational syntax, a (revised) type of a minimalist grammar (MG) as well as a certain type of a strict minimalist grammar (SMG). These two types can be shown to determine the same class of derivable string languages.
Contrasting applications of logic in natural language syntactic description
- Logic, Methodology and Philosophy of Science: Proceedings of the Twelfth International Congress
, 2005
"... Abstract. Formal syntax has hitherto worked mostly with theoretical frameworks that take grammars to be generative, in Emil Post’s sense: they provide recursive enumerations of sets. This work has its origins in Post’s formalization of proof theory. There is an alternative, with roots in the semanti ..."
Cited by 15 (1 self)
Abstract. Formal syntax has hitherto worked mostly with theoretical frameworks that take grammars to be generative, in Emil Post’s sense: they provide recursive enumerations of sets. This work has its origins in Post’s formalization of proof theory. There is an alternative, with roots in the semantic side of logic: model-theoretic syntax (MTS). MTS takes grammars to be sets of statements of which (algebraically idealized) well-formed expressions are models. We clarify the difference between the two kinds of framework and review their separate histories, and then argue that the generative perspective has misled linguists concerning the properties of natural languages. We select two elementary facts about natural language phenomena for discussion: the gradient character of the property of being ungrammatical and the open nature of natural language lexicons. We claim that the MTS perspective on syntactic structure does much better on representing the facts in these two domains. We also examine the arguments linguists give for the infinitude of the class of all expressions in a natural language. These arguments turn out on examination to be either unsound or lacking in empirical content. We claim that infinitude is an unsupportable claim that is also unimportant. What is actually needed is a way of representing the structure of expressions in a natural language without assigning any importance to the notion of a unique set with definite cardinality that contains all and only the expressions in the language. MTS provides that.
Structural similarity within and among languages
- Theoretical Computer Science
"... Linguists rely on intuitive conceptions of structure when comparing expressions and languages. In an algebraic presentation of a language, some natural notions of sim-ilarity can be rigorously dened (e.g. among elements of a language, equivalence w.r.t. isomorphisms of the language; and among langua ..."
Cited by 14 (1 self)
Linguists rely on intuitive conceptions of structure when comparing expressions and languages. In an algebraic presentation of a language, some natural notions of similarity can be rigorously defined (e.g. among elements of a language, equivalence w.r.t. isomorphisms of the language; and among languages, equivalence w.r.t. isomorphisms of symmetry groups), but it turns out that slightly more complex and non-standard notions are needed to capture the kinds of comparisons linguists want to make. This paper identifies some of the important notions of structural similarity, with attention to similarity claims that are prominent in the current linguistic tradition of transformational grammar.
Learning rigid Lambek grammars and minimalist grammars from structured sentences
- Third workshop on Learning Language in Logic, Strasbourg
, 2001
"... Abstract. We present an extension of Buszkowski’s learning algorithm for categorial grammars to rigid Lambek grammars and then for minimalist categorial grammars. The Kanazawa proof of the convergence in the Gold sense is simplified and extended to these new algorithms. We thus show that this techni ..."
Cited by 10 (1 self)
Abstract. We present an extension of Buszkowski's learning algorithm for categorial grammars, first to rigid Lambek grammars and then to minimalist categorial grammars. Kanazawa's proof of convergence in the Gold sense is simplified and extended to these new algorithms. We thus show that this technique, based on the principal type algorithm and type unification, is quite general and applies to learning issues for different type logical grammars, which are larger, linguistically more accurate, and closer to semantics.
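As a rough illustration of the type-unification step mentioned in this abstract, here is a minimal first-order unifier over categorial types in Python. The encoding (tuples such as ("/", result, argument) for functor types, strings beginning with "?" for type variables) and the function names are assumptions made for this sketch only; they are not the paper's notation or its learning algorithm.

def is_var(t):
    # Type variables are strings starting with "?" (an assumed convention).
    return isinstance(t, str) and t.startswith("?")

def walk(t, subst):
    # Follow variable bindings until an unbound variable or a non-variable term.
    while is_var(t) and t in subst:
        t = subst[t]
    return t

def occurs(v, t, subst):
    # Occurs check: does variable v appear inside term t under subst?
    t = walk(t, subst)
    if t == v:
        return True
    if isinstance(t, tuple):
        return any(occurs(v, s, subst) for s in t)
    return False

def unify(t1, t2, subst=None):
    # Return a most general unifier as a dict, or None on failure.
    subst = dict(subst or {})
    t1, t2 = walk(t1, subst), walk(t2, subst)
    if t1 == t2:
        return subst
    if is_var(t1):
        if occurs(t1, t2, subst):
            return None
        subst[t1] = t2
        return subst
    if is_var(t2):
        return unify(t2, t1, subst)
    if isinstance(t1, tuple) and isinstance(t2, tuple) and len(t1) == len(t2):
        for a, b in zip(t1, t2):
            subst = unify(a, b, subst)
            if subst is None:
                return None
        return subst
    return None

# Example: unify the functor types ?x/np and s/?y.
print(unify(("/", "?x", "np"), ("/", "s", "?y")))  # {'?x': 's', '?y': 'np'}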
A Note on Complexity of Constraint Interaction: Locality Conditions and Minimalist Grammars
, 2005
"... Locality Conditions (LCs) on (unbounded) dependencies have played a major role in the development of generative syntax ever since the seminal work by Ross [22]. Descriptively, they fall into two groups. On the one hand there are intervention-based LCs (ILCs) often formulated as “minimality constra ..."
Cited by 9 (4 self)
Locality Conditions (LCs) on (unbounded) dependencies have played a major role in the development of generative syntax ever since the seminal work by Ross [22]. Descriptively, they fall into two groups. On the one hand there are intervention-based LCs (ILCs), often formulated as “minimality constraints” (“minimal link condition,” “minimize chain links,” “shortest move,” “attract closest,” etc.). On the other hand there are containment-based LCs (CLCs), typically defined in terms of (generalized) grammatical functions (“adjunct island,” “subject island,” “specifier island,” etc.). Research on LCs has been dominated by two very general trends. First, attempts have been made at unifying ILCs and CLCs on the basis of notions such as “government” and “barrier” (e.g. [4]). Secondly, research has often been guided by the intuition that, beyond empirical coverage, LCs somehow contribute to restricting the formal capacity of grammars (cf. [3, p. 125], [6, p. 14f]). Both these issues, we are going to argue, can be fruitfully studied within the framework of minimalist grammars.
Varieties of crossing dependencies: Structure dependence and mild context sensitivity
- Cognitive Science
, 2004
"... Four different kinds of grammars that can define crossing dependencies in human language are compared here: (i) context sensitive rewrite grammars with rules that depend on context; (ii) matching grammars with constraints that filter the generative structure of the language, (iii) copying grammars w ..."
Cited by 7 (2 self)
Four different kinds of grammars that can define crossing dependencies in human language are compared here: (i) context sensitive rewrite grammars with rules that depend on context; (ii) matching grammars with constraints that filter the generative structure of the language; (iii) copying grammars, which can copy structures of unbounded size; and (iv) generating grammars in which crossing dependencies are generated from a finite lexical basis. Context sensitive rewrite grammars are syntactically, semantically and computationally unattractive. Generating grammars have a collection of nice properties that ensure they define only “mildly context sensitive” languages, and Joshi has proposed that human languages have those properties too. But for certain distinctive kinds of crossing dependencies in human languages, copying or matching analyses predominate. Some results relevant to the viability of mildly context sensitive analyses and some open questions are reviewed.
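Not from the paper, but as a small formal illustration of what a crossing-dependency pattern is: the copy language { ww : w over {a, b}, w nonempty } is a standard stand-in for cross-serial dependencies, since the i-th symbol of the first half must agree with the i-th symbol of the second half, so the dependencies cross rather than nest.

def is_copy(s: str) -> bool:
    # True iff s is some nonempty string w repeated twice (s = w + w).
    n = len(s)
    return n >= 2 and n % 2 == 0 and s[: n // 2] == s[n // 2:]

print(is_copy("abbabb"))  # True:  w = "abb"; the a-a, b-b, b-b links cross
print(is_copy("abbbba"))  # False: nested (mirror-image) dependencies, w + reversed(w)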
A prefix-correct Earley recognizer for multiple context-free grammars
- in "9th International Workshop on Tree Adjoining Grammars and Related Formalisms (TAG+9
, 2008
"... We present a method for deriving an Earley recognizer for multiple context-free grammars with the correct prefix property. This is done by representing an MCFG by a Datalog program and applying generalized supplementary magic-sets rewriting. To secure the correct prefix property, a simple extra rewr ..."
Cited by 6 (1 self)
We present a method for deriving an Earley recognizer for multiple context-free grammars with the correct prefix property. This is done by representing an MCFG by a Datalog program and applying generalized supplementary magic-sets rewriting. To secure the correct prefix property, a simple extra rewriting must be performed before the magic-sets rewriting. The correctness of the method is easy to see, and a straightforward application of the method to tree-adjoining grammars yields a recognizer whose running time is O(n^6).
1 Deriving an Earley-style recognizer by magic-sets rewriting
We use the following 2-MCFG generating RESP+ = { a1^m a2^m b1^n b2^n a3^m a4^m b3^n b4^n | m, n ≥ 1 } as our running example:
(1) S(x1 y1 x2 y2) :- P(x1, x2), Q(y1, y2).
    P(a1 a2, a3 a4).
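A minimal sketch, in Python, of the MCFG-as-Datalog idea: the clauses above are encoded as data and evaluated bottom-up to a fixpoint, pruned to facts whose components are substrings of the input. The P and Q clauses beyond the two shown in the excerpt are filled in here in one standard way to generate RESP+ (an assumption, since the excerpt is truncated), and this naive evaluation is not the paper's magic-sets/Earley construction; it only illustrates the encoding.

from itertools import product

# Clause format: (head_predicate, head_args, body_predicates)
#   each head argument is a tuple of items:
#     ("t", terminal)                a terminal symbol
#     ("v", body_index, arg_index)   an argument of the body_index-th body atom
GRAMMAR = [
    # S(x1 y1 x2 y2) :- P(x1, x2), Q(y1, y2).
    ("S", ((("v", 0, 0), ("v", 1, 0), ("v", 0, 1), ("v", 1, 1)),), ("P", "Q")),
    # P(a1 a2, a3 a4).
    ("P", ((("t", "a1"), ("t", "a2")), (("t", "a3"), ("t", "a4"))), ()),
    # P(a1 x1 a2, a3 x2 a4) :- P(x1, x2).      (assumed completion)
    ("P", ((("t", "a1"), ("v", 0, 0), ("t", "a2")),
           (("t", "a3"), ("v", 0, 1), ("t", "a4"))), ("P",)),
    # Q(b1 b2, b3 b4).                          (assumed completion)
    ("Q", ((("t", "b1"), ("t", "b2")), (("t", "b3"), ("t", "b4"))), ()),
    # Q(b1 y1 b2, b3 y2 b4) :- Q(y1, y2).       (assumed completion)
    ("Q", ((("t", "b1"), ("v", 0, 0), ("t", "b2")),
           (("t", "b3"), ("v", 0, 1), ("t", "b4"))), ("Q",)),
]

def recognize(tokens):
    """Naive bottom-up (fixpoint) evaluation, keeping only facts whose components
    are contiguous substrings of the input; this is sound for non-deleting MCFGs
    and keeps the search finite."""
    tokens = tuple(tokens)
    subs = {tokens[i:j] for i in range(len(tokens))
            for j in range(i, len(tokens) + 1)}
    facts = set()            # derived atoms: (predicate, (component, ...))
    changed = True
    while changed:
        changed = False
        for pred, head_args, body in GRAMMAR:
            # pick one previously derived fact for every body atom
            choices = [[f[1] for f in facts if f[0] == p] for p in body]
            for chosen in product(*choices):
                comps = []
                for arg in head_args:
                    piece = ()
                    for item in arg:
                        piece += (item[1],) if item[0] == "t" else chosen[item[1]][item[2]]
                    comps.append(piece)
                if all(c in subs for c in comps):
                    fact = (pred, tuple(comps))
                    if fact not in facts:
                        facts.add(fact)
                        changed = True
    return ("S", (tokens,)) in facts

print(recognize("a1 a2 b1 b1 b2 b2 a3 a4 b3 b3 b4 b4".split()))  # True  (m = 1, n = 2)
print(recognize("a1 a2 b1 b2 b2 a3 a4 b3 b3 b4 b4".split()))     # False (unbalanced b1/b2)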
The acceptability cline in VP ellipsis
, 2010
"... This paper lays the foundations for a processing model of relative acceptability levels in verb phrase ellipsis (VPE). In the proposed model, mismatching VPE examples are grammatical but less acceptable because they violate heuristic parsing strategies. This analysis is presented in a Minimalist G ..."
Cited by 6 (3 self)
This paper lays the foundations for a processing model of relative acceptability levels in verb phrase ellipsis (VPE). In the proposed model, mismatching VPE examples are grammatical but less acceptable because they violate heuristic parsing strategies. This analysis is presented in a Minimalist Grammar formalism that is compatible with standard parsing techniques. The overall proposal integrates computational assumptions about parsing with a psycholinguistic linking hypothesis. These parts work together with the syntactic analysis to derive novel predictions that are confirmed in a controlled experiment.