Results 1-10 of 23
A Descriptive Approach to Language-Theoretic Complexity
, 1996
Cited by 52 (3 self)

Abstract:
Contents: 1. Language Complexity in Generative Grammar (3). Part I: The Descriptive Complexity of Strongly Context-Free Languages (11); 2. Introduction to Part I (13); 3. Trees as Elementary Structures (15); 4. L^2_{K,P} and SnS (25); 5. Definability and Non-Definability in L^2_{K,P} (35); 6. Conclusion of Part I (57). Part II: The Generative Capacity of GB Theories (59); 7. Introduction to Part II (61); 8. The Fundamental Structures of GB Theories (69); 9. GB and Non-definability in L^2_{K,P} (79); 10. Formalizing X-Bar Theory (93); 11. The Lexicon, Subcategorization, Theta-theory, and Case Theory (111); 12. Binding and Control (119); 13. Chains (131); 14. Reconstruction (157); 15. Limitations of the Interpretation (173); 16. Conclusion of Part II (179). A. Index of Definitions (183); Bibliography.
Talking About Trees
 In Proceedings of the 6th Conference of the European Chapter of the Association for Computational Linguistics
, 1993
Cited by 45 (3 self)

Abstract:
In this paper we introduce a modal language L_T for imposing constraints on trees, and an extension L_T(L_r) for imposing constraints on trees decorated with feature structures. The motivation for introducing these languages is to provide tools for formalising grammatical frameworks perspicuously, and the paper illustrates this by showing how the leading ideas of GPSG can be captured in L_T(L_r).
Lexical Rules in the Hierarchical Lexicon
, 1987
Cited by 38 (2 self)

Abstract:
this dissertation. I single out for special thanks first a few of the Ventura Hall crowd, including Mürvet Enç, Nancy Wiegand, Susan Stucky (the other Mennonite formal linguist), and Kathie Carpenter, Suzanne Kemmer and Michael Barlow, with whom I have happily shared every step of the Stanford graduate pilgrimage. Next, I warmly thank Gina Wein for her competent administrative support and for her friendship. Finally, I gratefully acknowledge the strong shaping influences of the members of the Stanford linguistics faculty, who teach and also model a vibrant and professional approach to linguistic research. Representative of these scholars are the three members of my reading committee, whose work and counsel have had a profound effect on my work; I thank Joan Bresnan, Ivan Sag, and my principal advisor, Thomas Wasow, whose patience, cheerful persistence, unstinting support, solid critique, creative ideas, and common sense made the writing of this thesis possible and enjoyable. Every student should have such an advisor.
Linguistic optimization
Cited by 17 (3 self)

Abstract:
Optimality Theory (OT) is a model of language that combines aspects of generative and connectionist linguistics. It is unique in the field in its use of a rank ordering on constraints, which is used to formalize optimization, the choice of the best of a set of potential linguistic forms. We show that phenomena argued to require ranking fall out equally from the form of optimization in OT’s predecessor Harmonic Grammar (HG), which uses numerical weights to encode the relative strength of constraints. We further argue that the known problems for HG can be resolved by adopting assumptions about the nature of constraints that have precedents both in OT and elsewhere in computational and generative linguistics. This leads to a formal proof that if the range of each constraint is a bounded number of violations, HG generates a finite number of languages. This is nontrivial, since the set of possible weights for each constraint is nondenumerably infinite. We also briefly review some advantages of HG.
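The contrast the abstract draws between ranked and weighted constraints can be sketched in a few lines. The constraints, candidates, and weights below are invented for illustration and are not taken from the paper; they only show the two forms of optimization.

```python
# OT picks the candidate whose violation profile is best under a strict
# ranking; HG picks the candidate with the highest harmony, i.e. the
# smallest weighted sum of violations.

def ot_winner(candidates, ranking):
    # ranking: constraints ordered from highest- to lowest-ranked.
    # Lexicographic tuple comparison implements strict domination.
    return min(candidates, key=lambda cand: tuple(cand[c] for c in ranking))

def hg_winner(candidates, weights):
    # Harmony is -(sum of weight * violations); maximizing harmony
    # means minimizing the weighted violation sum.
    return min(candidates, key=lambda cand: sum(weights[c] * cand[c] for c in weights))

# Two candidates evaluated by two constraints, as {constraint: violations}.
candidates = [
    {"name": "cand1", "Faith": 0, "Markedness": 2},
    {"name": "cand2", "Faith": 1, "Markedness": 0},
]
# Under the ranking Faith >> Markedness, cand1 wins despite more total violations.
ranking = ["Faith", "Markedness"]
# With sufficiently separated weights, HG reproduces the same choice.
weights = {"Faith": 10.0, "Markedness": 1.0}
```

With closely spaced weights (e.g. Faith = 1.5, Markedness = 1.0), HG can trade one high-weight violation against several low-weight ones, which is exactly the "ganging-up" behavior that distinguishes weighting from strict ranking.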
Contrasting applications of logic in natural language syntactic description
 Logic, Methodology and Philosophy of Science: Proceedings of the Twelfth International Congress
, 2005
Cited by 10 (1 self)

Abstract:
Formal syntax has hitherto worked mostly with theoretical frameworks that take grammars to be generative, in Emil Post’s sense: they provide recursive enumerations of sets. This work has its origins in Post’s formalization of proof theory. There is an alternative, with roots in the semantic side of logic: model-theoretic syntax (MTS). MTS takes grammars to be sets of statements of which (algebraically idealized) well-formed expressions are models. We clarify the difference between the two kinds of framework and review their separate histories, and then argue that the generative perspective has misled linguists concerning the properties of natural languages. We select two elementary facts about natural language phenomena for discussion: the gradient character of the property of being ungrammatical and the open nature of natural language lexicons. We claim that the MTS perspective on syntactic structure does much better on representing the facts in these two domains. We also examine the arguments linguists give for the infinitude of the class of all expressions in a natural language. These arguments turn out on examination to be either unsound or lacking in empirical content. We claim that infinitude is an unsupportable claim that is also unimportant. What is actually needed is a way of representing the structure of expressions in a natural language without assigning any importance to the notion of a unique set with definite cardinality that contains all and only the expressions in the language. MTS provides that.
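The model-theoretic view described in this abstract, where a grammar is a set of statements and the well-formed expressions are the structures satisfying all of them, can be illustrated schematically. The toy trees and constraints below are invented for illustration and do not come from the paper.

```python
# An MTS-style grammar is a set of constraints (statements); an expression
# is well-formed iff its structure satisfies ("models") every statement.
# Nothing is enumerated or generated.

def has_root_label(label):
    # Statement: the root node carries the given label.
    return lambda tree: tree[0] == label

def branching_bounded(n):
    # Statement: every node has at most n children.
    def check(tree):
        _, *children = tree
        return len(children) <= n and all(check(c) for c in children)
    return check

# A grammar as a set of statements, not a rewriting system.
grammar = [has_root_label("S"), branching_bounded(2)]

def well_formed(tree, grammar):
    return all(constraint(tree) for constraint in grammar)

well_formed(("S", ("NP",), ("VP",)), grammar)   # -> True
well_formed(("NP", ("N",)), grammar)            # -> False (root is not S)
```

Note how the second tree fails by violating one identifiable statement; this locality of constraint violation is one reason MTS is argued to handle gradient ungrammaticality more naturally than an enumeration-based grammar, which can only say a string is not in the set.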
Model Theoretic Syntax
 The Glot International State of the Article Book 1, Studies in Generative Grammar 48, Mouton de Gruyter
, 1998
Cited by 9 (1 self)

Abstract:
this article appeared in Glot, the main issue agitating researchers in model theoretic syntax was the problem of the context-free barrier. We have seen that the hierarchy of logics collapses, when applied to trees, at the border of the tree languages strongly generated by context-free (string) grammars, in the sense that distinctions between the different tree logics reduce to apparently superficial distinctions in how much memory allocation is hidden in the logic. The problem which researchers set themselves was not just breaking the context-free barrier but remaining decidable in the process. This is a very difficult problem, and it must be admitted right off that it is somewhat artificial in that there is no a priori reason to suppose that natural languages can be described in a decidable logic. The arguments on either side are something like the following. First, the rather slight increases in computational complexity required to get the "mildly context sensitive" languages do suggest that this might be possible. The hunch here would be that the qualities that characterize the mildly context sensitive languages (polynomial parsability, constant growth property) as being like the context-free languages are going to turn out to be reflections of decidability. The problems must not be underestimated, however! It is well known that the monadic second-order logic of trees is one of the most powerful decidable logics known. It seems unlikely that any primitive relations can be added to the repertoire of tree description primitives that we have already seen, without making the logic undecidable. Many attempts have been made within logic and all have failed. So it is equally tempting to conjecture that the context-free boundary coincides in some deep sense with the bounda...
Phrase Structure Trees Bear More Fruit than You Would Have Thought
, 1982
Cited by 8 (0 self)

Abstract:
this paper we will present several results concerning phrase structure trees. These results show that phrase structure trees, when viewed in certain ways, have much more descriptive power than one would have thought. We have given a brief account of local constraints on structural descriptions and an intuitive proof of a theorem about local constraints. We have compared the local constraints approach to some aspects of Gazdar's framework and that of Peters and Ritchie and of Karttunen. We have also presented some results on skeletons (phrase structure trees without labels) which show that phrase structure trees, even when deprived of the labels, retain in a certain sense all the structural information.
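The notion of a skeleton, a phrase structure tree stripped of its node labels, can be made concrete with a small sketch. The tuple representation of trees below is an assumption chosen for illustration, not the paper's formalism.

```python
# A tree is represented as (label, child, child, ...); a leaf is (label,).
# The skeleton keeps only the branching structure, discarding node labels.

def skeleton(tree):
    label, *children = tree
    # Replace every label with a fixed dummy symbol, keeping the shape.
    return ("*",) + tuple(skeleton(child) for child in children)

t = ("S", ("NP", ("N",)), ("VP", ("V",), ("NP", ("N",))))
skeleton(t)
# -> ('*', ('*', ('*',)), ('*', ('*',), ('*', ('*',))))
```

The paper's claim is then that, in a suitable sense, the set of skeletons of a tree language already determines the structural information that the labeled trees carry.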
Mathematical linguistics
, 2007
Cited by 6 (3 self)

Abstract:
but in fact this is still an early draft, version 0.56, August 1 2001. Please do ...
Relationship Between Strong And Weak Generative Power Of Formal Systems
 In Proceedings of the Fifth International Workshop on Tree Adjoining Grammars and Related Formalisms (TAG+5)
, 2000
Cited by 4 (0 self)

Abstract:
The relationship between strong and weak generative powers of formal systems is explored, in particular, from the point of view of squeezing more strong power out of a formal system without increasing its weak generative power. We examine a whole range of old and new results from this perspective. However, the main goal of this paper is to investigate the strong generative power of Lambek categorial grammars in the context of crossing dependencies, in view of the recent work of Tiede (1998).