Results 1–10 of 72
Remnant Movement and Complexity
, 1998
Abstract

Cited by 28 (7 self)
this paper both Lex and F are finite. Given a grammar that is formalized in this way, we will consider: 2 Keenan and Stabler (1996) define languages this way in order to consider, for example, what relations are preserved by automorphisms of L that fix F, just as in semantics we can consider the similar question: what is preserved by automorphisms of E that fix E, (2, #). But in this paper, languages are defined as closures just for simplicity. What is the expressive power of grammars that derive the verbal complexes? (What sets are definable?) What is the structural complexity of the verbal complexes in these grammars? Since structure building is driven by lexical features, is there a useful representation of derivations as graphs of feature checking relations? Expressive power is familiar but expression complexity is perhaps not so familiar, so we quickly review the ideas we will use. What is the size of a sentence like the following? every student criticized some teacher. When comparing sentences, one possible measure is simply the count of the characters, the length of the sequence. In this case, we have 37 characters (counting the spaces between words). To get a slightly more universal measure, it is common to consider how many binary choices are required to specify each element of the sequence. In the ASCII coding scheme, each character is specified by 7 binary choices, 7 bits. So in the ASCII coding scheme, the sequence is represented with 259 bits. When we have a grammar that includes the sentence, we make available another way of specifying the sentence. The sentence can be specified by specifying its shortest derivation. Let's see how this works. Suppose we have the following grammar G = ⟨Lex, F⟩, with 8 lexical items consisting of strin...
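The character-count arithmetic in the abstract above can be checked with a short script: counting the characters of the example sentence (spaces included) and the 7-bits-per-character ASCII cost, which comes to 37 × 7 = 259 bits. This is only an illustration of the size measure the abstract describes, not code from the paper.

```python
# Two ways of measuring the "size" of a sentence, per the abstract above:
# (1) the raw character count, (2) the bit cost under 7-bit ASCII coding.
sentence = "every student criticized some teacher"

num_chars = len(sentence)   # letters plus the spaces between words
bits_ascii = 7 * num_chars  # ASCII specifies each character with 7 binary choices

print(num_chars)   # 37
print(bits_ascii)  # 259
```

A grammar that generates the sentence offers a third, often much shorter description: the sentence's shortest derivation, which is the comparison the abstract goes on to make.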
Formal grammar and information theory: Together again?
 PHILOSOPHICAL TRANSACTIONS OF THE ROYAL SOCIETY
, 2000
Abstract

Cited by 28 (0 self)
In the last 40 years, research on models of spoken and written language has been split between two seemingly irreconcilable traditions: formal linguistics in the Chomsky tradition, and information theory in the Shannon tradition. Zellig Harris had advocated a close alliance between grammatical and information-theoretic principles in the analysis of natural language, and early formal-language theory provided another strong link between information theory and linguistics. Nevertheless, in most research on language and computation, grammatical and information-theoretic approaches had moved far apart. Today, after many years on the defensive, the information-theoretic approach has gained new strength and achieved practical successes in speech recognition, information retrieval, and, increasingly, in language analysis and machine translation. The exponential increase in the speed and storage capacity of computers is the proximate cause of these engineering successes, allowing the automatic estimation of the parameters of probabilistic models of language by counting occurrences of linguistic events in very large bodies of text and speech. However, I will argue that information-theoretic and computational ideas are also playing an increasing role in the scientific understanding of language, and will help bring together formal-linguistic and information-theoretic perspectives.
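The "estimation by counting occurrences" that the abstract credits for these engineering successes can be sketched in a few lines. The tiny corpus and the bigram model below are illustrative assumptions, not material from the paper; they show the maximum-likelihood idea of estimating P(word | previous word) from raw counts.

```python
# Minimal sketch: estimating the parameters of a bigram language model
# by counting occurrences of word pairs in a body of text.
from collections import Counter

corpus = "the dog barks the dog sleeps the cat sleeps".split()  # toy corpus

unigrams = Counter(corpus)                   # count(w)
bigrams = Counter(zip(corpus, corpus[1:]))   # count(prev, w)

def p(word, prev):
    """Maximum-likelihood estimate P(word | prev) = count(prev, word) / count(prev)."""
    return bigrams[(prev, word)] / unigrams[prev]

print(p("dog", "the"))  # 2 of the 3 occurrences of "the" precede "dog"
```

Real systems refine this with smoothing for unseen events, but the core parameter estimation is exactly this kind of counting over very large corpora.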
Remnant Movement and Structural Complexity
 CONSTRAINTS AND RESOURCES IN NATURAL LANGUAGE, STUDIES IN LOGIC, LANGUAGE AND INFORMATION. CSLI
, 1998
Abstract

Cited by 15 (0 self)
In some recent efforts to reduce the theoretical machinery of transformational syntax, all structures have the underlying order "specifier-head-complement", and all movement is leftward, feature-driven, phrasal, and overt. With these developments, the movement of constituents from which material has already been extracted, "remnant movement," is increasingly common. This paper shows that these restricted transformational frameworks remain very expressive in terms of their generative power, and that although the structures they define look complex and require new parsing strategies, their coding complexity is no higher than that of traditional analyses. Furthermore, the derivations of these structures have a simplicity which is revealed by representing them as graphs of matching pairs (feature checking relations), as is done in the "proof nets" of the type logical tradition.
On Minimalist Attribute Grammars and Macro Tree Transducers
 Linguistic Form and its Computation
Abstract

Cited by 12 (4 self)
In this paper we extend the work by Michaelis (1999) which shows how to encode an arbitrary Minimalist Grammar in the sense of Stabler (1997) into a weakly equivalent multiple context-free grammar (MCFG). By viewing MCFG rules as terms in a free Lawvere theory we can translate a given MCFG into a regular tree grammar. The latter is characterizable by both a tree automaton and a corresponding formula in monadic second-order (MSO) logic. The trees of the resulting regular tree language are then unpacked into the intended "linguistic" trees both through an MSO transduction based upon tree-walking automata and through a macro tree transduction. This two-step approach gives an operational as well as a logical description of the tree sets involved. As an interlude we show that MCFGs can be regarded as a particularly simple attribute grammar. 1 Introduction. Algebraic, logical and regular characterizations of (tree) languages provide a natural framework for the denotational and opera...
Observations on Strict Derivational Minimalism
 ELECTRONIC NOTES IN THEORETICAL COMPUTER SCIENCE
, 2001
Abstract

Cited by 11 (5 self)
Deviating from the definition originally presented in [12], Stabler [13] introduced, inspired by some recent proposals in terms of a minimalist approach to transformational syntax, a (revised) type of minimalist grammar (MG) as well as a certain type of strict minimalist grammar (SMG). These two types can be shown to determine the same class of derivable string languages.
Constants of Grammatical Reasoning
 Constraints and Resources in Natural Language Syntax and Semantics
, 1999
Abstract

Cited by 11 (2 self)
This is a screen version, enhanced with some dynamic features, of the paper that has appeared under the same title in Bouma, Hinrichs, Kruijff & Oehrle (eds.) Constraints and Resources in Natural Language Syntax and Semantics. CSLI, Stanford, 1999. You can use the arrow keys to move through the document; the symbol at the bottom of the screen brings you back from a hyperlink. Contents: 1 Cognition = computation, grammar = logic; 1.1 Grammatical resources; 1.1.1 Composition: the form dimension; 1.1.2 Composition: the meaning dimension; 1.1.3 Lexical versus derivational meaning; 1.2 Grammatical reasoning: logic, structure and control; 2 Patterns for structural variation; 2.1 English relativization: right branch extraction; 2.2 Dutch relativization: left branch extraction; 2.3 Dependency: blocking extraction from subjects; 3 Conclusion. Within current linguistic frameworks a rich variety of principles has been put forward to account for the properties of local and unbounded dependencies. Valency requirements of lexical items are checked by subcategorization principles in HPSG, principles of coherence and completeness in LFG, and the theta criterion in GB. These are supplemented by, and interact with, principles governing nonlocal dependencies: movement and empty category principles, slash featu...
Contrasting applications of logic in natural language syntactic description
 Logic, Methodology and Philosophy of Science: Proceedings of the Twelfth International Congress
, 2005
Abstract

Cited by 10 (1 self)
Formal syntax has hitherto worked mostly with theoretical frameworks that take grammars to be generative, in Emil Post’s sense: they provide recursive enumerations of sets. This work has its origins in Post’s formalization of proof theory. There is an alternative, with roots in the semantic side of logic: model-theoretic syntax (MTS). MTS takes grammars to be sets of statements of which (algebraically idealized) well-formed expressions are models. We clarify the difference between the two kinds of framework and review their separate histories, and then argue that the generative perspective has misled linguists concerning the properties of natural languages. We select two elementary facts about natural language phenomena for discussion: the gradient character of the property of being ungrammatical and the open nature of natural language lexicons. We claim that the MTS perspective on syntactic structure does much better on representing the facts in these two domains. We also examine the arguments linguists give for the infinitude of the class of all expressions in a natural language. These arguments turn out on examination to be either unsound or lacking in empirical content. We claim that infinitude is an unsupportable claim that is also unimportant. What is actually needed is a way of representing the structure of expressions in a natural language without assigning any importance to the notion of a unique set with definite cardinality that contains all and only the expressions in the language. MTS provides that.
Learning rigid lambek grammars and minimalist grammars from structured sentences
 Third workshop on Learning Language in Logic, Strasbourg
, 2001
Abstract

Cited by 10 (1 self)
We present an extension of Buszkowski’s learning algorithm for categorial grammars, first to rigid Lambek grammars and then to minimalist categorial grammars. Kanazawa’s proof of convergence in the Gold sense is simplified and extended to these new algorithms. We thus show that this technique, based on the principal-type algorithm and type unification, is quite general and applies to learning issues for different type-logical grammars, which are larger, linguistically more accurate, and closer to semantics.
Minimalist Grammars and Recognition
Abstract

Cited by 9 (1 self)
Recent work has shown how basic ideas of the minimalist tradition in transformational syntax can be captured in a simple generative formalism, a "derivational minimalism." This framework can model "remnant movement" analyses, which yield more complex antecedent-trace relations, suggesting a new and significant sense in which linguistic structures are "chain based." Michaelis (1998) showed that these grammars correspond to a certain kind of linear context-free rewrite system, and this paper takes the next step of adapting the recognition methods for "non-concatenative" grammars (Weir 1988; Seki et al., 1991; Boullier 1999). This turns out to be quite straightforward once the grammars are set out appropriately.
Learning Mirror Theory
, 2002
Abstract

Cited by 9 (6 self)
In this paper we will be focussing on reduced rigid mirror-theoretic grammars (rrMTGs). This change of perspective serves to simplify discussion, and does not alter the class of languages to be learned.