Results 1–10 of 133
Formal grammar and information theory: Together again?
PHILOSOPHICAL TRANSACTIONS OF THE ROYAL SOCIETY, 2000
Cited by 35 (0 self)
Abstract:
In the last 40 years, research on models of spoken and written language has been split between two seemingly irreconcilable traditions: formal linguistics in the Chomsky tradition, and information theory in the Shannon tradition. Zellig Harris had advocated a close alliance between grammatical and information-theoretic principles in the analysis of natural language, and early formal-language theory provided another strong link between information theory and linguistics. Nevertheless, in most research on language and computation, grammatical and information-theoretic approaches had moved far apart. Today, after many years on the defensive, the information-theoretic approach has gained new strength and achieved practical successes in speech recognition, information retrieval, and, increasingly, in language analysis and machine translation. The exponential increase in the speed and storage capacity of computers is the proximate cause of these engineering successes, allowing the automatic estimation of the parameters of probabilistic models of language by counting occurrences of linguistic events in very large bodies of text and speech. However, I will argue that information-theoretic and computational ideas are also playing an increasing role in the scientific understanding of language, and will help bring together formal-linguistic and information-theoretic perspectives.
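The estimation idea this abstract describes, fitting a probabilistic model by counting occurrences of linguistic events in text, can be sketched as a maximum-likelihood bigram model. This is a minimal illustration, not code from the paper; the toy corpus and function name are invented.

```python
from collections import Counter

def bigram_mle(corpus):
    """Estimate P(next word | word) by counting bigram occurrences,
    the event-counting estimation described in the abstract."""
    unigrams, bigrams = Counter(), Counter()
    for sentence in corpus:
        tokens = ["<s>"] + sentence.split() + ["</s>"]
        unigrams.update(tokens[:-1])               # count each conditioning context
        bigrams.update(zip(tokens[:-1], tokens[1:]))  # count each adjacent pair
    return {(w1, w2): c / unigrams[w1] for (w1, w2), c in bigrams.items()}

# Toy corpus, invented for illustration.
corpus = ["the dog barks", "the dog sleeps", "a cat sleeps"]
probs = bigram_mle(corpus)
print(probs[("the", "dog")])    # 1.0: "the" is always followed by "dog" here
print(probs[("dog", "barks")])  # 0.5: "dog" is followed by "barks" half the time
```

With a large enough corpus, exactly this kind of relative-frequency counting yields the model parameters; the engineering successes mentioned above come from doing it at scale.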
Remnant Movement and Complexity
1998
Cited by 33 (8 self)
Abstract:
this paper both Lex and F are finite. Given a grammar that is formalized in this way, we will consider: (Keenan and Stabler (1996) define languages this way in order to consider, for example, what relations are preserved by automorphisms of L that fix F, just as in semantics we can consider the similar question: what is preserved by automorphisms of E that fix E, (2, #); but in this paper, languages are defined as closures just for simplicity.)

- What is the expressive power of grammars that derive the verbal complexes? (what sets are definable)
- What is the structural complexity of the verbal complexes in these grammars?
- Since structure building is driven by lexical features, is there a useful representation of derivations as graphs of feature checking relations?

Expressive power is familiar but expression complexity is perhaps not so familiar, so we quickly review the ideas we will use. What is the size of a sentence like the following? "every student criticized some teacher". When comparing sentences, one possible measure is simply the count of the characters, the length of the sequence. In this case, we have 37 characters (counting the spaces between words). To get a slightly more universal measure, it is common to consider how many binary choices are required to specify each element of the sequence. In the ASCII coding scheme, each character is specified by 7 binary choices, 7 bits. So in the ASCII coding scheme, the sequence is represented with 259 bits. When we have a grammar that includes the sentence, we make available another way of specifying the sentence. The sentence can be specified by specifying its shortest derivation. Let's see how this works. Suppose we have the following grammar G = ⟨Lex, F⟩, with 8 lexical items consisting of strin...
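The character-counting arithmetic in this abstract can be checked directly: 37 characters at 7 bits each is 259 bits. The sketch below computes only the plain ASCII measure discussed in the text, not the grammar-based description length; the function name is invented.

```python
def ascii_bits(sentence: str) -> int:
    """Description length of a sentence under the plain ASCII coding
    scheme discussed in the abstract: 7 binary choices per character."""
    return 7 * len(sentence)

s = "every student criticized some teacher"
print(len(s))         # 37 characters, counting the spaces between words
print(ascii_bits(s))  # 259 bits
```

A grammar that includes the sentence offers a second, usually much shorter, code: specify the sentence by its shortest derivation rather than character by character.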
Recognizing head movement
In de Groote et al.
Cited by 23 (5 self)
Abstract:
Abstract. Previous studies have provided logical representations and efficient recognition algorithms for a simple kind of "minimalist grammars." This paper extends these grammars with head movement ("incorporation") and affix hopping. The recognition algorithms are elaborated for these grammars, and logical perspectives are briefly considered. Michaelis (1998) showed how the derivations of a simple kind of minimalist grammar (MG) (Stabler, 1997) correspond to derivations of exactly the same strings in a multiple context-free grammar (MCFG) (Seki et al., 1991). MGs build structures by merging pairs of expressions, and simplifying single expressions by moving a subexpression in them. The basic idea behind the correspondence with MCFGs can be traced back to Pollard (1984), who noticed, in effect, that when a constituent is going to move, we should not regard its yield as included in the yield of the expression that contains it. Instead, the expression from which something will move is better regarded as having multiple yields, multiple components: the "moving" components have not yet reached their final positions.
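The "multiple yields" idea at the end of this abstract, keeping a moving constituent's string in a separate component until it reaches its final position, can be sketched in a few lines. All names here are invented for illustration; this is not the paper's formalism, just the MCFG-style intuition.

```python
def merge_with_mover(head, mover):
    """Combine a head's yield with a phrase that is still due to move:
    the mover's yield stays in its own component rather than being
    concatenated into the head's yield."""
    return (head, mover)  # (main component, still-moving component)

def final_move(components):
    """When the mover reaches its final (here, leftmost) position,
    its component is concatenated in, yielding an ordinary string."""
    main, mover = components
    return f"{mover} {main}".strip()

# "what" is merged low but pronounced high, so its string is carried
# as a separate component until the final step.
expr = merge_with_mover("did you see", "what")
print(final_move(expr))  # what did you see
```

Because each expression is a fixed-length tuple of strings, the grammar's rules operate on tuples exactly as MCFG rules do, which is what makes the correspondence go through.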
Remnant Movement and Structural Complexity
CONSTRAINTS AND RESOURCES IN NATURAL LANGUAGE, STUDIES IN LOGIC, LANGUAGE AND INFORMATION. CSLI, 1998
Cited by 16 (0 self)
Abstract:
In some recent efforts to reduce the theoretical machinery of transformational syntax, all structures have the underlying order "specifier-head-complement", and all movement is leftward, feature-driven, phrasal, and overt. With these developments, the movement of constituents from which material has already been extracted, "remnant movement," is increasingly common. This paper shows that these restricted transformational frameworks remain very expressive in terms of their generative power, and that although the structures they define look complex and require new parsing strategies, their coding complexity is no higher than that of traditional analyses. Furthermore, the derivations of these structures have a simplicity which is revealed by representing them as graphs of matching pairs (feature checking relations), as is done in the "proof nets" of the type-logical tradition.
Observations on Strict Derivational Minimalism
ELECTRONIC NOTES IN THEORETICAL COMPUTER SCIENCE, 2001
Cited by 15 (5 self)
Abstract:
Deviating from the definition originally presented in [12], Stabler [13] introduced, inspired by some recent proposals within the minimalist approach to transformational syntax, a (revised) type of minimalist grammar (MG) as well as a certain type of strict minimalist grammar (SMG). These two types can be shown to determine the same class of derivable string languages.
Minimalist Grammars and Recognition
Cited by 14 (2 self)
Abstract:
Recent work has shown how basic ideas of the minimalist tradition in transformational syntax can be captured in a simple generative formalism, a "derivational minimalism." This framework can model "remnant movement" analyses, which yield more complex antecedent-trace relations, suggesting a new and significant sense in which linguistic structures are "chain based." Michaelis (1998) showed that these grammars correspond to a certain kind of linear context-free rewrite system, and this paper takes the next step of adapting the recognition methods for "non-concatenative" grammars (Weir 1988; Seki et al., 1991; Boullier, 1999). This turns out to be quite straightforward once the grammars are set out appropriately.
S.: An automata theoretic approach to minimalism
 Proceedings of the Workshop Model-Theoretic Syntax at 10; ESSLLI, 2007
Cited by 13 (7 self)
Abstract:
Vijay-Shanker et al. (1987) note that many interesting linguistic formalisms can be thought of as having essentially context-free structure, but operating over objects richer than simple strings (sequences of strings, trees, or graphs). They introduce linear context-free rewriting systems (LCFRSs, see also Weir (1988)) as a unifying framework for superficially different such formalisms (like (multi-component) tree adjoining grammars, head grammars, and categorial grammars). Later work (Michaelis, 1998) has added minimalist grammars (MGs, see (Stabler, 1997)) to this list. Recently, Fülöp et al. (2004) have introduced multiple bottom-up tree transducers (mbutt), which can be thought of as offering a transductive perspective on LCFRSs. The transductive ...
Learning mirror theory
2002
Cited by 13 (10 self)
Abstract:
Mirror Theory is a syntactic framework developed in (Brody, 1997), where it is offered as a consequence of eliminating purported redundancies in Chomsky's minimalism (Chomsky, 1995). A fundamental feature of Mirror Theory is its requirement that the syntactic head-complement relation mirror certain morphological relations (such as constituency). This requirement constrains the types of syntactic structures that can express a given phrase; the morphological constituency of the phrase determines part of the syntactic constituency, thereby ruling out other, weakly equivalent, alternatives. A less fundamental, but superficially very noticeable, feature is the elimination of phrasal projection. Thus the X-bar structure on the left becomes the mirror-theoretic structure on the right:
On Minimalist Attribute Grammars and Macro Tree Transducers
 Linguistic Form and its Computation
Cited by 12 (4 self)
Abstract:
In this paper we extend the work by Michaelis (1999), which shows how to encode an arbitrary Minimalist Grammar in the sense of Stabler (1997) into a weakly equivalent multiple context-free grammar (MCFG). By viewing MCFG rules as terms in a free Lawvere theory we can translate a given MCFG into a regular tree grammar. The latter is characterizable by both a tree automaton and a corresponding formula in monadic second-order (MSO) logic. The trees of the resulting regular tree language are then unpacked into the intended "linguistic" trees both through an MSO transduction based upon tree-walking automata and through a macro tree transduction. This two-step approach gives an operational as well as a logical description of the tree sets involved. As an interlude we show that MCFGs can be regarded as particularly simple attribute grammars. 1 Introduction. Algebraic, logical and regular characterizations of (tree) languages provide a natural framework for the denotational and opera...
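The "characterizable by a tree automaton" step in this abstract can be illustrated with a tiny deterministic bottom-up tree automaton. This is a generic sketch of the device, not the paper's construction; the ranked symbols, state name, and transition table are invented.

```python
def run(delta, tree):
    """Evaluate a deterministic bottom-up tree automaton on a tree
    given as (label, [children]). Returns the state assigned at the
    root, or None if some transition is missing (tree rejected)."""
    label, children = tree
    states = tuple(run(delta, c) for c in children)
    return delta.get((label, states))

# Invented ranked alphabet: 'lex' (rank 0) and 'merge' (rank 2), with a
# single state 'q' marking well-formed binary derivation shapes.
delta = {
    ("lex", ()): "q",
    ("merge", ("q", "q")): "q",
}

good = ("merge", [("lex", []), ("merge", [("lex", []), ("lex", [])])])
bad = ("merge", [("lex", [])])  # 'merge' with only one child

print(run(delta, good))  # q: accepted
print(run(delta, bad))   # None: no matching transition, rejected
```

A regular tree grammar and a bottom-up tree automaton of this kind define the same tree languages, which is why the encoded MCFG derivations admit both the automaton-based and the MSO-logic descriptions the abstract mentions.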