Results 1-10 of 14
The Evolution of Model-Theoretic Frameworks in Linguistics
Abstract

Cited by 3 (1 self)
The varieties of mathematical basis for formalizing linguistic theories are more diverse than is commonly realized. For example, the later work of Zellig Harris might well suggest a formalization in terms of CATE
Elementary Principles of HPSG
, 2001
Abstract

Cited by 2 (0 self)
This chapter describes the theoretical foundations and descriptive mechanisms of Head-Driven Phrase Structure Grammar (HPSG), as well as proposed treatments for a number of familiar grammatical phenomena. The anticipated reader has some familiarity with syntactic phenomena and the function of a theory of syntax, but not necessarily any expertise with modern theories of phrase-structure grammar. The goal of this chapter is not so much to provide a tutorial in some consistent (and inevitably dated) version of HPSG as it is to explicate the philosophy and techniques of HPSG grammars. This is definitely not "HPSG for Dummies" (which is how I thought of it before I read a "Dummies" book). "Dummies" explanations have their place (I couldn't have learned how to use Adobe PhotoShop without one), but the purpose of this chapter is to explain the foundations and techniques of HPSG accounts of grammatical phenomena so that readers can access the technical literature, rather than to
A Formal Proof of Strong Equivalence for a Grammar Conversion from LTAG to HPSG-style
In Proceedings of the Sixth International Workshop on Tree Adjoining Grammars and Related Frameworks (TAG+6)
, 2002
Abstract

Cited by 1 (0 self)
This paper presents a sketch of a formal proof of strong equivalence, where both grammars generate equivalent parse results, between any LTAG (Lexicalized Tree Adjoining Grammar: Schabes, Abeille and Joshi (1988)) G and an HPSG (Head-Driven Phrase Structure Grammar: Pollard and Sag (1994))-style grammar converted from G by a grammar conversion (Yoshinaga and Miyao, 2001). Our proof theoretically justifies some applications of the grammar conversion that exploit the nature of strong equivalence (Yoshinaga et al., 2001b; Yoshinaga et al., 2001a), applications which contribute much to the development of the two formalisms.
2002. Comparative economy conditions in natural language syntax. Paper presented at the North
 Stanford University
To appear in Language
Abstract
In early transformational grammar, lexical items were essentially lifeless, pushed around by powerful and diverse phrase structure rules and transformations. The balance of power began to shift, though, in the late 1970s. The phrase structure rules became more generalized (Jackendoff 1977; Gazdar et al. 1985; Kaplan &
Chapter 3: Applications of Modal Logic in Model Theoretic Syntax
Abstract
Since extending the logic of strings to capture more complex string languages than the regular languages often leads to undecidability (see e.g. Lautemann et al. (1995)), one approach to extending the coverage of logic is to describe more complex structures: move from strings to trees. Thus, the Kripke structures we will be considering are trees, and the logics will contain more complicated modalities to describe trees. One immediate advantage of this approach for linguistic purposes is that these logics are automatically connected to strong generative capacity, since they describe sets of trees. One disadvantage is that the recognition or parsing problem, which in the string case amounts to model checking, now involves satisfiability checking (see below).

The extension of the descriptive approach to trees was originally also motivated by decidability questions (Thatcher and Wright 1968). Even though the connections to CFLs were pointed out by Thatcher (1967), this line of research did not find applications in linguistics until the development of constraint-based grammar formalisms, which replaced the derivational approach to natural language syntax. The work of Rogers (1998), Kracht (2003), and others provided formal models for these constraint-based grammar formalisms and established formal language theoretic results for them at the same time.

As mentioned above, our Kripke structures will now be trees. We will use the concept of tree domains (Gorn 1967) to define such Kripke structures. A (finite, binary) tree domain T is a finite subset of {0,1}*, such that for all u, v in {0,1}*:

1. if uv is in T, then u is in T, and
2. if u1 is in T, then u0 is in T.

A string in T describes a path from the root to a node, where 0 means "go left" and 1 means "go right". We identify nodes with the path leading to them; thus, the empty string ε is the root. The first condition says that if there is a path to a node, then there is a path to any node above it (this is called prefix closure). The second condition says that if a node has a right daughter, then it has a left daughter (called left sibling closure).
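The two closure conditions on tree domains are easy to check mechanically. As a minimal illustration (this sketch is not from the chapter; the function name and representation of nodes as Python strings over "0"/"1" are assumptions for exposition), a tree domain can be validated like this:

```python
def is_tree_domain(T):
    """Check whether a finite set of binary strings over {0,1} is a tree domain.

    Nodes are identified with the path leading to them: the empty string
    is the root, "0" means "go left", "1" means "go right".
    """
    for node in T:
        # Prefix closure: every prefix of a path in T must also be in T.
        for i in range(len(node)):
            if node[:i] not in T:
                return False
        # Left sibling closure: if u1 is in T, then u0 must be in T.
        if node.endswith("1") and node[:-1] + "0" not in T:
            return False
    return True

print(is_tree_domain({"", "0", "1"}))   # root with left and right daughters
print(is_tree_domain({"", "1"}))        # violates left sibling closure
print(is_tree_domain({"0"}))            # violates prefix closure (no root)
```

Here {"", "0", "1"} is a well-formed tree domain, while {"", "1"} fails left sibling closure (a right daughter without a left one) and {"0"} fails prefix closure (a node without the root above it).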