Results 1 – 8 of 8
A Descriptive Approach to Language-Theoretic Complexity
, 1996
Abstract

Cited by 58 (3 self)
Contents
1 Language Complexity in Generative Grammar
Part I: The Descriptive Complexity of Strongly Context-Free Languages
2 Introduction to Part I
3 Trees as Elementary Structures
4 L^2_{K,P} and SnS
5 Definability and Non-Definability in L^2_{K,P}
6 Conclusion of Part I
Part II: The Generative Capacity of GB Theories
7 Introduction to Part II
8 The Fundamental Structures of GB Theories
9 GB and Non-Definability in L^2_{K,P}
10 Formalizing X-Bar Theory
11 The Lexicon, Subcategorization, Theta-Theory, and Case Theory
12 Binding and Control
13 Chains
14 Reconstruction
15 Limitations of the Interpretation
16 Conclusion of Part II
A Index of Definitions
Bibliography
A Computational Analysis of Information Structure Using Parallel Expository Texts in English and Japanese
, 1999
Abstract

Cited by 5 (3 self)
I would like to thank my advisor, Mark Steedman, for his valuable advice, warm encouragement, and unfailing patience throughout my years at Penn. His extremely broad interests and deep insight have always guided my work on this thesis, especially at difficult times. Among many things I learned from Mark, I particularly appreciate his points that I needed to deliver results visibly and demonstrate the generality of an idea. I am deeply grateful to my thesis committee members. Claire Gardent made critical comments that were difficult to respond to but gave me an opportunity to look at the thesis from a different point of view. Aravind Joshi inspired me on a wide range of issues, from formal grammars to discourse analysis. Martha Palmer read a draft of this thesis with great care and made numerous helpful points about the contents. Bonnie Webber gave me an opportunity to assist in her enjoyable AI course for three semesters. Although Ellen Prince is not on the thesis committee, I thank her for serving on the proposal committee and teaching me linguistic pragmatics. The superb academic environment at Penn is one of the factors I cannot forget. I learned a great many basic concepts from the coursework in computer science and linguistics. I thank my
Generative Power of CCGs with Generalized Type-Raised Categories
, 1997
Abstract

Cited by 4 (2 self)
This paper shows that a class of Combinatory Categorial Grammars (CCGs) augmented with a linguistically motivated form of type raising involving variables is weakly equivalent to the standard CCGs not involving variables. The proof is based on the idea that any instance of such a grammar can be simulated by a standard CCG.
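The intuition behind weak equivalence can be illustrated with a toy encoding of CCG categories (this is an illustrative sketch, not the paper's construction, which handles variable categories): a type-raised derivation yields the same result category, and hence the same string, as the plain applicative one. The category names and tuple encoding below are assumptions made for the example.

```python
def forward_apply(fn_cat, arg_cat):
    """X/Y  Y  =>  X  (forward application)."""
    result, slash, argument = fn_cat
    assert slash == '/' and argument == arg_cat
    return result

def backward_apply(arg_cat, fn_cat):
    """Y  X\\Y  =>  X  (backward application)."""
    result, slash, argument = fn_cat
    assert slash == '\\' and argument == arg_cat
    return result

def type_raise(cat, target):
    """X  =>  T/(T\\X)  (forward type-raising into target T)."""
    return (target, '/', (target, '\\', cat))

# Plain derivation of "John sleeps":  NP  S\NP  =>  S
plain = backward_apply('NP', ('S', '\\', 'NP'))

# Type-raised derivation:  S/(S\NP)  S\NP  =>  S
raised_subject = type_raise('NP', 'S')
raised = forward_apply(raised_subject, ('S', '\\', 'NP'))

assert plain == raised == 'S'
```

Both derivations assign the same sentence the same result category, which is the sense in which adding type raising need not change the set of strings generated.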
Bayesian Learning in Nonlinear State-Space Models
Abstract

Cited by 1 (0 self)
We describe Bayesian learning in nonlinear state-space models (NSSMs). NSSMs are a general method for the probabilistic modelling of sequences and time-series. They take the form of iterated maps on continuous state-spaces, and can have either discrete- or continuous-valued output functions. They are generalizations of better-known state-space models such as Hidden Markov models (HMMs) and Linear-Gaussian state-space models (LSSMs). In this paper, we describe the problems of Bayesian learning and inference in NSSMs. We present an MCMC method for sampling from the posterior of the parameters given observed data.
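To give the flavour of such posterior sampling, here is a deliberately simplified sketch (an assumption-laden toy, not the paper's algorithm): the latent states are treated as fully observed, so the likelihood of a single transition parameter `a` is tractable, and a random-walk Metropolis chain samples its posterior. The model `x_t = tanh(a * x_{t-1}) + noise`, the parameter values, and the chain settings are all illustrative choices.

```python
import math
import random

random.seed(0)

# Simulate a toy nonlinear state-space model's state trajectory.
A_TRUE, SIGMA, T = 1.5, 0.1, 200
xs = [0.5]
for _ in range(T):
    xs.append(math.tanh(A_TRUE * xs[-1]) + random.gauss(0.0, SIGMA))

def log_post(a):
    # Flat prior on a; Gaussian transition density, states observed.
    lp = 0.0
    for x_prev, x in zip(xs, xs[1:]):
        lp += -0.5 * ((x - math.tanh(a * x_prev)) / SIGMA) ** 2
    return lp

# Random-walk Metropolis over the single parameter a.
a = 1.0
lp = log_post(a)
samples = []
for step in range(5000):
    prop = a + random.gauss(0.0, 0.05)
    lp_prop = log_post(prop)
    if math.log(random.random()) < lp_prop - lp:
        a, lp = prop, lp_prop
    if step >= 1000:          # discard burn-in
        samples.append(a)

post_mean = sum(samples) / len(samples)
```

With the states observed the posterior concentrates near the generating value; the full NSSM problem is harder precisely because the states must be integrated out as well.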
Language Learning and Nonlinear Dynamical Systems
, 2003
Abstract

Cited by 1 (0 self)
stems can be mapped directly to any member of a large class of low-dimensional chaotic dynamical systems. The importance of this is that, for a given chaotic dynamical system to be a model of a given language, we may set up a target probability distribution over its state-space, such that if it visits its state-space according to this distribution it will generate the desired language. A specific dynamical system, known as the 2-D tent map, is chosen as a model for learning. We derive a learning algorithm for this particular dynamical system, based on a matrix approximation of the Frobenius-Perron operator. Examples of learning regular, context-free and context-sensitive languages are provided. BIOGRAPHICAL SKETCH Mark Andrews was born in Ireland in 1973. In 1995, he graduated with a BA from the National University of Ireland. The same year, he began graduate study at Cornell University in Ithaca, New York. In 1998, he received an M.Sc. from Cornell. In
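The matrix approximation of the Frobenius-Perron operator can be illustrated with Ulam's method in one dimension (the thesis uses a 2-D tent map; a 1-D tent map keeps the sketch short, and everything here is an illustrative assumption rather than the thesis's actual algorithm). The tent map's invariant density is uniform on [0, 1], and the stationary vector of the discretized operator recovers it.

```python
N = 50                          # number of partition cells on [0, 1]

def tent(x):
    """The 1-D tent map on [0, 1]."""
    return 2 * x if x < 0.5 else 2 * (1 - x)

# Ulam's method: P[i][j] approximates the probability that a point
# in cell i maps into cell j, estimated from sample points per cell.
samples_per_cell = 200
P = [[0.0] * N for _ in range(N)]
for i in range(N):
    for s in range(samples_per_cell):
        x = (i + (s + 0.5) / samples_per_cell) / N
        j = min(int(tent(x) * N), N - 1)
        P[i][j] += 1.0 / samples_per_cell

# Power-iterate a density vector toward the operator's fixed point.
rho = [1.0 / N] * N
for _ in range(100):
    rho = [sum(rho[i] * P[i][j] for i in range(N)) for j in range(N)]

# rho should stay (approximately) uniform: each entry near 1/N.
```

The same idea scales to the 2-D case by partitioning the unit square instead of the unit interval.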
Stochastic Context-Free Grammars
, 2004
Abstract
Introduction: Strings, Grammars & Formal Languages. Stochastic context-free grammars, or SCFGs, are generative systems of stochastic languages. In other words, an SCFG specifies a probability distribution over the set of all possible strings that are concatenated from a finite alphabet Σ.
Definition 1 An alphabet is a finite set of symbols. It is denoted here by Σ = {σ1, . . . , σV}.
Definition 2 The kth power of an alphabet Σ, denoted Σ^k, is the set of strings of length k made from the elements of Σ.
Example 1 For the alphabet Σ = {0, 1}, where V = 2, Σ^1 will be the set of strings, made from the elements of Σ, that are of length 1, i.e. {0, 1}. Likewise, Σ^2 = {00, 01, 10, 11}, Σ^3 = {000, 001, 010, 011, 100, 101, 110, 111}, etc.
Definition 3 Σ*, or the Kleene closure of Σ, is the union of all powers of the alphabet: Σ* = Σ^0 ∪ Σ^1 ∪ Σ^2 ∪ . . . .
Definition 4 A formal language L is defined as L ⊆ Σ*. In other words, L is a (possibly infinite) set of strings, each formed by concatenation from a finite set of symbols.
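These definitions are easy to make concrete in code. The sketch below (function names are illustrative, not from the abstract) enumerates the powers of a binary alphabet and a finite prefix of its Kleene closure, and picks out a language as a subset of Σ*.

```python
from itertools import product

# Sigma = {'0', '1'}, so V = 2.
SIGMA = ['0', '1']

def power(alphabet, k):
    """Sigma^k: all strings of length k over the alphabet."""
    return [''.join(p) for p in product(alphabet, repeat=k)]

def kleene_prefix(alphabet, max_len):
    """A finite prefix of Sigma^*: the union of Sigma^0 .. Sigma^max_len."""
    out = []
    for k in range(max_len + 1):
        out.extend(power(alphabet, k))
    return out

assert power(SIGMA, 1) == ['0', '1']
assert power(SIGMA, 2) == ['00', '01', '10', '11']
assert len(power(SIGMA, 3)) == 2 ** 3          # |Sigma^k| = V^k

# A formal language is any subset of Sigma^*, e.g. the strings with
# equal numbers of 0s and 1s (an illustrative choice of L):
L = [w for w in kleene_prefix(SIGMA, 4) if w.count('0') == w.count('1')]
```

Note that Σ^0 contains only the empty string, so the empty string is a member of Σ* and may or may not belong to a particular language L.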