Results 1 – 5 of 5
Structural Equations in Language Learning
 Proceedings LACL 2001, Springer Lecture Notes in Artificial Intelligence 2099
, 2001
Abstract

Cited by 6 (2 self)
In categorial systems with a fixed structural component, the learning problem comes down to finding the solution for a set of type-assignment equations. A hardwired structural component is problematic if one wants to address issues of structural variation. Our starting point is a type-logical architecture with separate modules for the logical and the structural components of the computational system. The logical component expresses invariants of grammatical composition; the structural component captures variation in the realization of the correspondence between form and meaning. Learning in this setting involves finding the solution to both the type-assignment equations and the structural equations of the language at hand. We develop a view on these two subtasks which pictures learning as a process moving through a two-stage cycle.
Resource logics and minimalist grammars
 Proceedings ESSLLI'99 workshop (Special issue of Language and Computation)
, 2002
Abstract

Cited by 4 (0 self)
This ESSLLI workshop is devoted to connecting the linguistic use of resource logics and categorial grammar to minimalist grammars and related generative grammars. Minimalist grammars are relatively recent, and although they stem from a long tradition of work in transformational grammar, they are largely informal apart from a few research papers. The study of resource logics, on the other hand, is formal and stems naturally from a long logical tradition. So although there appear to be promising connections between these traditions, there is at this point a rather thin intersection between them. The papers in this workshop are consequently rather diverse, some addressing general similarities between the two traditions, and others concentrating on a thorough study of a particular point. Nevertheless they succeed in convincing us of the continuing interest of studying and developing the relationship between the minimalist program and resource logics. This introduction reviews some of the basic issues and prior literature.
Semantic Bootstrapping of Type-Logical Grammar
 Journal of Logic, Language and Information
, 2002
Abstract

Cited by 4 (1 self)
A procedure is described which induces type-logical grammar lexicons from sentences annotated with skeletal terms of the simply typed lambda calculus. A generalized formulae-as-types correspondence is exploited to obtain all the type-logical proofs of the sample sentences from their lambda terms, and the resulting lexicons are then optimally unified, which effectively unifies the syntactic categories of words which have the same syntactic behavior evident in the induced structures. This effort extends the earlier induction of such lexicons for classical categorial grammar (Buszkowski and Penn, 1990) first to the non-associative Lambek calculus, and then to a large class of type logics enriched by modal operators and structural rules. The motivation for this approach is linguistic: we have implemented a theoretically operational procedure for semantic bootstrapping of natural language syntax, which is the first one in any setting with sufficient scope to meet the demands of descriptively adequate natural language grammars. One of the main points of the enterprise is that the syntactic and semantic categories operating in the language are learned, in direct opposition to more familiar grammar induction procedures which begin with a fixed set of categories and frequently part-of-speech tagged data as well. This general approach could be used by linguists to learn something about lexical categories and to develop linguistically insightful complete grammars for large fragments.
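The unification step the abstract mentions can be illustrated with a minimal sketch. This is not the paper's implementation, only a toy first-order unifier over categorial-grammar types in the spirit of Buszkowski and Penn (1990); the encoding of categories as tuples, the "?"-prefixed variable names, and the omission of an occurs check are all simplifying assumptions made here.

```python
# Toy unification of categorial-grammar types (illustrative sketch only).
# Atoms are plain strings ("s", "np"); variables are strings starting
# with "?"; functor categories A/B and A\B are tuples (slash, result, arg).

def is_var(t):
    return isinstance(t, str) and t.startswith("?")

def walk(t, subst):
    # Follow variable bindings to the current representative.
    while is_var(t) and t in subst:
        t = subst[t]
    return t

def unify(a, b, subst):
    """Return a substitution extending `subst` that unifies a and b, or None.
    No occurs check, for brevity."""
    a, b = walk(a, subst), walk(b, subst)
    if a == b:
        return subst
    if is_var(a):
        return {**subst, a: b}
    if is_var(b):
        return {**subst, b: a}
    if isinstance(a, tuple) and isinstance(b, tuple) and a[0] == b[0]:
        s = unify(a[1], b[1], subst)
        if s is not None:
            return unify(a[2], b[2], s)
    return None

# Two hypothesised categories for the same word, merged by unification:
cat1 = ("/", "s", "?1")    # s / ?1
cat2 = ("/", "?2", "np")   # ?2 / np
print(unify(cat1, cat2, {}))  # {'?2': 's', '?1': 'np'}
```

The optimal unification the paper describes does considerably more than this (it searches for a most general merging of whole lexicons), but the core operation of collapsing categories that behave alike is this kind of type unification.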
Grounding As Learning
, 2003
Abstract

Cited by 4 (1 self)
This paper takes a first step toward bringing the tools of formal language theory to bear on this problem. In the first place, these tools easily reveal a number of grounding problems which are simply unsolvable under reasonable assumptions about the evidence available, and some problems that can be solved. In the second place, these tools provide a framework for exploring more sophisticated grounding strategies (Stabler et al., 2003). We explore here some preliminary ideas about how hypotheses about syntactic structure can interact with hypotheses about grounding in a fruitful way to provide a new perspective on the emergence of recursion in language. Simpler grounding methods look for some kind of correlation between the mere occurrence of particular basic generators and semantic elements, but richer hypotheses about relations among the generators themselves can provide valuable additional constraints on the problem.
Learnability of Type-Logical Grammars
, 2001
Abstract

Cited by 3 (2 self)
A procedure for learning a lexical assignment together with a system of syntactic and semantic categories, given a fixed type-logical grammar, is briefly described. The logic underlying the grammar can be any cut-free, decidable, modally enriched extension of the Lambek calculus, but the correspondence between syntactic and semantic categories must be constrained so that no infinite set of categories is ultimately used to generate the language. It is shown that under these conditions various linguistically valuable subsets of the range of the algorithm are classes identifiable in the limit from data consisting of sentences labeled by simply typed lambda calculus meaning terms in normal form. The entire range of the algorithm is shown not to be a learnable class, contrary to a mistaken result reported in a preliminary version of this paper. It is informally argued that, given the right type logic, the learnable classes of grammars include members which generate natural languages, and thus that natural languages are learnable in this way.
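"Identifiable in the limit" is Gold's criterion: on any enumeration (text) of a target language, the learner's conjectures change only finitely often and then stay correct. The following is a generic toy sketch of that criterion, not the paper's algorithm; the two-grammar class, the language sets, and the subset-ordered conjecture strategy are all illustrative assumptions.

```python
# Toy Gold-style learner over a fixed finite class of "grammars",
# each represented directly by the (finite) language it generates.
GRAMMARS = {
    "G1": {"a", "ab"},          # G1's language is a proper subset of G2's,
    "G2": {"a", "ab", "abb"},   # so G1 is tried first to avoid overgeneralising.
}

def learner(data_so_far):
    """Conjecture the first grammar whose language covers all data seen."""
    for name, lang in GRAMMARS.items():
        if set(data_so_far) <= lang:
            return name
    return None

# On a text for G2, the conjecture changes once and then stabilises:
text = ["a", "ab", "abb", "a", "abb"]
conjectures = [learner(text[: i + 1]) for i in range(len(text))]
print(conjectures)  # ['G1', 'G1', 'G2', 'G2', 'G2']
```

The abstract's point maps onto this picture: the constrained subsets of the algorithm's range behave like the learnable class above, while the unconstrained range admits targets on which no such convergence is guaranteed.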