Results 1–10 of 70
Inducing Probabilistic CCG Grammars from Logical Form with Higher-Order Unification
Abstract

Cited by 89 (17 self)
This paper addresses the problem of learning to map sentences to logical form, given training data consisting of natural language sentences paired with logical representations of their meaning. Previous approaches have been designed for particular natural languages or specific meaning representations; here we present a more general method. The approach induces a probabilistic CCG grammar that represents the meaning of individual words and defines how these meanings can be combined to analyze complete sentences. We use higher-order unification to define a hypothesis space containing all grammars consistent with the training data, and develop an online learning algorithm that efficiently searches this space while simultaneously estimating the parameters of a log-linear parsing model. Experiments demonstrate high accuracy on benchmark data sets in four languages with two different meaning representations.
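As a toy illustration of the combinatory machinery such induction methods search over (this is not the paper's algorithm; the category encoding and the three-word lexicon below are invented), forward and backward application combine CCG categories and their lambda-term meanings in lockstep:

```python
# Toy sketch of CCG-style function application. Complex categories are
# ("result", slash, "argument") triples; semantics are composed alongside.
# All names and the lexicon are invented for illustration.

def forward_apply(left, right):
    """X/Y applied to Y yields X, composing the lambda-term semantics."""
    cat, sem = left
    rcat, rsem = right
    if isinstance(cat, tuple) and cat[1] == "/" and cat[2] == rcat:
        return (cat[0], sem(rsem))
    return None

def backward_apply(left, right):
    """Y followed by X\\Y yields X."""
    cat, sem = left
    rcat, rsem = right
    if isinstance(rcat, tuple) and rcat[1] == "\\" and rcat[2] == cat:
        return (rcat[0], rsem(sem))
    return None

lexicon = {
    "John": ("NP", "john"),
    "Mary": ("NP", "mary"),
    "sees": ((("S", "\\", "NP"), "/", "NP"),
             lambda obj: lambda subj: f"sees({subj},{obj})"),
}

vp = forward_apply(lexicon["sees"], lexicon["Mary"])  # sees Mary : S\NP
s = backward_apply(lexicon["John"], vp)               # John sees Mary : S
print(s)  # ('S', 'sees(john,mary)')
```

A grammar learner in this family hypothesises lexical entries like the ones above and scores the derivations they license against the annotated logical forms.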
The Acquisition of a Unification-Based Generalised Categorial Grammar
, 2002
Abstract

Cited by 25 (4 self)
The purpose of this work is to investigate the process of grammatical acquisition from data. In order to do that, a computational learning system is used, composed of a Universal Grammar with associated parameters, and a learning algorithm, following the Principles and Parameters Theory. The Universal Grammar is implemented as a Unification-Based Generalised Categorial Grammar, embedded in a default inheritance network of lexical types. The learning algorithm receives input from a corpus of spontaneous child-directed transcribed speech annotated with logical forms and sets the parameters based on this input. This framework is used as a basis to investigate several aspects of language acquisition. In this thesis I concentrate on the acquisition of subcategorisation frames and word order information from data. The data to which the learner is exposed can be noisy and ambiguous, and I investigate how these factors affect the learning process. The results obtained show a robust learner converging towards the target grammar given the input data available. They also show how the amount of noise present in the input data affects the speed of convergence of the learner towards the target grammar. Future work is suggested for investigating the developmental stages of language acquisition as predicted by the learning model, with a thorough comparison with the developmental stages of a child. This is primarily a cognitive computational model of language learning that can be used to investigate and gain a better understanding of human language acquisition, and can potentially be relevant to the development of more adaptive NLP technology.
Structural Equations in Language Learning
 Proceedings LACL 2001, Springer Lecture Notes in Artificial Intelligence 2099
, 2001
Abstract

Cited by 11 (3 self)
In categorial systems with a fixed structural component, the learning problem comes down to finding the solution for a set of type-assignment equations. A hardwired structural component is problematic if one wants to address issues of structural variation. Our starting point is a type-logical architecture with separate modules for the logical and the structural components of the computational system. The logical component expresses invariants of grammatical composition; the structural component captures variation in the realization of the correspondence between form and meaning. Learning in this setting involves finding the solution to both the type-assignment equations and the structural equations of the language at hand. We develop a view on these two subtasks which pictures learning as a process moving through a two-stage cycle.
Meaning Helps Learning Syntax
 in ICGI '98
, 1998
Abstract

Cited by 10 (2 self)
In this paper, we propose a new framework for the computational learning of formal grammars with positive data. In this model, both syntactic and semantic information are taken into account, which seems cognitively relevant for the modeling of natural language learning. The syntactic formalism used is that of Lambek categorial grammars, and meaning is represented with logical formulas. The principle of compositionality is admitted and defined as an isomorphism applying to trees and allowing sentences to be automatically translated into their semantic representation(s). Simple simulations of a learning algorithm are extensively developed and discussed.
1 Introduction
Natural language learning seems, from a formal point of view, an enigma. As a matter of fact, every human being, given nearly exclusively positive examples ([25]), is able at the age of about five to master his/her mother tongue. Though every natural language has at least the power of context-free grammars ([22]), this cla...
Semantic Bootstrapping of Type-Logical Grammar
 Journal of Logic, Language and Information
, 2002
Abstract

Cited by 10 (1 self)
A procedure is described which induces type-logical grammar lexicons from sentences annotated with skeletal terms of the simply typed lambda calculus. A generalized formulae-as-types correspondence is exploited to obtain all the type-logical proofs of the sample sentences from their lambda terms, and the resulting lexicons are then optimally unified, which effectively unifies the syntactic categories of words which have the same syntactic behavior evident in the induced structures. This effort extends the earlier induction of such lexicons for classical categorial grammar (Buszkowski and Penn, 1990) to at first the non-associative Lambek calculus, and then to a large class of type logics enriched by modal operators and structural rules. The motivation for this approach is linguistic: we have implemented a theoretically operational procedure for semantic bootstrapping of natural language syntax, which is the first one in any setting with sufficient scope to meet the demands of descriptively adequate natural language grammars. One of the main points of the enterprise is that the syntactic and semantic categories operating in the language are learned, in direct opposition to more familiar grammar induction procedures which begin with a fixed set of categories and frequently part-of-speech tagged data as well. This general approach could be used by linguists to learn something about lexical categories and to develop linguistically insightful complete grammars for large fragments.
Learning rigid Lambek grammars and minimalist grammars from structured sentences
 Third workshop on Learning Language in Logic, Strasbourg
, 2001
Abstract

Cited by 10 (1 self)
Abstract. We present an extension of Buszkowski’s learning algorithm for categorial grammars to rigid Lambek grammars and then to minimalist categorial grammars. Kanazawa’s proof of convergence in the Gold sense is simplified and extended to these new algorithms. We thus show that this technique, based on the principal type algorithm and type unification, is quite general and applies to learning issues for different type-logical grammars, which are larger, linguistically more accurate and closer to semantics.
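The type-unification step that such principal-type-style learners rely on can be sketched as follows. The term encoding and variable convention are invented for illustration, and the occurs-check is omitted for brevity; two occurrences of a word get the same category iff their hypothesised types unify:

```python
# Minimal sketch of Robinson-style unification over categorial types.
# Variables are strings starting with "x"; complex categories are tuples
# like ("S", "\\", "NP"). Representation and names are invented.

def walk(t, subst):
    """Follow variable bindings until a non-bound term is reached."""
    while isinstance(t, str) and t in subst:
        t = subst[t]
    return t

def unify(a, b, subst=None):
    """Return a substitution unifying a and b, or None on a clash."""
    if subst is None:
        subst = {}
    a, b = walk(a, subst), walk(b, subst)
    if a == b:
        return subst
    if isinstance(a, str) and a.startswith("x"):
        subst[a] = b
        return subst
    if isinstance(b, str) and b.startswith("x"):
        subst[b] = a
        return subst
    if isinstance(a, tuple) and isinstance(b, tuple) and len(a) == len(b):
        for ai, bi in zip(a, b):
            subst = unify(ai, bi, subst)
            if subst is None:
                return None
        return subst
    return None  # clash: the two categories cannot be identified

# Two hypothesised categories for the same word: (x1 \ NP) vs (S \ x2).
s = unify(("x1", "\\", "NP"), ("S", "\\", "x2"))
print(s)  # {'x1': 'S', 'x2': 'NP'}
```

Collapsing unifiable categories in this way is what shrinks the hypothesised lexicon toward a rigid (one category per word) grammar.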
Towards a Semantic-Based Theory of Language Learning
 In 12th Amsterdam Colloquium
, 1999
Abstract

Cited by 8 (2 self)
The notion of Structural Example has recently emerged in the domain of grammatical inference. It makes it possible to solve the old, difficult problem of learning a grammar from positive examples, but seems to be a very ad hoc structure for this purpose. In this article, we first propose a formal version of the Principle of Compositionality based on Structural Examples. We then give a sufficient condition under which the Structural Examples used in grammatical inference can be inferred from sentences and their semantic representations, which are supposed to be naturally available in the environment of children learning their mother tongue. Structural Examples thus appear as an interesting intermediate representation between syntax and semantics. This leads us to a new formal model of language learning where semantic information plays a crucial role.
1. Introduction
The problem of grammatical inference from positive examples consists in the design of algorithms able to identify a formal grammar fr...
Consistent Identification in the Limit of Rigid Grammars from Strings is NP-hard
 ICGI '02, LNAI 2484
, 2002
Abstract

Cited by 8 (0 self)
In [Bus87] and [BP90] some 'discovery procedures' for classical categorial grammars were defined. These procedures take a set of structures (strings labeled with derivational information) as input and yield a set of hypotheses in the form of grammars.
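The category-assignment phase of such a discovery procedure can be sketched roughly as follows: walk a function-argument structure top-down from the sentence category, give each argument a fresh variable, and give the functor the matching fractional category. This is a minimal sketch, not the papers' implementation; the tree encoding and all names are invented:

```python
# Sketch of Buszkowski/Penn-style category assignment from a structure.
# A structure is a word, or ('>', functor, argument) with the functor on
# the left, or ('<', argument, functor) with the functor on the right.
# Categories are ("result", slash, "argument") triples over variables.

from itertools import count

fresh = (f"x{i}" for i in count(1))  # supply of fresh type variables

def assign(tree, cat):
    """Assign categories to the words of `tree`, which spans category `cat`."""
    if isinstance(tree, str):
        return {tree: cat}
    op, left, right = tree
    var = next(fresh)
    if op == ">":   # left is the functor: left := cat/var, right := var
        out = assign(left, (cat, "/", var))
        out.update(assign(right, var))
    else:           # right is the functor: left := var, right := cat\var
        out = assign(left, var)
        out.update(assign(right, (cat, "\\", var)))
    return out

# "John (sees Mary)": the VP takes Mary to its right and John to its left.
structure = ("<", "John", (">", "sees", "Mary"))
cats = assign(structure, "S")
print(cats)
# {'John': 'x1', 'sees': (('S', '\\', 'x1'), '/', 'x2'), 'Mary': 'x2'}
```

Unifying the variable categories produced across many such structures is then what yields the grammar hypotheses the hardness result quantifies over.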
Resource logics and minimalist grammars
 Proceedings ESSLLI’99 workshop (Special issue Language and Computation
, 2002
Abstract

Cited by 5 (0 self)
This ESSLLI workshop is devoted to connecting the linguistic use of resource logics and categorial grammar to minimalist grammars and related generative grammars. Minimalist grammars are relatively recent, and although they stem from a long tradition of work in transformational grammar, they are largely informal apart from a few research papers. The study of resource logics, on the other hand, is formal and stems naturally from a long logical tradition. So although there appear to be promising connections between these traditions, there is at this point a rather thin intersection between them. The papers in this workshop are consequently rather diverse, some addressing general similarities between the two traditions, and others concentrating on a thorough study of a particular point. Nevertheless they succeed in convincing us of the continuing interest of studying and developing the relationship between the minimalist program and resource logics. This introduction reviews some of the basic issues and prior literature.
1 The interest of a convergence
What would be the interest of a convergence between resource logical investigations of...