Results 1–10 of 24
On the Logic and Learning of Language
, 2002
Abstract

Cited by 9 (3 self)
… algebra 33
3.1.1 Homomorphisms and free generators 34
3.1.2 Quotient algebras 36
3.1.3 Reducts 37
3.2 Algebras of languages 38
3.2.1 The algebra of formulae 38
3.2.2 Substitutions 39
3.2.3 Associated algebras 40
3.2.4 Valuations 41
3.2.5 Lindenbaum-Tarski quotient algebras 42
3.3 Algebras of deductive systems 44
3.3.1 Determining a class of algebras 45
3.3.2 Algebra of a sequent calculus 46
3.3.3 Completeness 48
3.4 Subsuming special cases: an example 49
3.4.1 The sequent system GL 49
3.4.2 The equivalent system t(GL) 51
3.4.3 Algebraic models for GL 52
3.5 Kripke semantics 56
4 Categorial type logics 61
4.1 The typed lambda calculus 62
4.2 Categorial grammar 66
4.3 Forms of Lambek's calculus 69
4.3.1 Classical CG revisited 70
4.3.2 The non-associative product-free system 70
4.3.3 Addin...
Resource logics and minimalist grammars
 Proceedings of the ESSLLI’99 workshop (special issue of Language and Computation)
, 2002
Abstract

Cited by 5 (0 self)
This ESSLLI workshop is devoted to connecting the linguistic use of resource logics and categorial grammar to minimalist grammars and related generative grammars. Minimalist grammars are relatively recent, and although they stem from a long tradition of work in transformational grammar, they are largely informal apart from a few research papers. The study of resource logics, on the other hand, is formal and stems naturally from a long logical tradition. So although there appear to be promising connections between these traditions, there is at this point a rather thin intersection between them. The papers in this workshop are consequently rather diverse, some addressing general similarities between the two traditions, and others concentrating on a thorough study of a particular point. Nevertheless they succeed in convincing us of the continuing interest of studying and developing the relationship between the minimalist program and resource logics. This introduction reviews some of the basic issues and prior literature.

1 The interest of a convergence

What would be the interest of a convergence between resource-logical investigations of
Formal & computational aspects of dependency grammar: Heads, dependents, and dependency structures. http://www.coli.uni-sb.de/~gj/Lectures/DG.ESSLLI/index.phtml
, 2002
Computing Interpolants in Implicational Logics
Abstract

Cited by 4 (2 self)
I present a new syntactical method for proving the Interpolation Theorem for the implicational fragment of intuitionistic logic and its substructural subsystems. This method, like Prawitz’s, works on natural deductions rather than sequent derivations, and, unlike existing methods, always finds a ‘strongest’ interpolant under a certain restricted but reasonable notion of what counts as an ‘interpolant’.
Proof theory and formal grammars: applications of normalization
 In Benedikt Löwe, Wolfgang Malzkorn, and Thoralf Räsch, editors, Foundations of the Formal Sciences II
, 2003
Abstract

Cited by 3 (2 self)
One of the main areas of interaction between logic and linguistics in the last 20 years has been the proof theoretical approach to formal grammars. This approach dates back to Lambek’s work in the 1950s. Lambek proposed to
Computing word meanings by interpolation
 ILLC/Department of Philosophy, University of Amsterdam
, 2003
Abstract

Cited by 3 (1 self)
I outline a natural algorithm for solving a central problem in the task of learning word-to-meaning mappings, as formulated by Siskind (1996, 2000) and extended to the typed lambda calculus setting by Kanazawa (2001). The algorithm is based on a new syntactical method for proving the Interpolation Theorem for the implicational fragment of intuitionistic propositional logic. A central problem in the task of learning word-to-meaning mappings, as formulated by Siskind (1996, 2000), can be illustrated by the following example. The learner knows from the outset that the meaning of each word is represented by some first-order term with zero or more free variables, and the sentence meaning is composed from the meanings of the component words by performing a sequence of substitutions in a suitable order. Suppose that the learner has already inferred from evidence presented so far that the meaning of lifted, whatever it is, is built up from three symbols CAUSE, GO, and UP, together with some number of variables. Suppose that the learner is now given the information that the meaning of John lifted Mary is represented by the first-order term CAUSE(John, GO(Mary, UP)). The available evidence suffices to uniquely pin down the meaning of lifted to CAUSE(x, GO(y, UP)), which can be computed by a simple algorithm, even if the meanings of John and Mary may still be indeterminate. Following Kanazawa (2001), I generalize this problem to the typed lambda calculus setting as follows. The meaning of each word is represented by a closed λI-term (with one or more constant symbols). The meaning of a sentence is obtained by plugging the meanings of the words into a suitable meaning recipe, represented by a linear (or BCI) λ-term (containing one free variable for each of the words) of type t, and then computing the β-normal form of the resulting λ-term. The central problem now becomes: Mapping Problem. Given a closed λI-term N of type t in β-normal form containing constant symbols cA11,..., c
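The first-order special case in the abstract's example (recovering CAUSE(x, GO(y, UP)) from CAUSE(John, GO(Mary, UP))) can be sketched with a few lines of code. This is a minimal illustration, not Siskind's or Kanazawa's actual algorithm; the term encoding (atoms as strings, applications as tuples) and the function name `abstract` are my own.

```python
def abstract(term, fillers, subst):
    """Replace each subterm listed in `fillers` (the pieces contributed
    by the other words) with a fresh variable, recording the substitution.
    Terms are atoms (str) or tuples (functor, arg1, ..., argN)."""
    if term in fillers:
        return subst.setdefault(term, f"x{len(subst)}")
    if isinstance(term, tuple):
        # Keep the functor, abstract recursively over the arguments.
        return (term[0],) + tuple(abstract(a, fillers, subst) for a in term[1:])
    return term

# Sentence meaning of "John lifted Mary": CAUSE(John, GO(Mary, UP))
sentence = ("CAUSE", "John", ("GO", "Mary", "UP"))

# The learner knows John and Mary each contribute one atomic subterm.
subst = {}
lifted = abstract(sentence, {"John", "Mary"}, subst)

print(lifted)  # ('CAUSE', 'x0', ('GO', 'x1', 'UP'))
print(subst)   # {'John': 'x0', 'Mary': 'x1'}
```

The result corresponds to CAUSE(x, GO(y, UP)): the meaning of lifted is pinned down even though the meanings bound to John and Mary remain, as the abstract notes, indeterminate.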
Computational Complexity in Natural Language
"... We have become so used to viewing natural language in computational terms that we need occasionally to remind ourselves of the methodological commitment this view entails. That commitment is this: we assume that to understand linguistic tasks—tasks such as recognizing sentences, determining their st ..."
Abstract

Cited by 2 (0 self)
We have become so used to viewing natural language in computational terms that we need occasionally to remind ourselves of the methodological commitment this view entails. That commitment is this: we assume that to understand linguistic tasks—tasks such as recognizing sentences, determining their structure, extracting their meaning, and manipulating the information they contain—is to discover the algorithms required to perform those tasks, and to investigate their computational properties. To be sure, the physical realization of the corresponding processes in humans is a legitimate study too, but one from which the computational investigation of language may be pursued in Splendid Isolation. Complexity Theory is the mathematical study of the resources—both in time and space—required to perform computational tasks. What bounds can we place—from above or below—on the number of steps taken to compute such-and-such a function, or a function belonging to such-and-such a class? What bounds can we place on the amount of memory required? It is not surprising, therefore, that in the study of natural language, complexity-theoretic issues abound. Since any computational task can be the object of complexity-theoretic investigation, it would be hopeless even to attempt a complete survey of Complexity Theory in the study of natural language. We focus therefore on a selection of topics in natural language where there has been a particular accumulation of complexity-theoretic results. Section 2 discusses parsing and recognition; Section 3 discusses the computation of logical form; and Section 4 discusses the problem of determining logical relationships between sentences in natural language. But we begin with a brief review of Complexity Theory itself. A draft chapter for the Blackwell Computational Linguistics and Natural Language Processing Handbook, edited by Alex Clark, Chris Fox and Shalom Lappin.
Bounded and Ordered Satisfiability: Connecting Recognition with Lambek-style Calculi to Classical Satisfiability Testing
Abstract

Cited by 2 (1 self)
In this paper, when we mention the Lambek Calculus (LC) or Lambek Grammars (LG), we are referring to the product-free fragment
Efficient Parsing with the Product-Free Lambek Calculus
Abstract

Cited by 1 (1 self)
This paper provides a parsing algorithm for the Lambek calculus which is polynomial-time for a more general fragment of the Lambek calculus than any previously known algorithm. The algorithm runs in worst-case time O(n^5) when restricted to a certain fragment of the Lambek calculus which is motivated by empirical analysis. In addition, a set of parameterized inputs is given, showing why the algorithm has exponential worst-case running time for the Lambek calculus in general.
Relating Categorial Type Logics and CCG through simulation
, 2000
Abstract

Cited by 1 (0 self)
The primary focus of this paper is to establish an abstract fragment in "multimodal logical grammar" or Categorial Type Logics (CTL) [25] that simulates Combinatory Categorial Grammar (CCG) [38]. By showing that the simulation is weakly equivalent to the generative power of CCG, we can prove that the abstract fragment is mildly context-sensitive and can be parsed in polynomial time. This is the first time that such an observation has been established for a nontrivial fragment with generative power significantly greater than that of the Lambek calculus L. Other potentially interesting observations are that we can, if so desired, give a logical interpretation for CCG (thus countering criticism in e.g. [26]), and that we can establish a more fine-grained cross-linguistic characterization of CCG's combinatory rules based on their formulation in the simulation.

Keywords: Categorial Grammar, CCG, CTL, MMLG, mild context-sensitivity

1 Introduction

The perceived position of natural language o...