Results 1–10 of 17
Logic Programming in a Fragment of Intuitionistic Linear Logic
Abstract

Cited by 320 (45 self)
When logic programming is based on the proof theory of intuitionistic logic, it is natural to allow implications in goals and in the bodies of clauses. Attempting to prove a goal of the form D ⊃ G from the context (set of formulas) Γ leads to an attempt to prove the goal G in the extended context Γ ∪ {D}. Thus during the bottom-up search for a cut-free proof, contexts, represented as the left-hand side of intuitionistic sequents, grow as stacks. While such an intuitionistic notion of context provides for elegant specifications of many computations, contexts can be made more expressive and flexible if they are based on linear logic. After presenting two equivalent formulations of a fragment of linear logic, we show that the fragment has a goal-directed interpretation, thereby partially justifying calling it a logic programming language. Logic programs based on the intuitionistic theory of hereditary Harrop formulas can be modularly embedded into this linear logic setting. Programming examples taken from theorem proving, natural language parsing, and database programming are presented: each example requires a linear, rather than intuitionistic, notion of context to be modeled adequately. An interpreter for this logic programming language must address the problem of splitting contexts; that is, when attempting to prove a multiplicative conjunction (tensor), say G1 ⊗ G2, from the context ∆, the latter must be split into disjoint contexts ∆1 and ∆2 for which G1 follows from ∆1 and G2 follows from ∆2. Since there is an exponential number of such splits, it is important to delay the choice of a split as much as possible. A mechanism for the lazy splitting of contexts is presented based on viewing proof search as a process that takes a context, consumes part of it, and returns the rest (to be consumed elsewhere).
In addition, we use collections of Kripke interpretations indexed by a commutative monoid to provide models for this logic programming language and show that logic programs admit a canonical model.
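The lazy-splitting idea in the abstract above can be modeled concretely: instead of enumerating all splits of ∆ for a tensor goal G1 ⊗ G2, prove G1 from the whole context and hand whatever it did not consume to G2. The following is an illustrative Python sketch under simplifying assumptions (atomic resources only, and no backtracking over which occurrence is consumed); it is not the paper's actual interpreter.

```python
from collections import Counter

def prove(goal, delta):
    """Attempt to prove `goal` from the multiset context `delta`.
    Returns the leftover context (resources not consumed), or None on failure.
    Atoms are strings; a tensor goal is written ('tensor', g1, g2)."""
    if isinstance(goal, str):                 # atomic goal: consume one copy
        if delta[goal] > 0:
            rest = delta.copy()
            rest[goal] -= 1
            return rest
        return None
    op, g1, g2 = goal
    if op == 'tensor':                        # lazy split: G1 consumes what it
        rest = prove(g1, delta)               # needs, G2 gets the leftovers
        if rest is None:
            return None
        return prove(g2, rest)
    return None

# the context {a, a, b} proves a ⊗ b, leaving one copy of a unconsumed
left = prove(('tensor', 'a', 'b'), Counter({'a': 2, 'b': 1}))
```

Here `prove` plays the role of the input–output context transformer described in the abstract: it receives a context and returns the unconsumed remainder, to be consumed elsewhere.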
A Semantic-Head-Driven Generation Algorithm for Unification-Based Formalisms
 In 27th Annual Meeting of the Association for Computational Linguistics
, 1989
Abstract

Cited by 42 (8 self)
We present an algorithm for generating strings from logical form encodings that improves upon previous algorithms in that it places fewer restrictions on the class of grammars to which it is applicable. In particular, unlike an Earley deduction generator (Shieber, 1988), it allows use of semantically non-monotonic grammars, yet unlike top-down methods, it also permits left recursion. The enabling design feature of the algorithm is its implicit traversal of the analysis tree for the string being generated in a semantic-head-driven fashion.
Specifying Filler-Gap Dependency Parsers in a Linear-Logic Programming Language
 Proceedings of the Joint International Conference and Symposium on Logic Programming
, 1992
Abstract

Cited by 27 (4 self)
An aspect of the Generalized Phrase Structure Grammar formalism proposed by Gazdar et al. is the introduction of the notion of "slashed categories" to handle the parsing of structures, such as relative clauses, which involve unbounded dependencies. This has been implemented in Definite Clause Grammars through the technique of gap threading, in which a difference list of extracted noun phrases (gaps) is maintained. However, this technique is cumbersome, and can result in subtle soundness problems in the implemented grammars. Miller and Pareschi have proposed a method of implementing gap threading at the logical level in intuitionistic logic. Unfortunately that implementation itself suffered from serious problems, which the authors recognized. This paper builds on work first presented with Miller in which we developed a filler-gap dependency parser in Girard's linear logic. This implementation suffers from none of the pitfalls of either the traditional implementation, or the intuitioni...
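The gap-threading technique discussed above can be illustrated with a toy recognizer. The sketch below is a hypothetical Python model in which the difference list of extracted noun phrases is simplified to a count of pending NP gaps threaded through the parse; it is illustrative only, not the parser developed in the paper.

```python
NOUNS = {'dogs', 'cats'}

def np(words, gaps):
    """NP: a relative-clause NP, a bare noun, or a gap discharged from the thread."""
    if len(words) >= 2 and words[0] in NOUNS and words[1] == 'that':
        r = sentence(words[2:], gaps + 1)      # thread one extra NP gap in
        if r is not None and r[1] == gaps:     # the new gap must be consumed inside
            return r
        return None
    if words and words[0] in NOUNS:
        return words[1:], gaps
    if gaps > 0:                               # fill this NP position from a gap
        return words, gaps - 1
    return None

def vp(words, gaps):
    if words and words[0] == 'sleep':          # intransitive verb
        return words[1:], gaps
    if words and words[0] == 'chase':          # transitive: the object may be a gap
        return np(words[1:], gaps)
    return None

def sentence(words, gaps):
    r = np(words, gaps)
    return vp(*r) if r is not None else None

def recognize(text):
    r = sentence(text.split(), 0)
    return r is not None and r[0] == [] and r[1] == 0
```

In the relative clause of "dogs that cats chase sleep", the object of "chase" is recognized by discharging the threaded gap rather than by consuming a word, which is exactly the bookkeeping that difference-list gap threading performs in a DCG.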
An O(n³) Agenda-Based Chart Parser for Arbitrary Probabilistic Context-Free Grammars
, 2001
Abstract

Cited by 8 (1 self)
While O(n³) methods for parsing probabilistic context-free grammars (PCFGs) are well known, a tabular parsing framework for arbitrary PCFGs, which allows for bottom-up, top-down, and other parsing strategies, has not yet been provided. This paper presents such an algorithm, and shows its correctness and advantages over prior work. The paper finishes by bringing out the connections between the algorithm and work on hypergraphs, which permits us to extend the presented Viterbi (best parse) algorithm to an inside (total probability) algorithm.
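For grammars already in Chomsky normal form, the inside (total probability) computation mentioned above can be sketched with a plain O(n³) CKY-style chart. The toy grammar and code below are illustrative assumptions, not the paper's agenda-based algorithm, which handles arbitrary PCFGs and multiple parsing strategies.

```python
from collections import defaultdict

# toy PCFG in Chomsky normal form: binary rules (A -> B C) and lexical rules (A -> w)
binary = {('S', ('NP', 'VP')): 1.0,
          ('VP', ('V', 'NP')): 1.0}
lexical = {('NP', 'cats'): 0.5, ('NP', 'dogs'): 0.5,
           ('V', 'chase'): 1.0}

def inside(words):
    """CKY-style inside algorithm: O(n^3) in sentence length for a fixed grammar."""
    n = len(words)
    chart = defaultdict(float)            # chart[i, j, A] = P(A derives words[i:j])
    for i, w in enumerate(words):         # seed the chart with lexical rules
        for (A, word), p in lexical.items():
            if word == w:
                chart[i, i + 1, A] += p
    for span in range(2, n + 1):          # combine adjacent spans bottom-up
        for i in range(n - span + 1):
            j = i + span
            for k in range(i + 1, j):
                for (A, (B, C)), p in binary.items():
                    chart[i, j, A] += p * chart[i, k, B] * chart[k, j, C]
    return chart[0, n, 'S']

# P('cats chase dogs') = 0.5 * 1.0 * 1.0 * 0.5 = 0.25 under this toy grammar
total = inside(['cats', 'chase', 'dogs'])
```

Replacing the sum over split points with a max (and recording backpointers) turns this inside computation into the Viterbi best-parse variant, which is the direction of the extension described in the abstract, run in reverse.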
GTU: A workbench for the development of natural language grammars
, 1995
Abstract

Cited by 5 (4 self)
In this report we present a Prolog tool for the development and testing of natural language grammars called GTU (German: Grammatik-Testumgebung; grammar test environment). GTU offers a window-oriented user interface that allows the development and testing of natural language grammars under three formalisms. In particular it contains a collection of German test sentences and two types of German lexicons. Both of the lexicons can be adapted to a given grammar via an integrated lexicon interface. GTU has been implemented in Prolog both under DOS and UNIX. It was originally developed as a tutoring tool to support university courses on syntax analysis, but in its UNIX version it allows for the development of large grammars.

1 Introduction

Any computer system that analyses natural language input (be it a grammar checker or a tool for machine aided translation or the like) needs a formal grammar in order to map the input to a structure that groups it in some meaningful way. However, it is by ...
Compiling Feature Structures into Terms: an Empirical Study in Prolog
, 1993
Abstract

Cited by 3 (0 self)
This paper explores the issue of feature structure representation and its effect on the efficiency of parsing within the context of Prolog: it compares four different representation systems whilst holding the parsing strategy fixed. The representations are: i) interpreted PATR equations; ii) partially evaluated PATR feature structures; iii) feature structures compiled to lists; and iv) feature structures compiled to flat terms. The parser uses a simple shift-reduce strategy with a grammar based loosely on Generalized Phrase Structure Grammar (GPSG). I begin by considering why partial information is important in computational linguistics and compare directed acyclic graphs (DAGs) and fixed arity terms as representations for partial information. I present details of the basic compiler and then go on to consider the four representations in detail. I assume a familiarity with Prolog notation and with the basic notions of feature structure representation. An emergent property of the compila...
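The contrast between DAG-based feature structures and term representations turns on unification of partial information. A minimal sketch in Python, assuming feature structures are nested dictionaries with atomic string values and ignoring reentrancy (shared substructure), which full DAG unification must handle:

```python
def unify(f, g):
    """Unify two feature structures (nested dicts; atoms are strings).
    Returns the merged structure, or None on a value clash.
    No reentrancy: a simplification relative to true DAG unification."""
    if isinstance(f, dict) and isinstance(g, dict):
        out = dict(f)
        for key, val in g.items():
            if key in out:
                sub = unify(out[key], val)   # shared feature: recurse
                if sub is None:
                    return None
                out[key] = sub
            else:                            # feature only in g: copy it over
                out[key] = val
        return out
    return f if f == g else None             # atoms unify only if equal

# compatible agreement features merge; a number clash would return None
merged = unify({'cat': 'np', 'agr': {'num': 'sg'}},
               {'agr': {'num': 'sg', 'per': '3'}})
```

The efficiency question the paper studies is, in effect, how to compile away this interpreted merging: fixed-arity Prolog terms let ordinary first-order unification do the same work in the WAM rather than in an interpreter loop.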
A Cognitive Model of Sentence Interpretation: the Construction Grammar approach
, 1993
Abstract

Cited by 2 (0 self)
This paper describes a new, psychologically plausible model of human sentence interpretation, based on a new model of linguistic structure, Construction Grammar. This on-line, parallel, probabilistic interpreter accounts for a wide variety of psycholinguistic results on lexical access, idiom processing, parsing preferences, and studies of gap-filling and other valence ambiguities, including various frequency effects. We show that many of these results derive from the fundamental assumption of Construction Grammar that lexical items, idioms, and syntactic structures are uniformly represented as grammatical constructions, and argue for the use of probabilistically enriched grammars and interpreters as models of human knowledge of and processing of language. Submitted to Cognitive Science.

1 Introduction

In the last twenty years, the field of cognitive science has seen an explosion in the number of computational models of cognitive processing. This is particularly true in the mode...
ALE: The Attribute Logic Engine User's Guide
, 1994