Results 1–10 of 11
Logic Programming in a Fragment of Intuitionistic Linear Logic
Abstract

Cited by 303 (40 self)
When logic programming is based on the proof theory of intuitionistic logic, it is natural to allow implications in goals and in the bodies of clauses. Attempting to prove a goal of the form D ⊃ G from the context (set of formulas) Γ leads to an attempt to prove the goal G in the extended context Γ ∪ {D}. Thus, during the bottom-up search for a cut-free proof, contexts, represented as the left-hand side of intuitionistic sequents, grow as stacks. While such an intuitionistic notion of context provides for elegant specifications of many computations, contexts can be made more expressive and flexible if they are based on linear logic. After presenting two equivalent formulations of a fragment of linear logic, we show that the fragment has a goal-directed interpretation, thereby partially justifying calling it a logic programming language. Logic programs based on the intuitionistic theory of hereditary Harrop formulas can be modularly embedded into this linear-logic setting. Programming examples taken from theorem proving, natural language parsing, and database programming are presented; each example requires a linear, rather than intuitionistic, notion of context to be modeled adequately. An interpreter for this logic programming language must address the problem of splitting contexts: that is, when attempting to prove a multiplicative conjunction (tensor), say G1 ⊗ G2, from the context ∆, the latter must be split into disjoint contexts ∆1 and ∆2 such that G1 follows from ∆1 and G2 follows from ∆2. Since there is an exponential number of such splits, it is important to delay the choice of a split as much as possible. A mechanism for the lazy splitting of contexts is presented, based on viewing proof search as a process that takes a context, consumes part of it, and returns the rest (to be consumed elsewhere).
In addition, we use collections of Kripke interpretations indexed by a commutative monoid to provide models for this logic programming language and show that logic programs admit a canonical model.
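The input/output model of lazy splitting described in this abstract can be sketched in Python (hypothetical names; atomic goals and a deterministic search only, whereas a real interpreter would backtrack over resource choices):

```python
# Sketch of lazy context splitting for a tensor goal G1 (x) G2.
# Rather than enumerating all 2^n splits of the context up front,
# the prover for G1 receives the whole context and returns whatever
# it did not consume; G2 is then proved from the leftovers.

def prove_atom(goal, ctx):
    """Consume one matching resource; return the leftover context, or None on failure."""
    if goal in ctx:
        rest = list(ctx)
        rest.remove(goal)   # the resource is consumed exactly once
        return rest
    return None

def prove_tensor(g1, g2, ctx):
    """Prove g1 (x) g2 by threading the context: g2 gets g1's leftovers."""
    rest = prove_atom(g1, ctx)
    if rest is None:
        return None
    return prove_atom(g2, rest)

# The context ['a', 'b'] proves a (x) b with nothing left over:
print(prove_tensor('a', 'b', ['a', 'b']))   # []
```

The returned leftover list plays the role of the "rest of the context to be consumed elsewhere" in the abstract's description.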
Extending definite clause grammars with scoping constructs
 7th Int. Conf. Logic Programming
, 1990
Abstract

Cited by 25 (4 self)
Definite Clause Grammars (DCGs) have proved valuable to computational linguists since they can be used to specify phrase-structure grammars. It is well known how to encode DCGs in Horn clauses. Some linguistic phenomena, such as filler-gap dependencies, are difficult to account for in a completely satisfactory way using simple phrase-structure grammars. In the literature on logic grammars there have been several attempts to tackle this problem by making use of special arguments added to the DCG predicates corresponding to the grammatical symbols. In this paper we take a different line, in that we account for filler-gap dependencies by encoding DCGs within hereditary Harrop formulas, an extension of Horn clauses (proposed elsewhere as a foundation for logic programming) in which implicational goals and universally quantified goals are permitted. Under this approach, filler-gap dependencies can be accounted for in terms of the operational semantics underlying hereditary Harrop formulas, in a way reminiscent of the treatment of such phenomena in Generalized Phrase Structure Grammar (GPSG). The main features involved in this new formulation of DCGs are mechanisms for providing scope to constants and program clauses, along with a mild use of λ-terms and λ-conversion.
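The scoping mechanism this entry relies on — an implicational goal D ⊃ G extending the program with D only for the subproof of G — can be sketched in miniature (hypothetical encoding; atomic facts only, no unification):

```python
# Minimal sketch of goal-directed proof search where implications in
# goals extend the program, as in hereditary Harrop formulas.
# A goal is either an atom (a string) or ('implies', D, G).

def prove(goal, program):
    """Prove `goal` from the set `program` (the context Γ)."""
    if isinstance(goal, tuple) and goal[0] == 'implies':
        _, d, g = goal
        # Prove G from Γ ∪ {D}; the assumption D is scoped to this subproof.
        return prove(g, program | {d})
    return goal in program   # atomic goal: look it up in the program

# A gap-filling assumption is visible only inside the implication's scope:
print(prove(('implies', 'np_gap', 'np_gap'), set()))   # True
print(prove('np_gap', set()))                          # False
```

This scoped extension is what lets a filler license a gap inside a relative clause without the assumption leaking into the rest of the parse.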
Using Logic Programming Languages For Optical Music Recognition
 In Proceedings of the Third International Conference on The Practical Application of Prolog
, 1995
Abstract

Cited by 10 (2 self)
Optical Music Recognition is a particular form of document analysis in which there is much knowledge about document structure. Indeed, there exists an important set of rules for musical notation, but current systems do not fully use them. We propose a new solution using a grammar to guide the segmentation of the graphical objects and their recognition. The grammar is essentially a description of the relations (relative position and size, adjacency, etc.) between the graphical objects. Inspired by Definite Clause Grammar techniques, the grammar can be directly implemented in λProlog, a higher-order dialect of Prolog. Moreover, the translation from the grammar into λProlog code can be done automatically. Our approach is justified by the first encouraging results obtained with a prototype for music score recognition.
Keywords: Document analysis, Optical Music Recognition, DCG, Grammar Translation
1 Introduction
In structured document analysis, one open problem is to separate knowledge from...
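A grammar rule stated as relations between graphical objects, as this abstract describes, might look as follows (a hypothetical rule with invented predicates and thresholds, not the paper's actual grammar):

```python
# Hypothetical sketch: one grammar rule for music recognition expressed
# as spatial constraints between already-segmented graphical objects.
# Each object is a bounding box {'x1','x2','y1','y2'}.

def adjacent(a, b, tol=2):
    """b starts where a ends, within a tolerance (assumed relation)."""
    return abs(a['x2'] - b['x1']) <= tol

def taller(a, b):
    """a's bounding box is taller than b's."""
    return (a['y2'] - a['y1']) > (b['y2'] - b['y1'])

def quarter_note(head, stem):
    """Hypothetical rule: a quarter note is a head with an adjacent, taller stem."""
    return adjacent(head, stem) and taller(stem, head)

head = {'x1': 0, 'x2': 10, 'y1': 0, 'y2': 8}
stem = {'x1': 11, 'x2': 12, 'y1': 0, 'y2': 30}
print(quarter_note(head, stem))   # True
```

In the paper's setting such rules are logic-program clauses, so the same relational description both guides segmentation and performs recognition by proof search.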
Multilanguage Machine Translation through Document Normalization
 the proceedings of the EACL’03 EAMT workshop
, 2003
Abstract

Cited by 1 (1 self)
Document normalization is an interactive process that transforms raw legacy documents into semantically well-formed and linguistically controlled documents with the same communicative content. A paradigm for content analysis has been implemented to select candidate semantic representations of the communicative content of an input document. This implementation reuses the formal content specification of a multilingual controlled authoring system. As a consequence, a candidate semantic representation can be associated not only with a text in the language of the input document, but also with texts in all the languages supported by the system. This paper presents how multilingual versions of an input legacy document can be obtained interactively with the proposed implementation, and discusses the advantages and limitations of this kind of normalizing translation.
Engineering Transformations of Attributed Grammars in λProlog
 Joint Int. Conf. and Symp. Logic Programming
, 1996
Abstract
An abstract representation for grammar rules that permits an easy implementation of several attributed-grammar transformations is presented. It clearly separates the actions that contribute to evaluating attribute values from the circulation of these values, and it makes it easy to combine the representations of several rules in order to build the representation of new rules. This abstract form applies well to such transforms as elimination of left recursion, elimination of empty derivations, unfolding, and factorization. Finally, the technique is applied to DCGs, and a Prolog implementation of the abstract form and of the transforms is described.
1 Introduction
Textbooks (e.g., the Dragon book [1], and others [2, 3]) are rich and precise when dealing with the manipulations (analysis, transformation) of pure Context-Free Grammars (CFGs). They also present parsing techniques for Attributed CFGs (ACFGs), and specialized analyses for ACFGs like the detection of circular dependencies. How...
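The left-recursion elimination the abstract lists is the standard textbook transform; a sketch on the plain (unattributed) context-free case, with assumed data representation (a production is a list of symbols):

```python
# Standard immediate left-recursion removal:
#   A -> A a | b   becomes   A -> b A'   and   A' -> a A' | epsilon
# Productions are lists of symbols; epsilon is the empty list.

def eliminate_left_recursion(nt, productions):
    rec  = [p[1:] for p in productions if p and p[0] == nt]   # the "a" parts
    base = [p for p in productions if not p or p[0] != nt]    # the "b" parts
    if not rec:
        return {nt: productions}   # nothing to do
    nt2 = nt + "'"                 # fresh auxiliary nonterminal
    return {
        nt:  [b + [nt2] for b in base],          # A  -> b A'
        nt2: [a + [nt2] for a in rec] + [[]],    # A' -> a A' | epsilon
    }

# E -> E + T | T   becomes   E -> T E',  E' -> + T E' | epsilon
print(eliminate_left_recursion('E', [['E', '+', 'T'], ['T']]))
```

The paper's contribution is doing this kind of transform on *attributed* rules, where the attribute-evaluation actions must be rethreaded through the rewritten productions; the sketch above handles only the symbol structure.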
Prolog/Mali Reference Manual
Abstract
Abstraction is a kind of quantification: the λ-quantification. It has strong connections with universal quantification, which are developed in the sequel. As a quantification, abstraction gives rise to the usual notions of free and bound variables.
• If E and F are in L, then (E F) is an application in L. Application is supposed to associate to the left, so that nested applications (… ((a₁ a₂) a₃) … aₙ) are written (a₁ a₂ a₃ … aₙ), or (a) if the number of individuals does not matter.
• There is a typing function τ from L to T that satisfies the rules τ(λx.E) = τ(x) → τ(E) and ∃α. (τ(E) = α → β) ∧ (τ(F) = α) ⇒ τ((E F)) = β. A constant that is given a predicate type is called a predicate constant.
Example 1.2.2. A term λx.E with type α → β can be interpreted as a function with parameter x of type α and result E of type β. For instance, λx.x with type α → α is the identity function for terms having type α. It is noted id_α. Concrete syntax for λx.x ...
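The typing function τ from this excerpt can be sketched directly (hypothetical term encoding; abstractions carry their parameter type, and types are either atoms or ('->', dom, cod)):

```python
# Sketch of the typing function from the excerpt:
#   τ(λx.E)  = τ(x) → τ(E)
#   τ((E F)) = β   when τ(E) = α → β and τ(F) = α

def type_of(term, env):
    kind = term[0]
    if kind == 'var':                        # ('var', name)
        return env[term[1]]
    if kind == 'lam':                        # ('lam', x, xtype, body)
        _, x, xtype, body = term
        return ('->', xtype, type_of(body, {**env, x: xtype}))
    if kind == 'app':                        # ('app', e, f)
        _, e, f = term
        te, tf = type_of(e, env), type_of(f, env)
        assert te[0] == '->' and te[1] == tf, "ill-typed application"
        return te[2]
    raise ValueError(kind)

# λx.x at type α is the identity id_α, of type α → α:
identity = ('lam', 'x', 'α', ('var', 'x'))
print(type_of(identity, {}))                        # ('->', 'α', 'α')
print(type_of(('app', identity, ('var', 'y')), {'y': 'α'}))   # 'α'
```

Each branch is a transcription of one typing rule; the assertion in the application case enforces the side condition τ(F) = α.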
Using Logic Programming Languages For Structured Document Analysis: Application To Music Scores
Abstract
We describe in this paper an original solution for structured document analysis. The idea is to use a grammar to guide the segmentation of the graphical objects and their recognition. The grammar is essentially a description of the relations (relative position and size, adjacency, etc.) between the graphical objects.
A Definite Clause Version of Categorial Grammar
, 1988
Abstract
We introduce a first-order version of Categorial Grammar, based on the idea of encoding syntactic types as definite clauses. Thus, we drop all explicit requirements of adjacency between combinable constituents, and we capture word-order constraints simply by allowing subformulae of complex types to share variables ranging over string positions. We are in this way able to account for constructions involving discontinuous constituents. Such constructions are difficult to handle in the more traditional version of Categorial Grammar, which is based on propositional types and on the requirement of strict string adjacency between combinable constituents.
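The position-variable encoding this abstract describes — definite clauses over string positions, with shared variables rather than literal adjacency — can be sketched as follows (hypothetical toy grammar; predicate names invented):

```python
# Sketch: syntactic categories as predicates over string positions.
# np(I, J) holds when positions I..J of the input span an NP; a clause
# like  s(I, K) :- np(I, J), vp(J, K).  enforces word order through the
# shared variable J instead of a built-in adjacency requirement.

words = ['john', 'walks']

def np(i, j):
    return j == i + 1 and words[i] == 'john'

def vp(i, j):
    return j == i + 1 and words[i] == 'walks'

def s(i, k):
    # Search over the intermediate position J, as Prolog would by unification.
    return any(np(i, j) and vp(j, k) for j in range(i, k + 1))

print(s(0, 2))   # True: "john walks" spans positions 0..2 as a sentence
```

Because positions are ordinary clause arguments, a category for a discontinuous constituent can simply mention two non-contiguous position pairs, which is the flexibility the abstract points to.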
Reversing Controlled Document Authoring to Normalize Documents
, 2003
Abstract
This paper introduces document normalization, and addresses the issue of whether controlled document authoring systems can be used in a reverse mode to normalize legacy documents. A paradigm for deep content analysis using such a system is proposed, and an architecture for a document normalization system is described.
A LogicBased Integration of Query Processing and Knowledge Discovery
, 2004
Abstract
Databases have withstood the test of time and are now indispensable components of the majority of applications. One of the challenges that technological developments throughout the last decade have posed to database research is that of eliciting novel and potentially useful information from very many, very large databases, i.e., knowledge that lies in the data but is, in general, irretrievable via classical query languages. At present, this challenge...