Results 1–10 of 21
Logic Programming in a Fragment of Intuitionistic Linear Logic
"... When logic programming is based on the proof theory of intuitionistic logic, it is natural to allow implications in goals and in the bodies of clauses. Attempting to prove a goal of the form D ⊃ G from the context (set of formulas) Γ leads to an attempt to prove the goal G in the extended context Γ ..."
Abstract

Cited by 303 (40 self)
When logic programming is based on the proof theory of intuitionistic logic, it is natural to allow implications in goals and in the bodies of clauses. Attempting to prove a goal of the form D ⊃ G from the context (set of formulas) Γ leads to an attempt to prove the goal G in the extended context Γ ∪ {D}. Thus, during the bottom-up search for a cut-free proof, contexts, represented as the left-hand side of intuitionistic sequents, grow as stacks. While such an intuitionistic notion of context provides for elegant specifications of many computations, contexts can be made more expressive and flexible if they are based on linear logic. After presenting two equivalent formulations of a fragment of linear logic, we show that the fragment has a goal-directed interpretation, thereby partially justifying calling it a logic programming language. Logic programs based on the intuitionistic theory of hereditary Harrop formulas can be modularly embedded into this linear logic setting. Programming examples taken from theorem proving, natural language parsing, and database programming are presented: each example requires a linear, rather than intuitionistic, notion of context to be modeled adequately. An interpreter for this logic programming language must address the problem of splitting contexts; that is, when attempting to prove a multiplicative conjunction (tensor), say G1 ⊗ G2, from the context ∆, the latter must be split into disjoint contexts ∆1 and ∆2 for which G1 follows from ∆1 and G2 follows from ∆2. Since there is an exponential number of such splits, it is important to delay the choice of a split as much as possible. A mechanism for the lazy splitting of contexts is presented, based on viewing proof search as a process that takes a context, consumes part of it, and returns the rest (to be consumed elsewhere).
In addition, we use collections of Kripke interpretations indexed by a commutative monoid to provide models for this logic programming language and show that logic programs admit a canonical model.
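The lazy splitting mechanism described above can be simulated in a few lines. The sketch below is a hypothetical Python model with atomic resources only (none of this is the paper's actual language, Lolli); it contrasts the exponential enumeration of splits with proof search that consumes part of the context and returns the rest:

```python
from itertools import combinations

# Goals are atoms (strings) or ("tensor", G1, G2); contexts are lists
# of atomic resources. This is a toy model, not the paper's language.

def naive_splits(delta):
    """Enumerate all 2^n ways to split a context of distinct atoms."""
    items = list(delta)
    for r in range(len(items) + 1):
        for left in combinations(items, r):
            right = [x for x in items if x not in left]
            yield list(left), right

def prove_lazy(goal, delta):
    """Prove `goal` from `delta`, returning the leftover context
    (to be consumed elsewhere), or None on failure."""
    if isinstance(goal, str):          # atomic goal: consume one resource
        if goal in delta:
            rest = list(delta)
            rest.remove(goal)
            return rest
        return None
    _, g1, g2 = goal                   # tensor: G1 takes what it needs,
    rest = prove_lazy(g1, delta)       # and G2 receives the leftovers
    return None if rest is None else prove_lazy(g2, rest)

# Proving a ⊗ b from {a, b} consumes everything, with no split enumeration.
leftover = prove_lazy(("tensor", "a", "b"), ["a", "b"])
```

A faithful interpreter would also backtrack over resource choices; the point here is only the input-output treatment of contexts.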
Linear Objects: logical processes with built-in inheritance
, 1990
"... We present a new framework for amalgamating two successful programming paradigms: logic programming and objectoriented programming. From the former, we keep the declarative reading of programs. From the latter, we select two crucial notions: (i) the ability for objects to dynamically change their ..."
Abstract

Cited by 207 (6 self)
We present a new framework for amalgamating two successful programming paradigms: logic programming and object-oriented programming. From the former, we keep the declarative reading of programs. From the latter, we select two crucial notions: (i) the ability for objects to dynamically change their internal state during the computation; (ii) the structured representation of knowledge, generally obtained via inheritance graphs among classes of objects. We start with the approach, introduced in concurrent logic programming languages, which identifies objects with proof processes and object states with arguments occurring in the goals of a given process. This provides a clean, side-effect-free account of the dynamic behavior of objects in terms of the search tree, the only dynamic entity in logic programming languages. We integrate this view of objects with an extension of logic programming, which we call Linear Objects, based on the possibility of having multiple literals in the head of a program clause. This contains within itself the basis for a flexible form of inheritance, and maintains the constructive property of Prolog of returning definite answer substitutions as output of the proof of non-ground goals. The theoretical background for Linear Objects is Linear Logic, a logic recently introduced to provide a theoretical basis for the study of concurrency. We also show that Linear Objects can be considered a constructive restriction of full Classical Logic. We illustrate the expressive power of Linear Objects compared to Prolog by several examples from the object-oriented domain, but we also show that it can be used to provide elegant solutions for problems arising in the standard style of logic programming.
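The idea of multiple literals in the head of a clause can be pictured, in a much-simplified form, as multiset rewriting: a clause with heads H1, H2 and body B replaces {H1, H2} in the current goal multiset with B. The following is a hypothetical Python sketch (the predicate names are illustrative, and real Linear Objects proof search is considerably richer):

```python
from collections import Counter

def apply_clause(state, heads, body):
    """If all head literals occur in the goal multiset `state`,
    replace them with the body literals; otherwise return None."""
    s, h = Counter(state), Counter(heads)
    if any(s[k] < n for k, n in h.items()):
        return None                       # clause does not apply
    return sorted((s - h + Counter(body)).elements())

# An object changing state: the process counter(3), together with the
# message inc, rewrites to counter(4) -- state change without side effects.
state = apply_clause(["counter(3)", "inc"],
                     ["counter(3)", "inc"], ["counter(4)"])
```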
Extending definite clause grammars with scoping constructs
 7th Int. Conf. Logic Programming
, 1990
"... Definite Clause Grammars (DCGs) have proved valuable to computational linguists since they can be used to specify phrase structured grammars. It is well known how to encode DCGs in Horn clauses. Some linguistic phenomena, such as fillergap dependencies, are difficult to account for in a completely ..."
Abstract

Cited by 25 (4 self)
Definite Clause Grammars (DCGs) have proved valuable to computational linguists since they can be used to specify phrase-structure grammars. It is well known how to encode DCGs in Horn clauses. Some linguistic phenomena, such as filler-gap dependencies, are difficult to account for in a completely satisfactory way using simple phrase-structure grammar. In the literature on logic grammars there have been several attempts to tackle this problem by making use of special arguments added to the DCG predicates corresponding to the grammatical symbols. In this paper we take a different line, in that we account for filler-gap dependencies by encoding DCGs within hereditary Harrop formulas, an extension of Horn clauses (proposed elsewhere as a foundation for logic programming) where implicational goals and universally quantified goals are permitted. Under this approach, filler-gap dependencies can be accounted for in terms of the operational semantics underlying hereditary Harrop formulas, in a way reminiscent of the treatment of such phenomena in Generalized Phrase Structure Grammar (GPSG). The main features involved in this new formulation of DCGs are mechanisms for providing scope to constants and program clauses along with a mild use of λ-terms and λ-conversion.
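The scoping idea behind the filler-gap treatment can be imitated in a conventional setting: while parsing the clause introduced by a filler, a "gap" rule for noun phrases is temporarily in force, and it disappears once that subparse ends. Below is a hypothetical Python recursive-descent sketch; the tiny grammar and word lists are ours, and the scoped assumption is modelled by an explicit flag rather than by an implicational goal:

```python
# "whom mary loves _" : the filler "whom" licenses an empty np (a gap)
# inside the embedded clause only.

def parse_np(words, gap_allowed):
    if words and words[0] in ("mary", "john"):
        return words[1:]
    if gap_allowed:                 # the temporarily assumed gap rule
        return words                # consume nothing: the np is empty
    return None

def parse_vp(words, gap_allowed):
    if words and words[0] == "loves":
        return parse_np(words[1:], gap_allowed)
    return None

def parse_rel(words):
    # "whom" introduces a filler; the gap rule is in scope only for
    # the embedded sentence, mirroring a scoped assumption D -o G.
    if words and words[0] == "whom":
        rest = parse_np(words[1:], gap_allowed=False)
        if rest is not None:
            return parse_vp(rest, gap_allowed=True)
    return None

ok = parse_rel(["whom", "mary", "loves"]) == []
```

In the hereditary Harrop encoding no such flag is threaded by hand: the gap clause is assumed via an implicational goal, and its scope is enforced by the proof theory itself.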
Objects in Forum
 In Proceedings of the International Logic Programming Symposium
, 1995
"... A logical characterization of the typical features of objectoriented languages could yield a clear semantical counterpart of their operational meaning and, at the same time, it could allow to define a logic programming language in which it is possible to reason over highlycomplex data structures. ..."
Abstract

Cited by 24 (9 self)
A logical characterization of the typical features of object-oriented languages could yield a clear semantical counterpart of their operational meaning and, at the same time, could allow one to define a logic programming language in which it is possible to reason over highly complex data structures. Many approaches to this problem have been proposed in recent years. Classical logic turned out to be unsuitable for modeling complex mechanisms, such as the dynamic modification of the state of objects, in a satisfactory way. Girard's Linear Logic [5] provides the means to handle many operational aspects of programming languages from a proof-theoretical perspective, as shown by Andreoli and Pareschi in [2]. In this paper Forum [11], a presentation of higher-order linear logic, is specialized to deal with state-based systems according to the proof-as-computation perspective. In this setting it is possible to represent a concrete notion of object, assigning a logical meaning to features like ...
Backtrackable State with Linear Assumptions, Continuations and Hidden Accumulator Grammars
"... A set of executable specifications and efficient implementations of backtrackable state persisting over the current ANDcontinuation is investigated. At specification level, our primitive operations are a variant of linear and intuitionistic implications, having as consequent the current continuati ..."
Abstract

Cited by 18 (11 self)
A set of executable specifications and efficient implementations of backtrackable state persisting over the current AND-continuation is investigated. At the specification level, our primitive operations are a variant of linear and intuitionistic implications, having as consequent the current continuation. On top of them, we introduce a form of hypothetical assumptions which use no explicit quantifiers and have an easy and efficient implementation on top of logic programming systems featuring backtrackable destructive assignment and global variables, with simple specifications in terms of translation to side-effect-free Prolog. A variant of Extended DCGs handling multiple streams without the need for a preprocessing technique, Hidden Accumulator Grammars (HAGs), are specified in terms of linear assumptions. For HAGs, efficiency comparable to that of preprocessing techniques is obtained through a WAM-level implementation of backtrackable destructive assignment, supporting nondeterministic execut...
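The implementation device mentioned above, backtrackable destructive assignment, can be sketched with a value trail: each assignment records the value it overwrites, and backtracking unwinds the trail to a saved mark. A hypothetical Python model (class and method names are ours, not the paper's):

```python
class BacktrackableStore:
    """Destructive assignment that can be undone to a choice point."""

    def __init__(self):
        self.cells = {}
        self.trail = []

    def assign(self, key, value):
        # record the previous binding (or its absence) before overwriting
        self.trail.append((key, self.cells.get(key)))
        self.cells[key] = value

    def choice_point(self):
        return len(self.trail)          # mark the current trail depth

    def undo_to(self, mark):
        while len(self.trail) > mark:   # unwind on backtracking
            key, old = self.trail.pop()
            if old is None:
                self.cells.pop(key, None)
            else:
                self.cells[key] = old

store = BacktrackableStore()
store.assign("x", 1)
mark = store.choice_point()
store.assign("x", 2)        # assumption made along this branch
store.undo_to(mark)         # backtrack: the assumption is retracted
```

This is how a linear assumption can persist over the current AND-continuation yet be retracted automatically on backtracking.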
Methods as Assertions
, 1994
"... . A method definition can be viewed as a logical assertion. Whenever we declare a method as the implementation of an operation, we assert that if the operation is invoked on objects of the appropriate types then the method body will satisfy the specification of the operation. This view of methods as ..."
Abstract

Cited by 11 (1 self)
A method definition can be viewed as a logical assertion. Whenever we declare a method as the implementation of an operation, we assert that if the operation is invoked on objects of the appropriate types then the method body will satisfy the specification of the operation. This view of methods as assertions is simple but general. Among its applications are: methods defined on interfaces as well as on classes; an elementary type system for objects that handles multimethods; and a mechanism for method dispatch based on the desired output type as well as on the types of arguments. Further, these applications are compatible with traditional execution models and implementation techniques. Logical reasoning about methods plays a role at compile time, then gets out of the way.
1 Introduction
An object is commonly characterized as a collection of data together with associated procedures, called methods. Each method implements an operation on the object; an operation may have other impleme...
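The dispatch mechanism described in the abstract can be illustrated with a toy dispatcher that treats each method definition as an assertion about argument types. This hypothetical Python sketch dispatches on argument types only; the paper also considers the desired output type, which this sketch omits:

```python
# A method table of (operation, declared argument types, body).
# Declaring a method asserts: "if invoked on arguments of these
# types, the body satisfies the operation's specification".
methods = []

def defmethod(op, arg_types, body):
    methods.append((op, arg_types, body))

def invoke(op, *args):
    """Select the first method whose type assertion matches the call."""
    for name, types, body in methods:
        if (name == op and len(types) == len(args)
                and all(isinstance(a, t) for a, t in zip(args, types))):
            return body(*args)
    raise TypeError(f"no applicable method for {op}")

defmethod("area", (int, int), lambda w, h: w * h)       # rectangle
defmethod("area", (float,), lambda r: 3.14159 * r * r)  # circle

rect_area = invoke("area", 3, 4)
```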
Elimination of Negation in a Logical Framework
, 2000
"... Logical frameworks with a logic programming interpretation such as hereditary Harrop formulae (HHF) [15] cannot express directly negative information, although negation is a useful specification tool. Since negationasfailure does not fit well in a logical framework, especially one endowed with ..."
Abstract

Cited by 10 (3 self)
Logical frameworks with a logic programming interpretation, such as hereditary Harrop formulae (HHF) [15], cannot directly express negative information, although negation is a useful specification tool. Since negation-as-failure does not fit well in a logical framework, especially one endowed with hypothetical and parametric judgements, we adapt the idea of elimination of negation, introduced in [21] for Horn logic, to a fragment of higher-order HHF. This entails finding a middle ground between the Closed World Assumption usually associated with negation and the Open World Assumption typical of logical frameworks; the main technical idea is to isolate a set of programs where static and dynamic clauses do not overlap.
I+: A Multiparadigm Language for Object-Oriented Declarative Programming
 Computer Languages
, 1995
"... This paper presents a multiparadigm language I + which is an integration of the three major programming paradigms: objectoriented, logic and functional. I + has an objectoriented framework in which the notions of classes, objects, methods, inheritance and message passing are supported. Methods m ..."
Abstract

Cited by 8 (0 self)
This paper presents a multiparadigm language I+ which is an integration of the three major programming paradigms: object-oriented, logic and functional. I+ has an object-oriented framework in which the notions of classes, objects, methods, inheritance and message passing are supported. Methods may be specified as clauses or functions; thus the two declarative paradigms are incorporated at the method level of the object-oriented paradigm. In addition, two levels of parallelism may be exploited in I+ programming. Therefore I+ is a multiparadigm language for object-oriented declarative programming as well as parallel programming.
Keywords: Multiparadigm, Object-oriented paradigm, Logic paradigm, Functional paradigm
1 Introduction
A multiparadigm language is a language that supports more than one programming paradigm. Multiparadigm languages are desirable for the following reasons: a programmer can choose the most appropriate paradigm for a particular problem at hand so that...
Object Calculi in Linear Logic
"... Several calculi of objects have been studied in the recent literature, that support the central features of objectbased languages: messages, inheritance, dynamic dispatch, object update and objectextension. We show that a complete semantic account of these features may be given in a fragment of hi ..."
Abstract

Cited by 8 (0 self)
Several calculi of objects have been studied in the recent literature that support the central features of object-based languages: messages, inheritance, dynamic dispatch, object update and object extension. We show that a complete semantic account of these features may be given in a fragment of higher-order linear logic.