Results 1–10 of 116
Logic Programming in a Fragment of Intuitionistic Linear Logic
Abstract

Cited by 303 (40 self)
When logic programming is based on the proof theory of intuitionistic logic, it is natural to allow implications in goals and in the bodies of clauses. Attempting to prove a goal of the form D ⊃ G from the context (set of formulas) Γ leads to an attempt to prove the goal G in the extended context Γ ∪ {D}. Thus, during the bottom-up search for a cut-free proof, contexts, represented as the left-hand side of intuitionistic sequents, grow as stacks. While such an intuitionistic notion of context provides for elegant specifications of many computations, contexts can be made more expressive and flexible if they are based on linear logic. After presenting two equivalent formulations of a fragment of linear logic, we show that the fragment has a goal-directed interpretation, thereby partially justifying calling it a logic programming language. Logic programs based on the intuitionistic theory of hereditary Harrop formulas can be modularly embedded into this linear logic setting. Programming examples taken from theorem proving, natural language parsing, and database programming are presented: each example requires a linear, rather than intuitionistic, notion of context to be modeled adequately. An interpreter for this logic programming language must address the problem of splitting contexts; that is, when attempting to prove a multiplicative conjunction (tensor), say G1 ⊗ G2, from the context ∆, the latter must be split into disjoint contexts ∆1 and ∆2 for which G1 follows from ∆1 and G2 follows from ∆2. Since there is an exponential number of such splits, it is important to delay the choice of a split as much as possible. A mechanism for the lazy splitting of contexts is presented, based on viewing proof search as a process that takes a context, consumes part of it, and returns the rest (to be consumed elsewhere).
In addition, we use collections of Kripke interpretations indexed by a commutative monoid to provide models for this logic programming language and show that logic programs admit a canonical model.
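The lazy-splitting idea in the abstract above reads naturally as an input/output discipline: the prover for G1 receives the whole context and returns whatever it did not consume, which is then handed to the prover for G2. The following toy Python sketch (all names mine) illustrates that shape; it omits the backtracking over alternative resource choices that a real interpreter needs:

```python
# Toy model of lazy context splitting for a tensor goal G1 (x) G2.
# Rather than enumerating all 2^n partitions of the context up front,
# prove G1 against the full context and pass its leftover to G2.

def prove_tensor(prove, g1, g2, context):
    """prove(goal, ctx) returns the leftover ctx on success, or None."""
    leftover = prove(g1, context)   # G1 consumes part of the context
    if leftover is None:
        return None
    return prove(g2, leftover)      # G2 must make do with what remains

def prove_atom(goal, context):
    """Toy prover: an atomic goal consumes one matching resource."""
    if goal in context:
        rest = list(context)
        rest.remove(goal)
        return rest
    return None

# Proving a (x) b from the context [a, b, c] leaves [c] unconsumed.
left = prove_tensor(prove_atom, "a", "b", ["a", "b", "c"])
```

Only one pass over the context is needed per conjunct; the exponential choice of a partition never materializes.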
HiLog: A foundation for higher-order logic programming
 JOURNAL OF LOGIC PROGRAMMING, 1993
Abstract

Cited by 213 (40 self)
We describe a novel logic, called HiLog, and show that it provides a more suitable basis for logic programming than does traditional predicate logic. HiLog has a higher-order syntax and allows arbitrary terms to appear in places where predicates, functions, and atomic formulas occur in predicate calculus. But its semantics is first-order and admits a sound and complete proof procedure. Applications of HiLog are discussed, including DCG grammars, higher-order and modular logic programming, and deductive databases.
Forum: A multiple-conclusion specification logic
 Theoretical Computer Science, 1996
Abstract

Cited by 85 (11 self)
The theory of cut-free sequent proofs has been used to motivate and justify the design of a number of logic programming languages. Two such languages, λProlog and its linear logic refinement, Lolli [15], provide for various forms of abstraction (modules, abstract data types, and higher-order programming) but lack primitives for concurrency. The logic programming language LO (Linear Objects) [2] provides some primitives for concurrency but lacks abstraction mechanisms. In this paper we present Forum, a logic programming presentation of all of linear logic that modularly extends λProlog, Lolli, and LO. Forum, therefore, allows specifications to incorporate both abstractions and concurrency. To illustrate the new expressive strengths of Forum, we specify in it a sequent calculus proof system and the operational semantics of a programming language that incorporates references and concurrency. We also show that the metatheory of linear logic can be used to prove properties of the object languages specified in Forum.
Programming in an Integrated Functional and Logic Language
1999
Abstract

Cited by 65 (14 self)
Escher is a general-purpose, declarative programming language that integrates the best features of both functional and logic programming languages. It has types and modules, higher-order and meta-programming facilities, concurrency, and declarative input/output. The main design aim is to combine in a practical and comprehensive way the best ideas of existing functional and logic languages, such as Haskell and Gödel. In fact, Escher uses Haskell syntax and is most straightforwardly understood as an extension of Haskell. Consequently, this paper discusses Escher from this perspective. It provides an introduction to the Escher language, concentrating largely on the issue of programming style and the Escher programming idioms not provided by Haskell. The extra mechanisms needed to support these idioms are also discussed.
Higher-Order Horn Clauses
 JOURNAL OF THE ACM, 1990
Abstract

Cited by 61 (20 self)
A generalization of Horn clauses to a higher-order logic is described and examined as a basis for logic programming. In qualitative terms, these higher-order Horn clauses are obtained from the first-order ones by replacing first-order terms with simply typed λ-terms and by permitting quantification over all occurrences of function symbols and some occurrences of predicate symbols. Several proof-theoretic results concerning these extended clauses are presented. One result shows that although the substitutions for predicate variables can be quite complex in general, the substitutions necessary in the context of higher-order Horn clauses are tightly constrained. This observation is used to show that these higher-order formulas can specify computations in a fashion similar to first-order Horn clauses. A complete theorem proving procedure is also described for the extension. This procedure is obtained by interweaving higher-order unification with backchaining and goal reductions, and constitutes a higher-order generalization of SLD-resolution. These results have a practical realization in the higher-order logic programming language called λProlog.
Unification of simply typed lambda-terms as logic programming
 In Eighth International Logic Programming Conference, 1991
Abstract

Cited by 56 (3 self)
The unification of simply typed λ-terms modulo the rules of β- and η-conversion is often called “higher-order” unification because of the possible presence of variables of functional type. This kind of unification is undecidable in general, and even when unifiers exist, most general unifiers may not. In this paper, we show that such unification problems can be coded as a query of the logic programming language Lλ in a natural and clear fashion. In a sense, the translation only involves explicitly axiomatizing in Lλ the notions of equality and substitution of the simply typed λ-calculus: the rest of the unification process can be viewed as simply an interpreter of Lλ searching for proofs using those axioms.
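In conventional terms, the β rule that such a translation must axiomatize rewrites (λx.s) t to s with t substituted for x. A naive Python illustration of that one rewrite step (it ignores variable capture, which the paper's Lλ encoding handles properly):

```python
# Naive illustration of the beta rule: ((lam x. s) t) rewrites to
# s with x replaced by t. Substitution here ignores variable capture;
# it is only meant to show the shape of the rule being axiomatized.

def subst(term, x, t):
    kind = term[0]
    if kind == "var":
        return t if term[1] == x else term
    if kind == "app":
        return ("app", subst(term[1], x, t), subst(term[2], x, t))
    # kind == "lam": stop if the binder shadows x
    _, v, body = term
    return term if v == x else ("lam", v, subst(body, x, t))

def beta_step(term):
    """Reduce ((lam x. s) t) one step if term is a beta-redex."""
    if term[0] == "app" and term[1][0] == "lam":
        _, (_, x, s), t = term
        return subst(s, x, t)
    return term

# ((lam x. x x) a)  steps to  (a a)
redex = ("app", ("lam", "x", ("app", ("var", "x"), ("var", "x"))), ("var", "a"))
result = beta_step(redex)
```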
Oracle-Based Checking of Untrusted Software
2001
Abstract

Cited by 55 (3 self)
We present a variant of Proof-Carrying Code (PCC) in which the trusted inference rules are represented as a higher-order logic program, the proof checker is replaced by a nondeterministic higher-order logic interpreter, and the proof is replaced by an oracle implemented as a stream of bits that resolve the nondeterministic interpretation choices. In this setting, Proof-Carrying Code allows the receiver of the code the luxury of using nondeterminism in constructing a simple yet powerful checking procedure. This oracle-based variant of PCC adapts quite naturally to situations where the property being checked is simple or there is a fairly directed search procedure for it. As an example, we demonstrate that if PCC is used to verify type safety of assembly language programs compiled from Java source programs, the oracles that are needed are on average just 12% of the size of the code, which represents an improvement of a factor of 30 over previous syntactic representations of PCC proofs. ...
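The checking procedure can be pictured as a nondeterministic interpreter whose choice points are resolved by consuming bits from the oracle stream instead of by search. A minimal sketch (the rule representation and all names are mine):

```python
from math import ceil, log2

# Sketch of oracle-guided checking: wherever several rules could apply
# to a goal, the checker consumes oracle bits that select one, instead
# of searching. Only the bit stream is shipped alongside the code.

def read_choice(oracle, n):
    """Consume just enough oracle bits to select one of n alternatives."""
    if n <= 1:
        return 0
    index = 0
    for _ in range(ceil(log2(n))):
        index = (index << 1) | next(oracle)
    return index

def check(goal, rules, oracle):
    """rules(goal) -> list of alternative subgoal lists, or None."""
    candidates = rules(goal)
    if not candidates:
        return False
    index = read_choice(oracle, len(candidates))
    if index >= len(candidates):
        return False                      # malformed oracle stream
    return all(check(g, rules, oracle) for g in candidates[index])

# Tiny rule set: "p" holds via subgoal "q" (bit 0) or outright (bit 1).
rules = {"p": [["q"], []], "q": [[]]}.get
ok = check("p", rules, iter([1]))
```

When most goals admit only one applicable rule, almost no bits are consumed, which is one intuition for why such oracles can be so much smaller than explicit proofs.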
Declarative Programming in Escher
1995
Abstract

Cited by 55 (2 self)
(Abstraction) If t is a term of type β and x_α is a variable of type α, then λx_α.t is a term of type α → β. 4. (Application) If s is a term of type α → β and t is a term of type α, then (s t) is a term of type β. 5. (Tupling) If t_1, …, t_n are terms of type α_1, …, α_n, respectively, for some n ≥ 0, then ⟨t_1, …, t_n⟩ is a term of type α_1 × … × α_n. If n = 1, ⟨t_1⟩ is defined to be t_1. If n = 0, the term obtained is the empty tuple, ⟨⟩, which is a term of type 1. A term of type o is called a formula. An occurrence of a variable x_α in a term is bound if it occurs within a subterm of the form λx_α.t. Otherwise, the occurrence is free. A variable in a term is free if it has a free occurrence. Similarly, a variable in a term is bound if it has a bound occurrence. Of course, a variable can be both bound and free in a term. A term is closed if it contains no free variable. I adopt various standard synta...
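Read as an algorithm, the formation rules in this excerpt give a small type checker: abstraction introduces an arrow type, application eliminates one, and tupling forms a product. A sketch under that reading (representation mine; the collapse of one-tuples to their component is not modeled):

```python
# Sketch of a checker for the quoted formation rules. Types are
# ("arrow", a, b), ("prod", t1, ..., tn), or a base-type name.

def typeof(term, env):
    kind = term[0]
    if kind == "var":
        return env[term[1]]
    if kind == "lam":                       # \x_alpha. t : alpha -> beta
        _, x, alpha, body = term
        return ("arrow", alpha, typeof(body, {**env, x: alpha}))
    if kind == "app":                       # (s t) : beta
        s_ty = typeof(term[1], env)
        t_ty = typeof(term[2], env)
        assert s_ty[0] == "arrow" and s_ty[1] == t_ty, "ill-typed application"
        return s_ty[2]
    if kind == "tuple":                     # <t1,...,tn> : a1 x ... x an
        return ("prod", *(typeof(t, env) for t in term[1:]))

# \x_int. <x, x>  has type  int -> (int x int)
ty = typeof(("lam", "x", "int", ("tuple", ("var", "x"), ("var", "x"))), {})
```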
Revisiting Catamorphisms over Datatypes with Embedded Functions (or, Programs from Outer Space)
 In Conf. Record 23rd ACM SIGPLAN/SIGACT Symp. on Principles of Programming Languages, POPL'96, St. Petersburg Beach, 1996
Abstract

Cited by 55 (3 self)
We revisit the work of Paterson and of Meijer & Hutton, which describes how to construct catamorphisms for recursive datatype definitions that embed contravariant occurrences of the type being defined. Their construction requires, for each catamorphism, the definition of an anamorphism that has an inverse-like relationship to that catamorphism. We present an alternative construction, which replaces the stringent requirement that an inverse anamorphism be defined for each catamorphism with a more lenient restriction. The resulting construction has a more efficient implementation than that of Paterson, Meijer, and Hutton, and the relevant restriction can be enforced by a Hindley-Milner type inference algorithm. We provide numerous examples illustrating our method.

1 Introduction

Functional programmers often use catamorphisms (or fold functions) as an elegant means of expressing algorithms over algebraic datatypes. Catamorphisms have also been used by functional programmers as a medium in ...
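For datatypes without embedded contravariant occurrences, a catamorphism is just the familiar fold that replaces each constructor with a supplied function; the paper's difficulty arises only when the datatype embeds functions over itself. The plain case, for orientation (Python, names mine):

```python
# The plain (covariant) case of a catamorphism: a fold over an algebraic
# datatype Expr = Lit(int) | Add(Expr, Expr). The catamorphism replaces
# each constructor with a supplied function.

def cata(lit, add, expr):
    if expr[0] == "lit":
        return lit(expr[1])
    return add(cata(lit, add, expr[1]), cata(lit, add, expr[2]))

# Evaluate an expression tree: interpret Lit as identity and Add as +.
tree = ("add", ("lit", 1), ("add", ("lit", 2), ("lit", 3)))
value = cata(lambda n: n, lambda a, b: a + b, tree)
```

Swapping in different functions for the constructors yields other interpretations of the same tree, such as counting leaves or pretty-printing, without touching the traversal.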