Results 1–10 of 41
Splitting a Logic Program
 Principles of Knowledge Representation
, 1994
Abstract

Cited by 286 (15 self)
In many cases, a logic program can be divided into two parts, so that one of them, the "bottom" part, does not refer to the predicates defined in the "top" part. The "bottom" rules can then be used for the evaluation of the predicates that they define, and the computed values can be used to simplify the "top" definitions. We discuss this idea of splitting a program in the context of the answer set semantics. The main theorem shows how computing the answer sets for a program can be simplified when the program is split into parts. The programs covered by the theorem may use both negation as failure and classical negation, and their rules may have disjunctive heads. The usefulness of the concept of splitting for the investigation of answer sets is illustrated by several applications. First, we show that a conservative extension theorem by Gelfond and Przymusinska and a theorem on the closed world assumption by Gelfond and Lifschitz are easy consequences of the splitting theorem. Second, (locally) stratified programs are shown to have a simple characterization in terms of splitting. The existence and uniqueness of an answer set for such a program can be easily derived from this characterization. Third, we relate the idea of splitting to the notion of order-consistency.
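The splitting idea in this abstract can be sketched for the simplest, negation-free case: evaluate the "bottom" program first, then add its least model as facts when evaluating the "top" part. The `(head, body)` rule encoding and the predicate names below are illustrative assumptions, not the paper's formalism.

```python
# A minimal sketch of splitting for a definite (negation-free) program.
# Rules are (head, [body atoms]) pairs; names are illustrative.

def least_model(rules):
    """Compute the least model of a definite program by fixpoint iteration."""
    model = set()
    changed = True
    while changed:
        changed = False
        for head, body in rules:
            if head not in model and all(b in model for b in body):
                model.add(head)
                changed = True
    return model

# "Bottom" part: defines p and q without referring to the "top" predicates.
bottom = [("p", []), ("q", ["p"])]
# "Top" part: defines r, referring to bottom predicates only in rule bodies.
top = [("r", ["q"])]

# Splitting: evaluate the bottom first, then feed its model to the top as facts.
bottom_model = least_model(bottom)
full_model = least_model(top + [(a, []) for a in bottom_model])
```

Here the bottom model {p, q} lets the single top rule fire, so the combined model is {p, q, r}; the splitting theorem generalizes this two-stage evaluation to answer set semantics with negation and disjunctive heads.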
Fixpoint semantics for logic programming  a survey
, 2000
Abstract

Cited by 126 (0 self)
The variety of semantical approaches that have been invented for logic programs is quite broad, drawing on classical and many-valued logic, lattice theory, game theory, and topology. One source of this richness is the inherent nonmonotonicity of its negation, something that does not have close parallels in the machinery of other programming paradigms. Nonetheless, much of the work on logic programming semantics seems to exist side by side with similar work done for imperative and functional programming, with relatively minimal contact between communities. In this paper we summarize one variety of approaches to the semantics of logic programs: that based on fixpoint theory. We do not attempt to cover much beyond this single area, which is already remarkably fruitful. We hope readers will see the parallels with, and divergences from, the better-known fixpoint treatments developed for other programming methodologies.
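As a minimal instance of the fixpoint machinery the survey covers, here is the immediate-consequence operator T_P of a small definite program, iterated from the empty interpretation until it closes; the three-rule program and the `(head, body)` encoding are illustrative.

```python
# The immediate-consequence operator T_P of a tiny definite program,
# iterated to its least fixpoint. Rule representation is illustrative.

rules = [("a", []), ("b", ["a"]), ("c", ["a", "b"])]

def t_p(interpretation):
    """T_P(I): the set of heads whose body atoms are all true in I."""
    return {head for head, body in rules if all(b in interpretation for b in body)}

# For a definite program T_P is monotone, so iterating from the empty
# interpretation converges to the least fixpoint: the least Herbrand model.
interp = set()
while t_p(interp) != interp:
    interp = t_p(interp)
```

Nonmonotonicity of negation as failure is exactly what breaks this simple picture and motivates the richer fixpoint constructions the survey describes.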
A Coinductive Calculus of Component Connectors
, 2002
Abstract

Cited by 71 (29 self)
Reo is a recently introduced channel-based coordination model, wherein complex coordinators, called connectors, are compositionally built out of simpler ones. Using a more liberal notion of a channel, Reo generalises existing dataflow networks. In this paper, we present a simple and transparent semantical model for Reo, in which connectors are relations on timed data streams. Timed data streams constitute a characteristic of our model and consist of twin pairs of separate data and time streams. Furthermore, coinduction is our main reasoning principle and we use it to prove properties such as connector equivalence.
Extending Classical Logic with Inductive Definitions
, 2000
Abstract

Cited by 68 (45 self)
The goal of this paper is to extend classical logic with a generalized notion of inductive definition supporting positive and negative induction, to investigate the properties of this logic, its relationships to other logics in the area of nonmonotonic reasoning, logic programming and deductive databases, and to show its application for knowledge representation by giving a typology of definitional knowledge.
The Family of Stable Models
, 1993
Abstract

Cited by 62 (4 self)
The family of all stable models for a logic program has a surprisingly simple overall structure, once two naturally occurring orderings are made explicit. In a so-called knowledge ordering based on degree of definedness, every logic program P has a smallest stable model, s_k(P): it is the well-founded model. There is also a dual largest stable model, S_k(P), which has not been considered before. There is another ordering based on degree of truth. Taking the meet and the join, in the truth ordering, of the two extreme stable models s_k(P) and S_k(P) just mentioned yields the alternating fixed points of [29], denoted s_t(P) and S_t(P) here. From s_t(P) and S_t(P) in turn, s_k(P) and S_k(P) can be produced again, using the meet and join of the knowledge ordering. All stable models are bounded by these four valuations. Further, the methods of proof apply not just to logic programs considered classically, but to logic programs over any bilattice meeting certain co...
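The two orderings can be made concrete on Belnap's four-valued structure FOUR, the smallest bilattice of the kind the abstract alludes to; the (true-evidence, false-evidence) encoding below is an illustrative choice, not the paper's notation.

```python
# Belnap's four-valued bilattice FOUR with its truth and knowledge orderings.
# Each value is encoded as a pair (true-evidence, false-evidence):
#   NONE = no evidence, F = evidence for falsity, T = evidence for truth,
#   BOTH = conflicting evidence.
NONE, F, T, BOTH = (0, 0), (0, 1), (1, 0), (1, 1)

def meet_t(a, b):
    """Greatest lower bound in the truth ordering (more true, less false = higher)."""
    return (min(a[0], b[0]), max(a[1], b[1]))

def join_t(a, b):
    """Least upper bound in the truth ordering."""
    return (max(a[0], b[0]), min(a[1], b[1]))

def meet_k(a, b):
    """Knowledge meet ("consensus"): evidence both values agree on."""
    return (min(a[0], b[0]), min(a[1], b[1]))

def join_k(a, b):
    """Knowledge join ("gullibility"): evidence coming from either value."""
    return (max(a[0], b[0]), max(a[1], b[1]))
```

For example, T and F have consensus NONE and knowledge join BOTH, mirroring how the four extreme valuations in the abstract are produced from one another by meets and joins in the two orderings.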
A Predicative Analysis of Structural Recursion
, 1999
Abstract

Cited by 44 (20 self)
We introduce a language based upon lambda calculus with products, coproducts and strictly positive inductive types that allows the definition of recursive terms. We present the implementation (foetus) of a syntactical check that ensures that all such terms are structurally recursive, i.e., recursive calls appear only with arguments structurally smaller than the input parameters of the terms considered. To ensure the correctness of the termination checker, we show that all structurally recursive terms are normalizing with respect to a given operational semantics. To this end, we define a semantics on all types and a structural ordering on the values in this semantics and prove that all values are accessible with regard to this ordering. Finally, we point out how to do this proof predicatively using set-based operators.
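A toy example of the structural recursion that a foetus-style check accepts, with Python lists standing in for the paper's inductive types: the single recursive call receives a strict sub-structure of its input, which is the syntactic condition described above.

```python
# Structurally recursive list length: the recursive call's argument is a
# strict sub-structure of the input, so termination is guaranteed on every
# finite list. Python stands in for the paper's typed lambda calculus.

def length(xs):
    # Base case: the empty list is the smallest value of the list type.
    if not xs:
        return 0
    # xs[1:] is the tail of xs, structurally smaller than xs itself;
    # a foetus-style syntactic check would accept this call.
    return 1 + length(xs[1:])
```

A call like `length(xs)` on the result of `length` itself, or a recursive call on an argument that grows, would be rejected by such a checker, since no structural descent can be established.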
Logic programming revisited: logic programs as inductive definitions
 ACM Transactions on Computational Logic
, 2001
Abstract

Cited by 44 (25 self)
Logic programming has been introduced as programming in the Horn clause subset of first-order logic. This view breaks down for the negation as failure inference rule. To overcome the problem, one line of research has been to view a logic program as a set of iff-definitions. A second approach was to identify a unique canonical, preferred or intended model among the models of the program and to appeal to common sense to validate the choice of such a model. Another line of research developed the view of logic programming as a nonmonotonic reasoning formalism strongly related to Default Logic and Autoepistemic Logic. These competing approaches have resulted in some confusion about the declarative meaning of logic programming. This paper investigates the problem and proposes an alternative epistemological foundation for the canonical model approach, which is not based on common sense but on a solid mathematical information principle. The thesis is developed that logic programming can be understood as a natural and general logic of inductive definitions. In particular, logic programs with negation represent nonmonotone inductive definitions. It is argued that this thesis results in an alternative justification of the well-founded model as the unique intended model of the logic program. In addition, it equips logic programs with an easy-to-comprehend meaning.
A FixedPoint Approach to Stable Matchings and Some Applications
, 2001
Abstract

Cited by 40 (5 self)
We describe a fixed-point-based approach to the theory of bipartite stable matchings. By this, we provide a common framework that links together seemingly distant results, like the stable marriage theorem of Gale and Shapley [11], the Mendelsohn-Dulmage theorem [21], the Kundu-Lawler theorem [19], Tarski's fixed point theorem [32], the Cantor-Bernstein theorem, Pym's linking theorem [22, 23] or the monochromatic path theorem of Sands et al. [29]. In this framework, we formulate a matroid generalization of the stable marriage theorem and study the lattice structure of generalized stable matchings. Based on the theory of lattice polyhedra and blocking polyhedra, we extend results of Vande Vate [33] and Rothblum [28] on the bipartite stable matching polytope.
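For concreteness, here is a sketch of the Gale-Shapley deferred-acceptance algorithm behind the stable marriage theorem cited above. It computes the proposer-optimal stable matching; the dictionary encoding of preference lists and the assumption of complete, equal-length lists on both sides are illustrative simplifications.

```python
# Gale-Shapley deferred acceptance: men propose in preference order, women
# tentatively accept and trade up. Assumes complete preference lists and
# equally many men and women; names and encoding are illustrative.

def gale_shapley(men_prefs, women_prefs):
    free = list(men_prefs)                     # men not yet engaged
    next_choice = {m: 0 for m in men_prefs}    # index of each man's next proposal
    engaged = {}                               # woman -> current partner
    # Precompute each woman's ranking of the men (lower index = preferred).
    rank = {w: {m: i for i, m in enumerate(p)} for w, p in women_prefs.items()}
    while free:
        m = free.pop(0)
        w = men_prefs[m][next_choice[m]]       # m's best woman not yet proposed to
        next_choice[m] += 1
        if w not in engaged:
            engaged[w] = m                     # w accepts her first proposal
        elif rank[w][m] < rank[w][engaged[w]]:
            free.append(engaged[w])            # w trades up; her old partner is freed
            engaged[w] = m
        else:
            free.append(m)                     # w rejects m; he proposes again later
    return {m: w for w, m in engaged.items()}
```

The fixed-point view in the paper recasts this iteration as computing a fixed point of a monotone map on a lattice, which is what links it to Tarski's theorem and the other results listed in the abstract.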
A Logic of Non-Monotone Inductive Definitions and its Modularity Properties
, 2004
Abstract

Cited by 33 (24 self)
Well-known principles of induction include monotone induction and different sorts of non-monotone induction such as inflationary induction, induction over well-ordered sets and iterated induction. In this work, we define a logic formalizing induction over well-ordered sets and monotone and iterated induction. Just as the principle of positive induction has been formalized in FO(LFP), and the principle of inflationary induction has been formalized in FO(IFP), this paper formalizes the principle of iterated induction in a new logic for Non-Monotone Inductive Definitions (NMID-logic). The semantics of the logic is strongly influenced by the well-founded semantics of logic programming.
Symbolic Transition Graph with Assignment
, 1996
Abstract

Cited by 32 (2 self)
A new model for message-passing processes is proposed which generalizes the notion of symbolic transition graph as introduced in [HL95], by allowing assignments to be carried in transitions. The main advantage of this generalization is that a wider class of processes can be represented as finite state graphs. Two kinds of operational semantics, ground and symbolic, are given to such graphs. On top of these, both ground and symbolic bisimulations are defined and are shown to agree with each other.