Results 1–10 of 51
Relating Defeasible and Normal Logic Programming through Transformation Properties
, 2001
Cited by 76 (31 self)
This paper relates the Defeasible Logic Programming (DeLP) framework and its semantics SEM_DeLP to classical logic programming frameworks. In DeLP we distinguish between two different sorts of rules: strict and defeasible rules. Negative literals (∼A) in these rules are considered to represent classical negation. In contrast, in normal logic programming (NLP) there is only one kind of rule, but the meaning of negative literals (not A) is different: they represent a kind of negation as failure, and thereby introduce defeasibility. Various semantics have been defined for NLP, notably the well-founded semantics WFS and the stable semantics Stable. In this paper we consider the transformation properties for NLP introduced by Brass and Dix, suitably adjusted for the DeLP framework. We show which transformation properties are satisfied, thereby identifying aspects in which NLP and DeLP differ. We contend that the transformation rules presented in this paper can be...
Knowledge Representation with Logic Programs
 DEPT. OF CS OF THE UNIVERSITY OF KOBLENZ-LANDAU
, 1996
Cited by 38 (6 self)
In this tutorial overview, which resulted from a lecture course given by the authors at
Equivalence in answer set programming
 In Proc. LOPSTR 2001, LNCS 2372
, 2001
Cited by 25 (5 self)
We study the notion of strong equivalence between two Answer Set programs, and we show how some particular cases of testing strong equivalence between programs can be reduced to verifying whether a formula is a theorem in intuitionistic or classical logic. We present some program transformations for disjunctive programs, which can be used to simplify the structure of programs and reduce their size. These transformations are shown to be of interest for both computational and theoretical reasons. We then propose how to generalize such transformations to deal with free programs (which allow the use of default negation in the head of clauses). We also present a linear-time transformation that can reduce an augmented logic program (which allows nested expressions in both the head and body of clauses) to a program consisting only of standard disjunctive clauses and constraints.
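The notion this abstract studies can be illustrated with the standard SE-model (here-and-there) characterization: two programs are strongly equivalent iff they have the same SE-models. Below is a minimal brute-force sketch in Python for propositional normal programs over a tiny hypothetical two-atom signature; rules are encoded as (head, positive_body, negative_body) triples, an encoding chosen for illustration only.

```python
from itertools import combinations

ATOMS = frozenset({"a", "b"})  # illustrative two-atom signature

def powerset(s):
    s = sorted(s)
    return [frozenset(c) for r in range(len(s) + 1) for c in combinations(s, r)]

def reduct(program, interp):
    # Gelfond-Lifschitz reduct: drop rules blocked by the interpretation,
    # strip the negative bodies from the remaining rules.
    return [(h, pos) for (h, pos, neg) in program if not (neg & interp)]

def satisfies(interp, positive_program):
    # classical satisfaction of a negation-free program
    return all(h in interp for (h, pos) in positive_program if pos <= interp)

def se_models(program):
    # SE-models: pairs (X, Y) with X <= Y, Y |= P, and X |= reduct(P, Y)
    models = set()
    for Y in powerset(ATOMS):
        if not satisfies(Y, reduct(program, Y)):
            continue
        for X in powerset(Y):
            if satisfies(X, reduct(program, Y)):
                models.add((X, Y))
    return models

def strongly_equivalent(p, q):
    return se_models(p) == se_models(q)

# P: a :- not b.    Q: a.    Both have the single answer set {a} ...
P = [("a", frozenset(), frozenset({"b"}))]
Q = [("a", frozenset(), frozenset())]
# ... yet they are not strongly equivalent: adding the fact b gives
# P + {b.} the answer set {b}, but Q + {b.} the answer set {a, b}.
print(strongly_equivalent(P, Q))   # False
# Adding the tautology a :- a preserves strong equivalence.
print(strongly_equivalent(P, P + [("a", frozenset({"a"}), frozenset())]))  # True
```

Real checkers do not enumerate models like this; as the abstract describes, they reduce the question to theoremhood in intuitionistic (here-and-there) or classical logic.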
The Computational Complexity of Ideal Semantics
Cited by 22 (5 self)
We analyse the computational complexity of the recently proposed ideal semantics within both abstract argumentation frameworks (AFs) and assumption-based argumentation frameworks (ABFs). It is shown that while typically less tractable than credulous admissibility semantics, the natural decision problems arising with this extension-based model can, perhaps surprisingly, be decided more efficiently than sceptical preferred semantics. In particular, the task of finding the unique ideal extension is easier than that of deciding if a given argument is accepted under the sceptical semantics. We provide efficient algorithmic approaches for the class of bipartite argumentation frameworks and, finally, present a number of technical results which offer strong indications that typical problems in ideal argumentation are complete for the class P^C_|| of languages decidable by polynomial-time algorithms allowed to make non-adaptive queries to a C oracle, where C is an upper bound on the computational complexity of deciding credulous acceptance: C = NP for AFs and logic programming (LP) instantiations of ABFs; C = Σ^p_2 for ABFs modelling default theories. Key words: computational properties of argumentation; abstract argumentation
A General Theory of Confluent Rewriting Systems for Logic Programming and its Applications
, 2001
Cited by 22 (13 self)
Recently, Brass and Dix showed (Journal of Automated Reasoning 20(1), 1998) that the well-founded semantics WFS can be defined as a confluent calculus of transformation rules. This led not only to a simple extension to disjunctive programs (Journal of Logic Programming 38(3), 1999), but also to a new computation of the well-founded semantics which is linear for a broad class of programs. We take this approach as a starting point and generalize it considerably by developing a general theory of confluent LP-systems CS. Such a system CS is a rewriting system on the set of all logic programs over a fixed signature L, and it induces in a natural way a canonical semantics. Moreover, we show four important applications of this theory: (1) most of the well-known semantics are induced by confluent LP-systems, (2) there are many more transformation rules that lead to confluent LP-systems, (3) semantics induced by such systems can be used to model aggregation, (4) the new systems can be ...
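To make the rewriting-calculus idea concrete, here is a minimal Python sketch of four of the elementary Brass–Dix transformations (success, failure, positive reduction, negative reduction) applied to a hypothetical toy program; full WFS additionally needs loop detection for programs with positive cycles, which this sketch omits. The transformations are applied simultaneously until a fixpoint, which is just one of the application orders that confluence guarantees all yield the same residual program.

```python
ATOMS = frozenset({"a", "b", "c", "d", "p", "q"})  # illustrative signature

def residual(program):
    """Apply four elementary transformations until a fixpoint:
    success, failure, positive reduction, negative reduction.
    Confluence means the residual program does not depend on the
    order in which the transformations are applied."""
    rules = set(program)  # rules are (head, frozenset, frozenset) triples
    while True:
        facts = {h for (h, pos, neg) in rules if not pos and not neg}
        heads = {h for (h, _, _) in rules}
        new = set()
        for h, pos, neg in rules:
            if neg & facts:
                continue                       # negative reduction
            pos = pos - facts                  # success
            if pos - heads:
                continue                       # failure
            neg = neg & heads                  # positive reduction
            new.add((h, frozenset(pos), frozenset(neg)))
        if new == rules:
            return rules
        rules = new

def three_valued_model(program, atoms):
    # read the well-founded model off the residual program
    # (correct for programs without positive loops)
    res = residual(program)
    true = {h for (h, pos, neg) in res if not pos and not neg}
    false = atoms - {h for (h, _, _) in res}
    return frozenset(true), frozenset(false), atoms - true - false

# a.   b :- a, not c.   d :- c.   p :- not q.   q :- not p.
PROG = [("a", frozenset(), frozenset()),
        ("b", frozenset({"a"}), frozenset({"c"})),
        ("d", frozenset({"c"}), frozenset()),
        ("p", frozenset(), frozenset({"q"})),
        ("q", frozenset(), frozenset({"p"}))]
true, false, undef = three_valued_model(PROG, ATOMS)
# a and b come out true, c and d false, p and q remain undefined
```

The residual program is exactly the "canonical form" such a rewriting system converges to; the induced semantics reads truth values directly off it.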
Disjunctive Logic Programming: A Survey And Assessment
, 2002
Cited by 17 (0 self)
We describe the fields of disjunctive logic programming and disjunctive deductive databases from the time of their inception to the current time. Contributions with respect to semantics, implementations and applications are surveyed.
K.: A theory of forgetting in logic programming
 In: AAAI
Cited by 12 (9 self)
The study of forgetting for reasoning has attracted considerable attention in AI. However, much of the work on forgetting, and on related approaches such as independence, irrelevance and novelty, has been restricted to classical logics. This paper describes a detailed theoretical investigation of the notion of forgetting in the context of logic programming. We first provide a semantic definition of forgetting under the answer set semantics for extended logic programs. We then discuss desirable properties and some motivating examples. An important result of this study is an algorithm for computing the result of forgetting in a logic program. Furthermore, we present a modified version of the algorithm and show that the time complexity of the new algorithm is polynomial in the size of the given logic program if the size of certain rules is fixed. We show how the proposed theory of forgetting can be used to characterize logic program updates.
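A semantic account of the kind this abstract refers to can be sketched as follows (this is an assumption about the definition, stated here only for illustration): the answer sets after forgetting an atom p are the subset-minimal sets among the answer sets of the original program with p deleted. A brute-force Python sketch under that assumption, with a hypothetical rule encoding and atom set:

```python
from itertools import combinations

ATOMS = frozenset({"a", "b", "c"})  # illustrative signature

def powerset(s):
    s = sorted(s)
    return [frozenset(c) for r in range(len(s) + 1) for c in combinations(s, r)]

def answer_sets(program, atoms):
    # brute force: Y is an answer set iff Y is the least model of
    # the Gelfond-Lifschitz reduct of the program w.r.t. Y
    result = []
    for Y in powerset(atoms):
        red = [(h, pos) for (h, pos, neg) in program if not (neg & Y)]
        lm = frozenset()
        while True:
            step = lm | {h for (h, pos) in red if pos <= lm}
            if step == lm:
                break
            lm = step
        if lm == Y:
            result.append(Y)
    return result

def forget_answer_sets(program, atoms, p):
    # assumed semantic account of forgetting: keep the subset-minimal
    # sets among the answer sets of the program with p deleted
    projected = [x - {p} for x in answer_sets(program, atoms)]
    return [x for x in projected if not any(y < x for y in projected)]

# P: a.   b :- not c.   c :- not b.
P = [("a", frozenset(), frozenset()),
     ("b", frozenset(), frozenset({"c"})),
     ("c", frozenset(), frozenset({"b"}))]
# AS(P) = {a, b} and {a, c}; deleting c leaves {a, b} and {a},
# and subset-minimization keeps only {a}.
print(forget_answer_sets(P, ATOMS, "c"))
```

The minimization step is what distinguishes forgetting from mere projection: the choice between b and c existed only because of c, so it disappears along with c.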
Advanced Preprocessing for Answer Set Solving
Cited by 12 (5 self)
We introduce the first substantial approach to preprocessing in the context of answer set solving. The idea is to simplify a logic program while identifying equivalences among its relevant constituents. These equivalences are then used for building a compact representation of the program (in terms of Boolean constraints). We implemented our approach as well as a SAT-based technique to reduce Boolean constraints. This allows us to empirically analyze both preprocessing types and to demonstrate their computational impact.
Improving the Alternating Fixpoint: The Transformation Approach
, 1997
Cited by 9 (0 self)
We present a bottom-up algorithm for the computation of the well-founded model of non-disjunctive logic programs which is based on the set of elementary program transformations studied by Brass and Dix [4, 5]. The transformation approach has been introduced in more detail in [7]. In this paper we present a deeper analysis of its complexity and describe an optimized SCC-oriented evaluation. We show that by our method no more work is done than by the alternating fixpoint procedure [23, 24], and that there are examples where our algorithm is significantly superior.

1 Introduction

It is likely that the next generation of deductive database systems will support the full class of normal programs and that the well-founded semantics will be chosen by nearly all system designers, because it has a unique model. Whereas the SLG-resolution of Chen and Warren [10, 11, 12] is an elaborate top-down method for the computation of the well-founded model of a normal program that already led to a full...
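The alternating fixpoint procedure the abstract compares against is short enough to sketch: let Γ(I) be the least model of the Gelfond–Lifschitz reduct of the program w.r.t. I; iterating Γ from the empty set produces an increasing sequence of surely-true atoms and a decreasing sequence of possibly-true atoms, whose limits give the well-founded model. A minimal Python sketch on a hypothetical three-rule program:

```python
ATOMS = frozenset({"p", "q", "r", "s"})  # illustrative signature

def least_model(positive_program):
    # least fixpoint of the immediate-consequence operator
    lm = frozenset()
    while True:
        step = lm | {h for (h, pos) in positive_program if pos <= lm}
        if step == lm:
            return lm
        lm = step

def gamma(program, interp):
    # Gamma(I): least model of the Gelfond-Lifschitz reduct w.r.t. I
    return least_model([(h, pos) for (h, pos, neg) in program
                        if not (neg & interp)])

def well_founded(program, atoms):
    # alternating fixpoint: K underestimates the true atoms,
    # U overestimates them; iterate Gamma until both stabilize
    K = frozenset()
    U = gamma(program, K)
    while True:
        K2 = gamma(program, U)
        U2 = gamma(program, K2)
        if (K2, U2) == (K, U):
            return K, atoms - U, U - K   # true, false, undefined
        K, U = K2, U2

# p :- not q.   q :- not p.   r :- not s.   (no rule for s)
PROG = [("p", frozenset(), frozenset({"q"})),
        ("q", frozenset(), frozenset({"p"})),
        ("r", frozenset(), frozenset({"s"}))]
true, false, undef = well_founded(PROG, ATOMS)
# r comes out true, s false, while p and q stay undefined
```

Each Γ call recomputes a least model from scratch, which is what the transformation-based method avoids; this is the redundancy the paper's SCC-oriented evaluation targets.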