Complexity and Expressive Power of Logic Programming
, 1997
Abstract

Cited by 279 (56 self)
This paper surveys various complexity results on different forms of logic programming. The main focus is on decidable forms of logic programming, in particular, propositional logic programming and datalog, but we also mention general logic programming with function symbols. Next to classical results on plain logic programming (pure Horn clause programs), more recent results on various important extensions of logic programming are surveyed. These include logic programming with different forms of negation, disjunctive logic programming, logic programming with equality, and constraint logic programming. The complexity of the unification problem is also addressed.
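To make the decidable datalog fragment concrete, here is a minimal sketch (not from the paper; the `naive_tc` name and tuple encoding of facts are illustrative choices) of naive bottom-up evaluation of the classic transitive-closure program:

```python
def naive_tc(edges):
    """Naive bottom-up evaluation of the datalog program
         path(X,Y) :- edge(X,Y).
         path(X,Z) :- edge(X,Y), path(Y,Z).
    All rules are re-applied until no new fact appears (the fixpoint)."""
    path = set()
    while True:
        new = set(edges) | {(x, z) for (x, y) in edges
                            for (y2, z) in path if y == y2}
        if new <= path:          # fixpoint reached: nothing new derived
            return path
        path |= new

# For a chain 1 -> 2 -> 3 -> 4, the fixpoint holds all reachable pairs.
closure = naive_tc({(1, 2), (2, 3), (3, 4)})
```

Because datalog has no function symbols, the Herbrand base is finite and this loop always terminates, which is exactly why the fragment stays decidable.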
Prioritized Logic Programming and Its Application to Commonsense Reasoning
, 2000
Abstract

Cited by 41 (1 self)
Representing and reasoning with priorities are important in commonsense reasoning. This paper introduces a framework of prioritized logic programming (PLP), which has a mechanism of explicit representation of priority information in a program. When a program contains incomplete or indefinite information, PLP is useful for specifying preference to reduce nondeterminism in logic programming. Moreover, PLP can realize various forms of commonsense reasoning in AI such as abduction, default reasoning, circumscription, and their prioritized variants. The proposed framework increases the expressive power of logic programming and exploits new applications in knowledge representation.
Keywords: prioritized logic programs, abduction, default reasoning, prioritized circumscription
1 Introduction
In commonsense reasoning a theory is usually assumed incomplete and may contain indefinite or conflicting knowledge. Under such circumstances, priority information is useful to select appropriate know...
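As a toy illustration of how explicit priorities prune nondeterminism (a hypothetical selection scheme invented here, not the paper's PLP semantics), one can rank atoms and keep only the maximally preferred answer sets:

```python
def preferred(models, ranking):
    """Select maximally preferred models under a total priority ranking
    of atoms (ranking[0] is the most important). One model beats another
    if the highest-priority atom on which they differ belongs to it.
    This is a toy selection scheme, not the paper's PLP semantics."""
    def beats(a, b):
        for atom in ranking:
            if (atom in a) != (atom in b):
                return atom in a
        return False
    return [m for m in models if not any(beats(o, m) for o in models)]
```

For example, given candidate answer sets `{fly}` and `{walk}` with the ranking `["fly", "walk"]`, only the `{fly}` set survives the selection.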
Possible Model Semantics for Disjunctive Databases
, 1989
Abstract

Cited by 41 (4 self)
This paper presents a novel approach to the semantics of deductive databases. The possible model semantics is introduced as an alternative approach to the classical minimal model semantics. The possible model semantics can distinguish both inclusive and exclusive disjunctions, and provide a flexible mechanism for inferring negation in disjunctive databases. The possible model semantics is characterized by a new fixpoint semantics of disjunctive databases. A proof procedure called the SLDP resolution is presented and shown to be sound and complete with respect to the possible model semantics.
1 Introduction
The declarative semantics of logic programming and deductive databases has been widely studied based on the minimal model semantics by incorporating an inference rule for negation. For definite Horn databases, the unique least Herbrand model gives a declarative meaning of a program [VK76], and the closed world assumption (CWA) [Rei78] provides negative information as the facts fal...
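The inclusive/exclusive distinction can be sketched for the propositional case: a possible model chooses some non-empty subset of disjuncts from each disjunctive fact, so both readings of a disjunction appear (a simplified rendition for illustration, not the paper's full fixpoint construction):

```python
from itertools import combinations

def possible_models(disjunctions):
    """Possible models of a set of propositional disjunctive facts:
    every choice of a non-empty subset of disjuncts from each
    disjunction, so inclusive and exclusive readings both appear."""
    models = [frozenset()]
    for disj in disjunctions:
        choices = [set(c) for r in range(1, len(disj) + 1)
                   for c in combinations(disj, r)]
        models = [frozenset(m | c) for m in models for c in choices]
    return sorted(set(models), key=sorted)
```

For the single fact `a ∨ b` this yields `{a}`, `{b}`, and `{a, b}`; minimal model semantics would keep only the first two, and it is the extra non-minimal model that lets the semantics treat the disjunction inclusively.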
Reflection in logic, functional and object-oriented programming: a short comparative study
 Proc. of the IJCAI’95 Workshop on Reflection and Meta-Level Architectures and their Applications in AI, 1995
Abstract

Cited by 39 (1 self)
Département d’informatique et de recherche opérationnelle
Updating knowledge bases while maintaining their consistency
 VLDB J
, 1995
Abstract

Cited by 29 (14 self)
Abstract. When updating a knowledge base, several problems may arise. One of the most important is integrity constraint satisfaction. The classic approach to this problem has been to develop methods for checking whether a given update violates an integrity constraint. An alternative approach consists of trying to repair integrity constraint violations by performing additional updates that maintain knowledge base consistency. Another major problem in knowledge base updating is view updating, which determines how an update request should be translated into an update of the underlying base facts. We propose a new method for updating knowledge bases while maintaining their consistency. Our method can be used for both integrity constraint maintenance and view updating. It can also be combined with any integrity checking method for view updating and integrity checking. The kinds of updates handled by our method are: updates of base facts, view updates, updates of deductive rules, and updates of integrity constraints. Our method is based on events and transition rules, which explicitly define the insertions and deletions induced by a knowledge base update. Using these rules, an extension of the SLDNF procedure allows us to obtain all possible minimal ways of updating a knowledge base without violating any integrity constraint.
Key Words. View updating, integrity checking, integrity maintenance.
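The flavour of view updating can be shown with a deliberately tiny propositional sketch (the names and the one-level rule format are invented for illustration; the paper's actual method uses events and transition rules with an extended SLDNF procedure):

```python
def translate_insert(view, rules, base):
    """All minimal sets of base-fact insertions that make `view`
    derivable. Propositional and one rule level deep: each rule is
    (head, body) and body atoms are assumed to be base facts."""
    candidates = []
    for head, body in rules:
        if head == view:
            # the repair for this rule is whatever its body still lacks
            candidates.append(frozenset(b for b in body if b not in base))
    # keep only the minimal repairs (discard proper supersets)
    return sorted({c for c in candidates
                   if not any(o < c for o in candidates)}, key=sorted)

RULES = [("path_ac", ["edge_ab", "edge_bc"]),   # path via b
         ("path_ac", ["edge_ac"])]              # direct edge
repairs = translate_insert("path_ac", RULES, base={"edge_ab"})
```

Requesting the view fact `path_ac` when only `edge_ab` is stored yields two minimal translations: insert `edge_bc`, or insert `edge_ac`.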
Abstract Interpretation over Non-Deterministic Finite Tree Automata for Set-Based Analysis of Logic Programs
 In Fourth International Symposium on Practical Aspects of Declarative Languages, number 2257 in LNCS
, 2002
Abstract

Cited by 29 (10 self)
Abstract. Set-based program analysis has many potential applications, including compiler optimisations, type-checking, debugging, verification and planning. One method of set-based analysis is to solve a set of set constraints derived directly from the program text. Another approach is based on abstract interpretation (with widening) over an infinite-height domain of regular types. Up till now only deterministic types have been used in abstract interpretations, whereas solving set constraints yields non-deterministic types, which are more precise. It was pointed out by Cousot and Cousot that set constraint analysis of a particular program P could be understood as an abstract interpretation over a finite domain of regular tree grammars, constructed from P. In this paper we define such an abstract interpretation for logic programs, formulated over a domain of non-deterministic finite tree automata, and describe its implementation. Both goal-dependent and goal-independent analyses are considered. Variations on the abstract domain operations are introduced, and we discuss the associated trade-offs of precision and complexity. The experimental results indicate that this approach is a practical way of achieving the precision of set constraints in the abstract interpretation framework.
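The precision gap between deterministic and non-deterministic types can be illustrated with a toy bottom-up NFTA membership check (an invented example, not the paper's abstract domain): the automaton below accepts exactly {f(a,b), f(b,a)}, whereas any deterministic top-down description of those shapes must also admit f(a,a) and f(b,b).

```python
def states(term, delta):
    """States an NFTA can reach bottom-up on `term`.
    A term is (symbol, (child, ...)); a transition is
    (symbol, (child_state, ...), result_state)."""
    sym, kids = term
    kid_states = [states(k, delta) for k in kids]
    return {q for (s, args, q) in delta
            if s == sym and len(args) == len(kids)
            and all(a in ks for a, ks in zip(args, kid_states))}

# Transitions for an NFTA whose language is exactly {f(a,b), f(b,a)}:
DELTA = {("a", (), "qa"), ("b", (), "qb"),
         ("f", ("qa", "qb"), "qf"), ("f", ("qb", "qa"), "qf")}
A, B = ("a", ()), ("b", ())
```

Here `states(("f", (A, B)), DELTA)` reaches the accepting state `qf`, while `f(a,a)` reaches no state at all, so it is rejected.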
Scalable Authoritative OWL Reasoning for the Web
, 2009
Abstract

Cited by 29 (12 self)
In this paper we discuss the challenges of performing reasoning on large-scale RDF datasets from the Web. Using ter Horst's pD* fragment of OWL as a base, we compose a rule-based framework for application to web data: we argue our decisions using observations of undesirable examples taken directly from the Web. We further temper our OWL fragment through consideration of "authoritative sources", which counteracts an observed behaviour which we term "ontology hijacking": new ontologies published on the Web redefining the semantics of existing entities resident in other ontologies. We then present our system for performing rule-based forward-chaining reasoning, which we call SAOR: Scalable Authoritative OWL Reasoner. Based upon observed characteristics of web data and reasoning in general, we design our system to scale: it is based upon a separation of terminological data from assertional data and comprises a lightweight in-memory index, on-disk sorts and file-scans. We evaluate our methods on a dataset in the order of a hundred million statements collected from real-world web sources and present scale-up experiments on a dataset in the order of a billion statements collected from the Web.
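The core loop of rule-based forward chaining over triples can be sketched as follows (a naive in-memory toy with two hand-written RDFS-style rules; SAOR's actual implementation applies its pD* rule set over on-disk sorted data and additionally performs the authority checks described above):

```python
def forward_chain(triples, rules):
    """Apply each rule to the current closure until no rule derives
    anything new (the fixpoint)."""
    closed = set(triples)
    changed = True
    while changed:
        changed = False
        for rule in rules:
            derived = rule(closed) - closed
            if derived:
                closed |= derived
                changed = True
    return closed

def subclass_transitivity(ts):
    # (a subClassOf b), (b subClassOf c) => (a subClassOf c)
    return {(a, "subClassOf", c)
            for (a, p, b) in ts if p == "subClassOf"
            for (b2, p2, c) in ts if p2 == "subClassOf" and b2 == b}

def type_inheritance(ts):
    # (x type c), (c subClassOf d) => (x type d)
    return {(x, "type", d)
            for (x, p, c) in ts if p == "type"
            for (c2, p2, d) in ts if p2 == "subClassOf" and c2 == c}

DATA = {("Dog", "subClassOf", "Mammal"),
        ("Mammal", "subClassOf", "Animal"),
        ("rex", "type", "Dog")}
closure = forward_chain(DATA, [subclass_transitivity, type_inheritance])
```

From the three input triples the closure additionally derives that `Dog` is a subclass of `Animal` and that `rex` has types `Mammal` and `Animal`.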
Déjà vu in fixpoints of logic programs
 in Proceedings of the North American Conference on Logic Programming
, 1989
Abstract

Cited by 26 (5 self)
We investigate properties of logic programs that permit refinements in their fixpoint evaluation and shed light on the choice of control strategy. A fundamental aspect of a bottom-up computation is that we must constantly check to see if the fixpoint has been reached. If the computation iteratively applies all rules, bottom-up, until the fixpoint is reached, this amounts to checking if any new facts were produced after each iteration. Such a check also enhances efficiency in that duplicate facts need not be reused in subsequent iterations, if we use the Semi-naive fixpoint evaluation strategy. However, the cost of this check is a significant component of the cost of bottom-up fixpoint evaluation, and for many programs the full check is unnecessary. We identify properties of programs that enable us to infer that a much simpler check (namely, whether any fact was produced in the previous iteration) suffices. While it is in general undecidable whether a given program has these properties, we develop techniques to test sufficient conditions, and we illustrate these techniques on some simple programs that have these properties. The significance of our results lies in the significantly larger class of programs for which bottom-up evaluation methods, enhanced with the optimizations that we propose, become competitive with standard (top-down) implementations of logic programs. This increased efficiency is achieved without compromising the completeness of the bottom-up approach; this is in contrast to the incompleteness that accompanies the depth-first search strategy that is central to most top-down implementations.
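The Semi-naive strategy and the simpler "did the last iteration produce anything?" check that the abstract discusses can be sketched for transitive closure (an illustrative toy, not the paper's analysis):

```python
def semi_naive_tc(edges):
    """Semi-naive bottom-up transitive closure: each round joins only
    the delta (facts first derived in the previous round) against the
    base edges, and the loop stops when the delta comes back empty --
    the simpler termination check."""
    total = set(edges)
    delta = set(edges)
    while delta:
        delta = {(x, z) for (x, y) in delta
                 for (y2, z) in edges if y == y2} - total
        total |= delta
    return total
```

Because previously derived facts are never re-joined, each pair is processed once, and the full "any new facts?" comparison against the whole relation degenerates to testing whether `delta` is empty.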
Confluence in Concurrent Constraint Programming
 In Alagar and Nivat, editors, Proceedings of AMAST '95, LNCS 936
, 1996
Abstract

Cited by 26 (12 self)
Concurrent constraint programming (ccp), like most concurrent paradigms, has a mechanism of global choice which makes computations dependent on the scheduling of processes. This is one of the main reasons why the formal semantics of ccp is more complicated than that of its deterministic and local-choice sublanguages. In this paper we study various subsets of ccp obtained by adding some restriction on the notion of choice, or by requiring confluence, i.e. independence from the scheduling strategy. We show that it is possible to define simple denotational semantics for these subsets, for various notions of observables. Finally, as an application of our results we develop a framework for the compositional analysis of full ccp. The basic idea is to approximate an arbitrary ccp program by a program in the restricted language, and then analyze the latter by applying the standard techniques of abstract interpretation to its denotational semantics.
ON REPRESENTATIONAL ISSUES ABOUT COMBINATIONS OF CLASSICAL THEORIES WITH NON-MONOTONIC RULES
, 2006
Abstract

Cited by 19 (13 self)
In the context of current efforts around Semantic Web languages, the combination of classical theories in classical first-order logic (and in particular of ontologies in various description logics) with rule languages rooted in logic programming is receiving considerable attention. Existing approaches such as SWRL, dl-programs, and DL+log differ significantly in the way ontologies interact with (nonmonotonic) rule bases. In this paper, we identify fundamental representational issues which need to be addressed by such combinations and formulate a number of formal principles which help to characterize and classify existing and possible future approaches to the combination of rules and classical theories. We use the formal principles to explicate the underlying assumptions of current approaches. Finally, we propose a number of settings, based on our analysis of the representational issues and the fundamental principles underlying current approaches.