Results 1 - 10 of 28
Abduction in Logic Programming
Abstract

Cited by 538 (73 self)
Abduction in Logic Programming started in the late 80s, early 90s, in an attempt to extend logic programming into a framework suitable for a variety of problems in Artificial Intelligence and other areas of Computer Science. This paper aims to chart out the main developments of the field over the last ten years and to take a critical view of these developments from several perspectives: logical, epistemological, computational and suitability to application. The paper attempts to expose some of the challenges and prospects for the further development of the field.
Safety and translation of relational calculus queries
 ACM Transactions on Database Systems
, 1991
Abstract

Cited by 59 (0 self)
Not all queries in relational calculus can be answered sensibly when disjunction, negation, and universal quantification are allowed. The class of relational calculus queries or formulas that have sensible answers is called the domain independent class, which is known to be undecidable. Subsequent research has focused on identifying large decidable subclasses of domain independent formulas. In this paper we investigate the properties of two such classes: the evaluable formulas and the allowed formulas. Although both classes have been defined before, we give simplified definitions, present short proofs of their main properties, and describe a method to incorporate equality. Although evaluable queries have sensible answers, it is not straightforward to compute them efficiently or correctly. We introduce relational algebra normal form for formulas, from which the correct translation into relational algebra is trivial. We give algorithms to transform an evaluable formula into an equivalent allowed formula and from there into relational algebra normal form. Our algorithms avoid use of the so-called Dom relation, consisting of all constants appearing in the database or the query. Finally, we describe a restriction under which every domain independent formula is evaluable.
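The safety issue this abstract describes can be made concrete with a small sketch. This is an illustrative example, not code from the paper: the query {x | ¬P(x)} is domain dependent because its answer grows with the underlying domain, while restricting x to another relation yields a domain independent (and hence safely translatable) query.

```python
# Illustrative sketch of domain (in)dependence; relations P and Q and all
# values are hypothetical, not taken from the paper.

def evaluate_not_p(domain, p):
    """Answer the calculus query {x | not P(x)} over an explicit domain."""
    return {x for x in domain if x not in p}

def evaluate_q_not_p(domain, p, q):
    """Answer the 'safe' variant {x | Q(x) and not P(x)}."""
    return {x for x in domain if x in q and x not in p}

p = {1, 2}
q = {2, 3}

# Domain DEPENDENT: the answer changes as the domain grows.
print(evaluate_not_p({1, 2, 3}, p))        # {3}
print(evaluate_not_p({1, 2, 3, 4}, p))     # {3, 4}

# Domain INDEPENDENT: the answer is stable under domain growth.
print(evaluate_q_not_p({1, 2, 3}, p, q))   # {3}
print(evaluate_q_not_p({1, 2, 3, 4}, p, q))  # {3}
```

The second query corresponds to the relational algebra expression Q − P, which is exactly the kind of direct translation the paper's normal form is meant to enable.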
Logic and Databases: a 20 Year Retrospective
, 1996
Abstract

Cited by 55 (1 self)
At a workshop held in Toulouse, France in 1977, Gallaire, Minker and Nicolas stated that logic and databases was a field in its own right (see [131]). This was the first time that this designation was made. The impetus for this started approximately twenty years ago, in 1976, when I visited Gallaire and Nicolas in Toulouse, France, which culminated in the 1977 workshop. It is appropriate, then, to provide an assessment of what has been achieved in the twenty years since the field started as a distinct discipline. In this retrospective I shall review developments that have taken place in the field, assess the contributions that have been made, consider the status of implementations of deductive databases, and discuss the future of work in this area. 1 Introduction As described in [234], the use of logic and deduction in databases started in the late 1960s. Prominent among the developments was the work by Levien and Maron [202, 203, 199, 200, 201] and Kuhns [1...
Conceptual Modelling of Database Applications Using an Extended ER Model
, 1992
Abstract

Cited by 40 (8 self)
In this paper, we motivate and present a data model for conceptual design of structural and behavioural aspects of databases. We follow an object-centered design paradigm in the spirit of semantic data models. The specification of structural aspects is divided into modelling of object structures and modelling of data types used for describing object properties. The specification of object structures is based on an Extended Entity-Relationship (EER) model. The specification of behavioural aspects is divided into the modelling of admissible database state evolutions by means of temporal integrity constraints and the formulation of database (trans)actions. The central link for integrating these design components is a descriptive logic-based query language for the EER model. The logic part of this language is the basis for static constraints and descriptive action specifications by means of pre- and postconditions. A temporal extension of this logic is the specification language for tem...
Query Answering in Information Systems with Integrity Constraints
, 1997
Abstract

Cited by 34 (0 self)
The specifications of most of today's ubiquitous information systems include integrity constraints, i.e. conditions rejecting so-called "invalid" or "inconsistent" data. Information system consistency and query answering have been formalized with reference to classical logic, implicitly assuming that query answering only makes sense for consistent information systems. In practice, however, inconsistent as well as consistent information systems need to be queried. In this paper, it is first argued that classical logic is inappropriate for a formalization of information systems because of its global notion of inconsistency. It is claimed that information system inconsistency should instead be understood as a local notion. Then, it is shown that minimal logic, a constructivist weakening of classical logic which precludes refutation proofs, provides for local inconsistencies that conveniently reflect a practitioner's intuition. Further, minimal logic is shown to be a convenient foundation fo...
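The contrast between global and local inconsistency drawn in this abstract can be sketched in a few lines. This is a hedged illustration, not the paper's minimal-logic formalism: under classical logic one contradictory pair of facts entails every query (ex falso quodlibet), while a local view answers queries from facts untouched by the contradiction.

```python
# Hedged sketch: atoms and facts are invented for illustration.
# Facts are (atom, sign) pairs; a contradiction is an atom asserted both ways.

facts = {
    ("emp_alice", True), ("emp_alice", False),  # a contradictory pair
    ("dept_sales", True),                       # an unrelated, clean fact
}

def local_inconsistencies(facts):
    """Atoms asserted both true and false: the LOCAL inconsistencies."""
    atoms = {a for (a, _) in facts}
    return {a for a in atoms if (a, True) in facts and (a, False) in facts}

def classical_entails(facts, query):
    # Ex falso quodlibet: an inconsistent theory classically entails anything.
    return bool(local_inconsistencies(facts)) or (query, True) in facts

def local_entails(facts, query):
    # Local view: answer only from facts not touched by a contradiction.
    return (query, True) in facts and query not in local_inconsistencies(facts)

print(classical_entails(facts, "moon_is_cheese"))  # True - clearly unhelpful
print(local_entails(facts, "dept_sales"))          # True
print(local_entails(facts, "moon_is_cheese"))      # False
```

The local reading matches the practitioner's intuition the abstract appeals to: a contradiction about one employee record should not poison answers about an unrelated department.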
From Relational to Object-Oriented Integrity Simplification
, 1991
Abstract

Cited by 29 (7 self)
Relational integrity checking technology can be transferred to deductive object bases by utilizing a simple logical framework for objects. The principles of object identity, aggregation and classification allow more efficient constraint control through finer granularity of updates, composite updates and semantic constraint simplification. In many cases, metalevel constraints and deductive rules can be handled efficiently by a stepwise compilation approach. An extended integrity subsystem with these features has been implemented in the deductive object base ConceptBase. (This work was supported in part by the Commission of the European Community under ESPRIT Basic Research Action 3012, CompuLog. A version of this paper will also appear in the Proc. Second Int. Conf. on Deductive and Object-Oriented Databases, Munich, Dec. 1991.) 1. Introduction Comprehensive and efficient integrity maintenance has been quoted as one of the major problems in next-generation databases. Systems l...
Integrity Checking in Deductive Databases
 In Proceedings of the VLDB International Conference
, 1987
Abstract

Cited by 23 (2 self)
We describe the theory and implementation of a general theorem-proving technique for checking integrity of deductive databases, recently proposed by Sadri and Kowalski. The method uses an extension of the SLDNF proof procedure and achieves the effect of the simplification algorithms of Nicolas, Lloyd, Topor et al., and Decker by reasoning forwards from the update, thus focusing on the relevant parts of the database and the relevant constraints.
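The core idea of reasoning forwards from the update can be sketched as follows. This is an illustrative toy, not the Sadri-Kowalski procedure itself: the relations, constraint, and indexing scheme are all invented. After an insertion, only constraints that mention the updated relation are checked, and only against the new tuple, rather than re-evaluating every constraint over the whole database.

```python
# Hypothetical sketch of update-focused integrity checking; all names
# and facts are illustrative.

database = {("works", "alice", "sales"), ("dept", "sales")}

def check_works_has_dept(db, new_fact=None):
    """Constraint: every department someone works in must exist in `dept`."""
    rows = [new_fact] if new_fact else [f for f in db if f[0] == "works"]
    return all(("dept", d) in db for (_, _, d) in rows)

# Constraints indexed by the relation whose updates can violate them.
constraints = {"works": [check_works_has_dept]}

def insert(db, fact):
    db = db | {fact}
    # Focused check: only constraints relevant to the updated relation,
    # instantiated with the new fact, instead of a global re-evaluation.
    for check in constraints.get(fact[0], []):
        if not check(db, fact):
            raise ValueError(f"integrity violation on insert of {fact}")
    return db

db = insert(database, ("works", "bob", "sales"))  # accepted
# insert(db, ("works", "carol", "hr")) would raise: no ("dept", "hr") fact
```

The saving is the same one the simplification algorithms aim for: the cost of a check scales with the update, not with the size of the database.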
Integrity Constraint and Rule Maintenance in Temporal Deductive Knowledge Bases
, 1993
Abstract

Cited by 22 (7 self)
The enforcement of semantic integrity constraints in data and knowledge bases constitutes a major performance bottleneck. Integrity constraint simplification methods aim at reducing the complexity of formula evaluation at runtime. This paper proposes such a simplification method for large and semantically rich knowledge bases. Structural, temporal and assertional knowledge, in the form of deductive rules and integrity constraints, is represented in Telos, a hybrid language for knowledge representation. A compilation method performs a number of syntactic, semantic and temporal transformations on integrity constraints and deductive rules, and organizes the simplified forms in a dependence graph that allows for efficient computation of implicit updates. Precomputation of potential implicit updates at compile time is possible by computing the transitive closure of the dependence graph. To account for dynamic changes to the dependence graph caused by updates of constraints and rules, we propose efficient algorithms for the incremental maintenance of the computed transitive closure.
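The incremental transitive-closure maintenance mentioned here can be sketched in a few lines. This is an assumed, generic scheme (not Telos or the paper's algorithms): when an edge x → y is added, it suffices to connect every predecessor of x to every successor of y, rather than recomputing the closure from scratch.

```python
# Illustrative sketch; node names are hypothetical. Edges point from
# constraints/rules to the relations they depend on.

def closure(edges):
    """Full transitive closure by iterating to a fixpoint."""
    tc = set(edges)
    changed = True
    while changed:
        changed = False
        for (a, b) in list(tc):
            for (c, d) in list(tc):
                if b == c and (a, d) not in tc:
                    tc.add((a, d))
                    changed = True
    return tc

def add_edge(tc, x, y):
    """Incremental update: link all predecessors of x to all successors of y."""
    preds = {a for (a, b) in tc if b == x} | {x}
    succs = {d for (c, d) in tc if c == y} | {y}
    return tc | {(a, d) for a in preds for d in succs}

tc = closure({("constraint1", "rule1"), ("rule1", "base_rel")})
tc2 = add_edge(tc, "base_rel", "update_event")
# constraint1 now transitively depends on update_event without a full recompute
```

The incremental step touches only paths through the new edge, which is what makes the precomputed closure affordable to keep current under rule and constraint updates.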
Partial Deduction of the Ground Representation and its Application to Integrity Checking
 Proceedings of ILPS'95, the International Logic Programming Symposium
, 1995
Abstract

Cited by 19 (12 self)
Integrity constraints are very useful in many contexts, such as deductive databases and abductive and inductive logic programming. However, fully testing the integrity constraints after each update or modification can be very expensive, and methods have been developed which simplify the integrity constraints. In this paper, we pursue the goal of writing this simplification procedure as a metaprogram in logic programming and then using partial deduction to obtain precompiled integrity checks for certain update patterns. We argue that the ground representation has to be used to write this metaprogram declaratively. However, we also show that, contrary to what one might expect, current partial deduction techniques are then unable to specialise this metainterpreter in an interesting way, and no precompilation of integrity checks can be obtained. In fact, we show that partial deduction (alone) is not able to perform any (sophisticated) specialisation at the object level for meta...