Results 1-10 of 45
Logic and databases: a deductive approach
ACM Computing Surveys, 1984
Abstract
Cited by 143 (2 self)
The purpose of this paper is to show that logic provides a convenient formalism for studying classical database problems. There are two main parts to the paper, devoted respectively to conventional databases and deductive databases. In the first part, we focus on query languages, integrity modeling and maintenance, query optimization, and data
Tableau Techniques For Querying Information Sources Through Global Schemas
In Proc. of the 7th Int. Conf. on Database Theory (ICDT'99), volume 1540 of Lecture Notes in Computer Science, 1999
Abstract
Cited by 118 (6 self)
The foundational homomorphism techniques introduced by Chandra and Merlin for testing containment of conjunctive queries have recently attracted renewed interest due to their central role in information integration applications. We show that generalizations of the classical tableau representation of conjunctive queries are useful for computing query answers in information integration systems where information sources are modeled as views defined on a virtual global schema. We consider a general situation where sources may or may not be known to be correct and complete. We characterize the set of answers to a global query and give algorithms to compute a finite representation of this possibly infinite set, as well as its certain and possible approximations. We show how to rewrite a global query in terms of the sources in two special cases, and show that one of these is equivalent to the Information Manifold rewriting of Levy et al.
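The Chandra-Merlin test this abstract builds on can be sketched in a few lines: Q1 is contained in Q2 exactly when there is a homomorphism from Q2's body into Q1's body viewed as a frozen database, preserving head variables. The sketch below is a minimal backtracking search under simplifying assumptions (both queries use the same head variable names, body terms are all variables); the dict-based query representation is an ad-hoc convention, not from the paper.

```python
def homomorphism_exists(q_from, q_to):
    """Search for a homomorphism mapping q_from's body atoms into
    q_to's body atoms, fixing q_from's head variables pointwise
    (assumes shared head variable names between the two queries)."""
    atoms_from, atoms_to = q_from["body"], q_to["body"]

    def extend(mapping, i):
        if i == len(atoms_from):
            # head variables must map to themselves (shared-names assumption)
            return all(mapping.get(v, v) == v for v in q_from["head"])
        pred, args = atoms_from[i]
        for pred2, args2 in atoms_to:
            if pred2 != pred or len(args2) != len(args):
                continue
            new = dict(mapping)
            if all(new.setdefault(a, b) == b for a, b in zip(args, args2)):
                if extend(new, i + 1):
                    return True
        return False

    return extend({}, 0)

def contained_in(q1, q2):
    """Chandra-Merlin: Q1 is contained in Q2 iff Q2 maps
    homomorphically onto Q1's canonical (frozen) body."""
    return homomorphism_exists(q2, q1)
```

For example, Q1(x) :- R(x,y), R(y,x) is contained in Q2(x) :- R(x,y), but not vice versa, since R(y,x) has no image in Q2's single atom.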
A general Datalog-based framework for tractable query answering over ontologies
In Proc. PODS 2009, ACM, 2009
Abstract
Cited by 69 (18 self)
Ontologies play a key role in the Semantic Web [4], data modeling, and information integration [16]. Recent trends in ontological reasoning have shifted from decidability issues to tractability ones, as e.g. reflected by the work on the DL-Lite family of tractable description logics (DLs) [11, 19]. An important result of these works is that the main
Horn clauses and database dependencies
Journal of the ACM, 1982
Abstract
Cited by 60 (6 self)
Abstract. Certain first-order sentences, called "dependencies," about relations in a database are defined and studied. These dependencies seem to include all previously defined dependencies as special cases. A new concept is introduced, called "faithfulness (with respect to direct product)," which enables powerful results to be proved about the existence of "Armstrong relations" in the presence of these new dependencies. (An Armstrong relation is a relation that obeys precisely those dependencies that are the logical consequences of a given set of dependencies.) Results are also obtained about characterizing the class of projections of those relations that obey a given set of dependencies.
Taming the infinite chase: Query answering under expressive relational constraints
In Proc. of KR 2008, 2008
Abstract
Cited by 51 (12 self)
The chase algorithm is a fundamental tool for query evaluation and for testing query containment under tuple-generating dependencies (TGDs) and equality-generating dependencies (EGDs). So far, most of the research on this topic has focused on cases where the chase procedure terminates. This paper introduces expressive classes of TGDs defined via syntactic restrictions: guarded TGDs (GTGDs) and weakly guarded sets of TGDs (WGTGDs). For these classes, the chase procedure is not guaranteed to terminate and thus may have an infinite outcome. Nevertheless, we prove that the problems of conjunctive-query answering and query containment under such TGDs are decidable. We provide decision procedures and tight complexity bounds for these problems. Then we show how EGDs can be incorporated into our results by providing conditions under which EGDs do not harmfully interact with TGDs and do not affect the decidability and complexity of query answering. We show applications of the aforesaid classes of constraints to the problem of answering conjunctive queries in F-Logic Lite, an object-oriented ontology language, and in some tractable Description Logics.
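A single round of the chase that this abstract refers to can be sketched as follows. This is a minimal oblivious-chase step: for every homomorphic match of a TGD body into the instance, the head is added with fresh labelled nulls for the existential variables. The fact representation (predicate name plus argument tuple) and names like `chase_step` are illustrative conventions, not from the paper, and body terms are assumed to be variables.

```python
import itertools

_null_counter = itertools.count()  # fresh labelled nulls _n0, _n1, ...

def matches(body, instance, mapping=None, i=0):
    """Enumerate homomorphisms from the TGD body into the instance."""
    if mapping is None:
        mapping = {}
    if i == len(body):
        yield dict(mapping)
        return
    pred, args = body[i]
    for fpred, fargs in instance:
        if fpred != pred or len(fargs) != len(args):
            continue
        new = dict(mapping)
        if all(new.setdefault(a, b) == b for a, b in zip(args, fargs)):
            yield from matches(body, instance, new, i + 1)

def chase_step(instance, tgd):
    """Apply one oblivious chase round for one TGD (body, head):
    for every body match, add the head facts, inventing a fresh null
    for each existential head variable (one per match)."""
    body, head = tgd
    new_facts = set()
    for h in matches(body, instance):
        h = dict(h)
        for pred, args in head:
            for a in args:
                if a not in h:  # existential variable: invent a null
                    h[a] = f"_n{next(_null_counter)}"
            new_facts.add((pred, tuple(h[a] for a in args)))
    return new_facts - instance
```

On the instance {Person(alice)} with the TGD Person(x) → ∃y HasParent(x, y), one step produces a single new fact HasParent(alice, _n0) with a fresh null; repeating steps on rules whose heads re-trigger their own bodies is exactly what makes termination non-obvious.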
Probabilistic data exchange
In Proc. ICDT, 2010
Abstract
Cited by 28 (5 self)
The work reported here lays the foundations of data exchange in the presence of probabilistic data. This requires rethinking the very basic concepts of traditional data exchange, such as solution, universal solution, and the certain answers of target queries. We develop a framework for data exchange over probabilistic databases, and make a case for its coherence and robustness. This framework applies to arbitrary schema mappings, and finite or countably infinite probability spaces on the source and target instances. After establishing this framework and formulating the key concepts, we study the application of the framework to a concrete and practical setting where probabilistic databases are compactly encoded by means of annotations formulated over random Boolean variables. In this setting, we study the problems of testing for the existence of solutions and universal solutions, materializing such solutions, and evaluating target queries (for unions of conjunctive queries) in both the exact sense and the approximate sense. For each of the problems, we carry out a complexity analysis based on properties of the annotation, in various classes of dependencies. Finally, we show that the framework and results easily and completely generalize to allow not only the data, but also the schema mapping itself to be probabilistic.
Implication Problems for Functional Constraints on Databases Supporting Complex Objects
Journal of Computer and System Sciences, 1995
Abstract
Cited by 26 (12 self)
Virtually all semantic or object-oriented data models assume objects have an identity separate from any of their parts, and allow users to define complex object types in which part values may be any other objects. In [20], a more general form of functional dependency is proposed for such models in which component attributes may correspond to descriptions of property paths, called path functional dependencies (PFDs). The main contribution of the reference is a sound and complete axiomatization for PFDs when databases may be infinite. However, a number of issues were left open which are resolved in this paper. We first prove that the same axiomatization remains complete if PFDs are permitted empty left-hand sides, but that this is not true if logical consequence is defined with respect to finite databases. We then prove that the implication problem for arbitrary PFDs is decidable. The proof suggests a means of characterizing an important function closure which is then used to derive an effective procedure for constructing a deterministic finite-state automaton representing the closure. The procedure is further refined to efficient polynomial-time algorithms for the implication problem for cases in which antecedent PFDs are a form of complex key constraint.
Index Terms: constraints, functional dependencies, object-oriented data models, complex objects, implication problems
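For ordinary (non-path) functional dependencies, the classic polynomial-time implication test underlying results like these is attribute closure: FDs imply lhs → rhs exactly when rhs is contained in the closure of lhs. A minimal sketch, assuming FDs are given as (lhs, rhs) attribute-set pairs (an ad-hoc representation, not the PFD formalism of the paper):

```python
def attribute_closure(attrs, fds):
    """Compute the closure of an attribute set under FDs given as
    (lhs, rhs) pairs of attribute sets: repeatedly fire any FD whose
    left-hand side is already contained in the closure."""
    closure = set(attrs)
    changed = True
    while changed:
        changed = False
        for lhs, rhs in fds:
            if set(lhs) <= closure and not set(rhs) <= closure:
                closure |= set(rhs)
                changed = True
    return closure

def implies(fds, fd):
    """Does the FD set logically imply fd = (lhs, rhs)?
    Standard polynomial-time test via attribute closure."""
    lhs, rhs = fd
    return set(rhs) <= attribute_closure(lhs, fds)
```

With {A → B, B → C}, the closure of {A} is {A, B, C}, so A → C is implied while C → A is not.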
Rewriting ontological queries into small non-recursive Datalog programs
Abstract
Cited by 26 (1 self)
We consider the setting of ontological database access, where an ABox is given in form of a relational database D and where a Boolean conjunctive query q has to be evaluated against D modulo a TBox Σ formulated in DL-Lite or Linear Datalog±. It is well-known that (Σ, q) can be rewritten into an equivalent non-recursive Datalog program P that can be directly evaluated over D. However, for Linear Datalog± or for DL-Lite versions that allow for role inclusion, the rewriting methods described so far result in a non-recursive Datalog program P of size exponential in the joint size of Σ and q. This gives rise to the interesting question of whether such a rewriting necessarily needs to be of exponential size. In this paper we show that it is actually possible to translate (Σ, q) into a polynomially sized equivalent non-recursive Datalog program P.
Walking the Complexity Lines for Generalized Guarded Existential Rules
In Proceedings of the Twenty-Second International Joint Conference on Artificial Intelligence
Abstract
Cited by 17 (6 self)
We establish complexities of the conjunctive query entailment problem for classes of existential rules (i.e. Tuple-Generating Dependencies or Datalog± rules). Our contribution is twofold. First, we introduce the class of greedy bounded-treewidth sets (gbts), which covers guarded rules and their known generalizations, namely (weakly) frontier-guarded rules. We provide a generic algorithm for query entailment with gbts, which is worst-case optimal for combined complexity with bounded predicate arity, as well as for data complexity. Second, we classify several gbts classes, whose complexity was unknown, namely frontier-one, frontier-guarded and weakly frontier-guarded rules, with respect to combined complexity (with bounded and unbounded predicate arity) and data complexity.
Datalog±: a unified approach to ontologies and integrity constraints
In Proceedings of the 12th International Conference on Database Theory, 2009
Abstract
Cited by 15 (2 self)
We report on a recently introduced family of expressive extensions of Datalog, called Datalog±, which is a new framework for representing ontological axioms in form of integrity constraints, and for query answering under such constraints. Datalog± is derived from Datalog by allowing existentially quantified variables in rule heads, and by enforcing suitable properties in rule bodies, to ensure decidable and efficient query answering. We first present different languages in the Datalog± family, providing tight complexity bounds for all cases but one (where we have a low-complexity AC0 upper bound). We then show that such languages are general enough to capture the most common tractable ontology languages. In particular, we show that the DL-Lite family of description logics and F-Logic Lite are expressible in Datalog±. We finally show how stratified negation can be added to Datalog± while keeping ontology querying tractable in the data complexity. Datalog± is a natural and very general framework that can be successfully employed in different contexts such as data integration and exchange. This survey mainly summarizes two recent papers.