Results 1–10 of 51
Maintenance of Materialized Views: Problems, Techniques, and Applications
, 1995
Cited by 281 (9 self)
Abstract:
In this paper we motivate and describe materialized views, their applications, and the problems and techniques for their maintenance. We present a taxonomy of view maintenance problems based upon the class of views considered, upon the resources used to maintain the view, upon the types of modifications to the base data that are considered during maintenance, and whether the technique works for all instances of databases and modifications. We describe some of the view maintenance techniques proposed in the literature in terms of our taxonomy. Finally, we consider new and promising application domains that are likely to drive work in materialized views and view maintenance. 1 Introduction What is a view? A view is a derived relation defined in terms of base (stored) relations. A view thus defines a function from a set of base tables to a derived table; this function is typically recomputed every time the view is referenced. What is a materialized view? A view can be materialized by storin...
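As a hedged illustration of the idea (not taken from the paper), incremental maintenance of a simple aggregate view can be sketched as follows; the relation name `emp`, the grouping attribute, and the class design are invented for the example:

```python
# Sketch: incrementally maintaining a materialized COUNT view instead of
# recomputing it on every reference. Names and schema are illustrative only.

from collections import Counter

class MaterializedCountView:
    """Materializes SELECT dept, COUNT(*) FROM emp GROUP BY dept."""

    def __init__(self, base_rows):
        # Full recomputation happens only once, at creation time.
        self.counts = Counter(dept for dept, _name in base_rows)

    def apply_insert(self, row):
        dept, _name = row
        self.counts[dept] += 1          # maintain the view, don't recompute

    def apply_delete(self, row):
        dept, _name = row
        self.counts[dept] -= 1
        if self.counts[dept] == 0:
            del self.counts[dept]       # drop empty groups

emp = [("db", "ann"), ("db", "bob"), ("ai", "eve")]
view = MaterializedCountView(emp)
view.apply_insert(("ai", "joe"))
view.apply_delete(("db", "bob"))
# view.counts == {"db": 1, "ai": 2}
```

Each update touches only the affected group, which is the cost advantage the survey's maintenance techniques generalize to more expressive view classes.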
Logic Programming and Negation: A Survey
 JOURNAL OF LOGIC PROGRAMMING
, 1994
Cited by 243 (8 self)
Abstract:
We survey here various approaches which were proposed to incorporate negation in logic programs. We concentrate on the proof-theoretic and model-theoretic issues and the relationships between them.
DynFO: A Parallel, Dynamic Complexity Class
 Journal of Computer and System Sciences
, 1994
Cited by 49 (4 self)
Abstract:
Traditionally, computational complexity has considered only static problems. Classical complexity classes such as NC, P, and NP are defined in terms of the complexity of checking, upon presentation of an entire input, whether the input satisfies a certain property. For many applications of computers it is more appropriate to model the process as a dynamic one. There is a fairly large object being worked on over a period of time. The object is repeatedly modified by users and computations are performed. We develop a theory of Dynamic Complexity. We study the new complexity class, Dynamic First-Order Logic (DynFO). This is the set of properties that can be maintained and queried in first-order logic, i.e. relational calculus, on a relational database. We show that many interesting properties are in DynFO, including multiplication, graph connectivity, bipartiteness, and the computation of minimum spanning trees. Note that none of these problems is in static FO, and this f...
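The flavor of dynamic maintenance can be sketched with one of the listed properties, graph connectivity. Note this is only a loose illustration: DynFO maintains auxiliary relations with first-order update formulas and also handles deletions, whereas this union-find sketch covers only the insertion-only case; the class name and API are invented.

```python
# Sketch: maintaining graph connectivity under edge insertions, keeping
# auxiliary data between updates so queries are cheap. Insertion-only;
# a genuine DynFO construction also supports deletions.

class DynamicConnectivity:
    def __init__(self, n):
        self.parent = list(range(n))    # auxiliary data kept across updates

    def find(self, x):
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]  # path halving
            x = self.parent[x]
        return x

    def insert_edge(self, u, v):
        self.parent[self.find(u)] = self.find(v)

    def connected(self, u, v):          # the maintained query
        return self.find(u) == self.find(v)

g = DynamicConnectivity(5)
g.insert_edge(0, 1)
g.insert_edge(3, 4)
# g.connected(0, 1) is True; g.connected(0, 3) is False
```

The point mirrored from the abstract: answering the query never re-examines the whole graph, only the maintained auxiliary structure.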
First-Order Incremental Evaluation of Datalog Queries
 Annals of Mathematics and Artificial Intelligence
, 1993
Cited by 49 (17 self)
Abstract:
We consider the problem of repeatedly evaluating the same (computationally expensive) query against a database that is being updated between successive query requests. In this situation, it should be possible to use the difference between successive database states and the answer to the query in one state to reduce the cost of evaluating the query in the next state. We use first-order queries to compute the differences, and call this process "first-order incremental query evaluation." After formalizing the notion of first-order incremental query evaluation, we give an algorithm that constructs, for each regular chain query (including transitive closure as a special case), a nonrecursive program to compute the difference between the answer after an update and the answer before the update. We then extend this result to weakly regular queries, which are regular chain programs augmented with conjunctive queries having the so-called Cartesian-closed increment property, and to the case of unbound...
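For the transitive-closure special case mentioned in the abstract, the non-recursive update can be sketched concretely; this uses the standard textbook delta formula for a single edge insertion, not the paper's own construction, and the function names are invented.

```python
# Sketch: first-order (non-recursive) incremental update of transitive
# closure when an edge (a, b) is inserted. The old answer `tc` is reused;
# no recursive recomputation is needed.

def insert_edge(tc, a, b):
    """tc is the set of reachable pairs (x, y); update it for new edge (a, b)."""
    into_a = {x for (x, y) in tc if y == a} | {a}   # nodes reaching a
    from_b = {y for (x, y) in tc if x == b} | {b}   # nodes reachable from b
    # Non-recursive delta: a single join over the old answer, no fixpoint.
    tc |= {(x, y) for x in into_a for y in from_b}
    return tc

tc = {(1, 2)}                 # edge 1 -> 2 already in the closure
tc = insert_edge(tc, 2, 3)
# tc now contains (1, 2), (2, 3), and the newly derivable pair (1, 3)
```

The update is expressible without recursion precisely because every new reachable pair must pass through the inserted edge.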
Local Verification of Global Integrity Constraints in Distributed Databases
 In Proceedings of the ACM SIGMOD International Conference on Management of Data
, 1993
Cited by 47 (7 self)
Abstract:
We present an optimization for integrity constraint verification in distributed databases. The optimization allows a global constraint, i.e. a constraint spanning multiple databases, to be verified by accessing data at a single database, eliminating the cost of accessing remote data. The optimization is based on an algorithm that takes as input a global constraint and data to be inserted into a local database. The algorithm produces a local condition such that if the local data satisfies this condition then, based on the previous satisfaction of the global constraint, the global constraint is still satisfied. If the local data does not satisfy the condition, then a conventional global verification procedure is required. 1 Introduction A clear trend in information systems technology is the distribution of related data across multiple sites. Such systems may vary from tightly-coupled parallel databases to federated information systems. In all cases, one benefit of data distribution is t...
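The shape of such a local condition can be sketched with an invented example (the constraint, the sites, and the bound are all assumptions, not from the paper): suppose the global constraint is that every value inserted at site A must be at most every value stored at remote site B, and site A caches a bound known from the last global check to be at most min(B).

```python
# Sketch: local verification of a global constraint. If the new value fits
# under the locally cached bound, the global constraint provably still
# holds (given it held before); otherwise fall back to a global check.

def try_local_insert(local_a, local_bound, new_value):
    """Return True if the insert is verified locally; False means a
    conventional global verification against site B is required."""
    if new_value <= local_bound:
        local_a.append(new_value)   # safe: constraint held before, and the
        return True                 # new value is within the proven bound
    return False                    # remote access needed to decide

site_a = [3, 5]
bound = 7            # established during the previous global verification
# try_local_insert(site_a, bound, 6) -> True, no remote access
# try_local_insert(site_a, bound, 9) -> False, global check needed
```

The cached bound plays the role of the "previous satisfaction" the abstract relies on: it is the only remote knowledge the local check needs.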
Query Answering in Information Systems with Integrity Constraints
, 1997
Cited by 34 (0 self)
Abstract:
The specifications of most of today's ubiquitous information systems include integrity constraints, i.e. conditions rejecting so-called "invalid" or "inconsistent" data. Information system consistency and query answering have been formalized with reference to classical logic, implicitly assuming that query answering only makes sense with consistent information systems. In practice, however, inconsistent as well as consistent information systems need to be queried. In this paper, it is first argued that classical logic is inappropriate for a formalization of information systems because of its global notion of inconsistency. It is claimed that information system inconsistency should be understood as a local notion. Then, it is shown that minimal logic, a constructivist weakening of classical logic which precludes refutation proofs, provides for local inconsistencies that conveniently reflect a practitioner's intuition. Further, minimal logic is shown to be a convenient foundation fo...
From Relational to Object-Oriented Integrity Simplification
, 1991
Cited by 29 (7 self)
Abstract:
1 Relational integrity checking technology can be transferred to deductive object bases by utilizing a simple logical framework for objects. The principles of object identity, aggregation and classification allow more efficient constraint control through finer granularity of updates, composite updates and semantic constraint simplification. In many cases, meta-level constraints and deductive rules can be handled efficiently by a stepwise compilation approach. An extended integrity subsystem with these features has been implemented in the deductive object base ConceptBase. 1 This work was supported in part by the Commission of the European Community under ESPRIT Basic Research Action 3012 (CompuLog). A version of this paper will also appear in the Proc. Second Int. Conf. on Deductive and Object-Oriented Databases, Munich, Dec. 1991. 1. Introduction Comprehensive and efficient integrity maintenance has been quoted as one of the major problems in next-generation databases. Systems l...
Integrity verification in knowledge bases
 Logic Programming. Proceedings of the First and Second Russian Conference on Logic Programming, LNCS 592
Cited by 28 (3 self)
Abstract:
In order to faithfully describe real-life applications, knowledge bases have to manage general integrity constraints. In this article, we analyse methods for an efficient verification of integrity constraints in updated knowledge bases. These methods rely on the satisfaction of the integrity constraints before the update for simplifying their evaluation in the updated knowledge base. During the last few years, an increasing number of publications has been devoted to various aspects of this problem. Since they use distinct formalisms and different terminologies, they are difficult to compare. Moreover, it is often complex to recognize commonalities and to find out whether techniques described in different articles are in principle different. A first part of this report aims at giving a comprehensive state-of-the-art in integrity verification. It describes integrity constraint verification techniques in a common formalism. A second part of this report is devoted to comparing several proposals. The differences and similarities between various methods are investigated.
Nonrecursive Incremental Evaluation of Datalog Queries
 Annals of Mathematics and Artificial Intelligence
, 1995
Cited by 19 (8 self)
Abstract:
We consider the problem of repeatedly evaluating the same (computationally expensive) query against a database that is being updated between successive query requests. In this situation, it should be possible to use the difference between successive database states and the answer to the query in one state to reduce the cost of evaluating the query in the next state. We use nonrecursive Datalog programs (which are unions of conjunctive queries) to compute the differences, and call this process "incremental query evaluation using conjunctive queries." After formalizing the notion of incremental query evaluation using conjunctive queries, we give an algorithm that constructs, for each regular chain query (including transitive closure as a special case), a nonrecursive Datalog program to compute the difference between the answer after an update and the answer before the update. We then extend this result to weakly regular queries, which are regular chain programs augmented with conjunctive queries havin...
Simplification of database integrity constraints revisited: A transformational approach
 Fundamenta Informaticae
, 2006
Cited by 17 (10 self)
Abstract:
Complete checks of database integrity constraints may be prohibitively time consuming, and several methods have been suggested for producing simplified checks for each update. The present approach introduces a set of transformation operators that apply to database integrity constraints, with each operator representing a concise, semantics-preserving operation. These operators are applied in a procedure producing simplified constraints for parametric transaction patterns, which can then be instantiated and checked for consistency at runtime but before any transaction is executed. The operators provide flexibility for other database enhancements, and the work may also be seen as more systematic and general when compared with other approaches. The framework is formulated in first-order clause logic but with the perspective of being applied with present-day database technology.
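What a simplified check for a parametric transaction pattern buys can be sketched with an invented constraint (the schema, cap, and function names are assumptions for illustration only): if every salary in `emp` must stay below a cap, the simplified check for the pattern `insert_emp(s)` inspects only the parameter, never the stored relation.

```python
# Sketch: a simplified integrity check derived for a parametric update
# pattern, assuming the full constraint held before the transaction.
# Constraint (invented): every salary in emp is <= CAP.

CAP = 100_000

def full_check(emp):
    """The complete, potentially expensive check: scans the whole relation."""
    return all(s <= CAP for s in emp)

def simplified_check(s):
    """Instantiated per transaction and evaluated before execution:
    only the new parameter needs checking, since emp already satisfied
    the constraint."""
    return s <= CAP

emp = [40_000, 85_000]
assert full_check(emp)
# simplified_check(120_000) is False: the transaction is rejected
# before execution, without scanning emp
```

Checking before execution, as the abstract emphasizes, avoids both the full scan and any rollback of an already-applied violating update.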