Results 1–10 of 28
Data Exchange: Semantics and Query Answering
In ICDT, 2003
Abstract

Cited by 328 (34 self)
Data exchange is the problem of taking data structured under a source schema and creating an instance of a target schema that reflects the source data as accurately as possible. In this paper, we address foundational and algorithmic issues related to the semantics of data exchange and to query answering in the context of data exchange. These issues arise because, given a source instance, there may be many target instances that satisfy the constraints of the data exchange problem. We give an algebraic specification that selects, among all solutions to the data exchange problem, a special class of solutions that we call universal. A universal solution has no more and no less data than required for data exchange and it represents the entire space of possible solutions. We then identify fairly general, and practical, conditions that guarantee the existence of a universal solution and yield algorithms to compute a canonical universal solution efficiently. We adopt the notion of "certain answers" in indefinite databases as the semantics for query answering in data exchange. We investigate the computational complexity of computing the certain answers in this context and also study the problem of computing the certain answers of target queries by simply evaluating them on a canonical universal solution.
Algebraic Approaches to Nondeterminism – An Overview
ACM Computing Surveys, 1997
Abstract

Cited by 23 (3 self)
This paper was published as Walicki, M.A. and Meldal, S., 1995, Nondeterministic Operators in Algebraic Frameworks, Technical Report No. CSL-TR-95-664, Stanford University.
Exploiting Data Dependencies in Many-Valued Logics
Journal of Applied Non-Classical Logics, 1996
Abstract

Cited by 21 (7 self)
The purpose of this paper is to make some practically relevant results in automated theorem proving available to many-valued logics with suitable modifications. We are working with a notion of many-valued first-order clauses which any finitely-valued logic formula can be translated into and that has been used several times in the literature, but in an ad hoc way. We give a many-valued version of polarity which in turn leads to natural many-valued counterparts of Horn formulas, hyperresolution, and a Davis-Putnam procedure. We show that the many-valued generalizations share many of the desirable properties of the classical versions. Our results justify and generalize several earlier results on theorem proving in many-valued logics.
KEYWORDS: many-valued logic, polarity, Horn formula, direct products of structures, resolution, Davis-Putnam procedure
Unique complements and decompositions of database schemata
Journal of Computer and System Sciences, 1994
Abstract

Cited by 14 (8 self)
In earlier work, Bancilhon and Spyratos introduced the concept of a complement to a database schema, and showed how this notion could be used in theories of decomposition and update semantics. However, they also showed that, except in trivial cases, even minimal complements are never unique, so that many desirable results, such as canonical decompositions, cannot be realized. Their work dealt with database schemata which are sets and database mappings which are functions, without further structure. In this work, we show that by adding a modest amount of additional structure, many important uniqueness results may be obtained. Specifically, we work with database schemata whose legal states form partially ordered sets (posets) with least elements, and with database mappings which are isotonic and which preserve this least element. This is a natural algebraic structure which is inherent in many important examples, including relational schemata constrained by data dependencies, with views constructed by composition of projection, restriction, and selection. Other examples include deductive database schemata in which views are defined by rules, and general first-order logic databases.
Logic Programming, Functional Programming, and Inductive Definitions
In Extensions of Logic Programming, volume 475 of LNCS, 1991
Abstract

Cited by 10 (0 self)
Machine. It is incomplete due to depth-first search, but presumably there could be a version using iterative deepening. An OR-parallel machine such as DelPhi [12] could support such languages in the future. Functions make explicit the granularity for OR-parallelism: evaluation is deterministic while search is not.
Elimination of Negation in a Logical Framework
2000
Abstract

Cited by 10 (3 self)
Logical frameworks with a logic programming interpretation, such as hereditary Harrop formulae (HHF) [15], cannot directly express negative information, although negation is a useful specification tool. Since negation-as-failure does not fit well in a logical framework, especially one endowed with hypothetical and parametric judgements, we adapt the idea of elimination of negation introduced in [21] for Horn logic to a fragment of higher-order HHF. This entails finding a middle ground between the Closed World Assumption usually associated with negation and the Open World Assumption typical of logical frameworks; the main technical idea is to isolate a set of programs where static and dynamic clauses do not overlap.
The Complexity Of Querying Indefinite Information: Defined Relations, Recursion And Linear Order
1992
Abstract

Cited by 7 (0 self)
This dissertation studies the computational complexity of answering queries in logical databases containing indefinite information arising from two sources: facts stated in terms of defined relations, and incomplete information about linearly ordered domains. First, we consider databases consisting of (1) a DATALOG program and (2) a description of the world in terms of the predicates defined by the program as well as the basic predicates. The query processing problem in such databases is related to issues in database theory, including view updates and DATALOG optimization, and also to the Artificial Intelligence problems of reasoning in circumscribed theories and sceptical abductive reasoning. If the program is nonrecursive, the meaning of the database can be represented by Clark's Predicate Completion,...
The Institution of Multialgebras – A General Framework for Algebraic Software Development
2002
"... this technicality ..."
Horn extended feature structures: Fast unification with negation and limited disjunction
In Fifth Conference of the EACL, 1991
Abstract

Cited by 4 (1 self)
The notion of a Horn extended feature structure (Hoxf) is introduced, which is a feature structure constrained so that its only allowable extensions are those satisfying some set of Horn clauses in feature-term logic. Hoxf’s greatly generalize ordinary feature structures in admitting explicit representation of negative and implicational constraints. In contradistinction to the general case in which arbitrary logical constraints are allowed (for which the best known algorithms are exponential), there is a highly tractable algorithm for the unification of Hoxf’s.
Computational and structural aspects of openly specified type hierarchies
Logical Aspects of Computational Linguistics, Third International Conference, LACL '98, 1998
Abstract

Cited by 4 (3 self)
Abstract. One may identify two main approaches to the description of type hierarchies. In total specification, a unique hierarchy is described. In open specification, a set of constraints identifies properties of the hierarchy without providing a complete description. Open specification provides increased expressive power, but at the expense of considerable computational complexity, with essential tasks being NP-complete or NP-hard. In this work, a formal study of the structural and computational aspects of open specification is conducted, with the aim of developing a better understanding of how techniques may be devised to address these complexities. In addition, a technique is presented, based upon Horn clauses, which allows one to obtain answers to certain types of queries on open specifications very efficiently.