Results 1 – 10 of 13
On the Implication Problem for Probabilistic Conditional Independency
, 2000
Cited by 35 (30 self)
The implication problem is to test whether a given set of independencies logically implies another independency. This problem is crucial in the design of a probabilistic reasoning system. We advocate that Bayesian networks are a generalization of standard relational databases. On the contrary, it has been suggested that Bayesian networks are different from the relational databases because the implication problem of these two systems does not coincide for some classes of probabilistic independencies. This remark, however, does not take into consideration one important issue, namely, the solvability of the implication problem.
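One standard, sound-but-incomplete approach to the implication problem this abstract describes is to close the given set of conditional-independence statements under the semi-graphoid axioms (symmetry, decomposition, weak union, contraction) and check membership. The sketch below is illustrative only — the triple encoding and the `semigraphoid_closure` helper are not from the paper, and the semi-graphoid axioms are known to be incomplete for probabilistic conditional independence:

```python
from itertools import combinations

def nonempty_proper_subsets(s):
    """All nonempty proper subsets of a frozenset."""
    items = list(s)
    for r in range(1, len(items)):
        for combo in combinations(items, r):
            yield frozenset(combo)

def semigraphoid_closure(premises, max_rounds=20):
    """Close CI statements I(X;Y|Z), encoded as (frozenset, frozenset,
    frozenset) triples, under the semi-graphoid axioms. Sound but NOT
    complete for probabilistic conditional independence."""
    stmts = set(premises)
    for _ in range(max_rounds):
        new = set()
        for (x, y, z) in stmts:
            new.add((y, x, z))                       # symmetry
            for w in nonempty_proper_subsets(y):
                new.add((x, w, z))                   # decomposition
                new.add((x, w, z | (y - w)))         # weak union
        for (x1, y1, z1) in stmts:                   # contraction:
            for (x2, y2, z2) in stmts:               # I(X;Y|Z) & I(X;W|ZY)
                if (x1 == x2 and z2 == z1 | y1       #   => I(X;YW|Z)
                        and y2.isdisjoint(x1 | y1 | z1)):
                    new.add((x1, y1 | y2, z1))
        if new <= stmts:                             # fixed point reached
            break
        stmts |= new
    return stmts

# Example: from I(A;B|{}) and I(A;C|B), contraction derives I(A;{B,C}|{}).
A, B, C = frozenset("A"), frozenset("B"), frozenset("C")
closed = semigraphoid_closure({(A, B, frozenset()), (A, C, B)})
assert (A, B | C, frozenset()) in closed
```

The brute-force subset enumeration limits this to very small variable sets, but it makes the "given independencies logically imply another" question concrete.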
Achievements of relational database schema design theory revisited
 In Semantics in Databases, volume LNCS 1358
, 1998
Cited by 26 (2 self)
Database schema design is seen as deciding on formats for time-varying instances, on rules for supporting inferences and on semantic constraints. Schema design aims at both faithful formalization of the application and optimization at design time. It is guided by four heuristics: Separation of Aspects, Separation of Specializations, Inferential Completeness and Unique Flavor. A theory of schema design is to investigate these heuristics and to provide insight into how syntactic properties of schemas are related to worthwhile semantic properties, how desirable syntactic properties can be decided or achieved algorithmically, and how the syntactic properties determine costs of storage, queries and updates. Some well-known achievements of design theory for relational databases are reviewed: normal forms, view support, deciding implications of semantic constraints, acyclicity, design algorithms removing forbidden substructures.
Quantifier elimination for statistical problems
 In Proceedings of the Fifteenth Conference on Uncertainty in Artificial Intelligence (UAI-99)
, 1999
Cited by 8 (0 self)
Recent improvements on Tarski's procedure for quantifier elimination in the first-order theory of real numbers make it feasible to solve small instances of the following problems completely automatically:
1. Listing all equality and inequality constraints implied by a graphical model with hidden variables.
2. Comparing graphical models with hidden variables (i.e., model equivalence, inclusion, and overlap).
3. Answering questions about the identification of a model or portion of a model, and about bounds on quantities derived from a model.
4. Determining whether an independence assertion is implied by a given set of independence assertions.
We discuss the foundations of quantifier elimination and demonstrate its application to these problems.
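A toy instance of what quantifier elimination over the reals delivers (not from the paper): the quantified formula "there exists a real x with a·x² + b·x + c = 0" (a ≠ 0) is equivalent to the quantifier-free discriminant condition b² − 4ac ≥ 0. The helper names below are hypothetical; the randomized check simply confirms the equivalence on integer coefficients:

```python
import cmath
import random

def exists_real_root(a, b, c):
    """Check 'exists x in R: a*x**2 + b*x + c == 0' directly,
    by computing both (possibly complex) roots."""
    d = cmath.sqrt(b * b - 4 * a * c)
    r1 = (-b + d) / (2 * a)
    r2 = (-b - d) / (2 * a)
    return abs(r1.imag) < 1e-9 or abs(r2.imag) < 1e-9

def quantifier_free(a, b, c):
    """The Tarski-style eliminated form: no quantifier, no x."""
    return b * b - 4 * a * c >= 0

# The two formulations agree on every instance.
for _ in range(1000):
    a = random.choice([-3, -2, -1, 1, 2, 3])   # keep a nonzero
    b = random.randint(-5, 5)
    c = random.randint(-5, 5)
    assert exists_real_root(a, b, c) == quantifier_free(a, b, c)
```

General-purpose elimination (as in Tarski's procedure and its successors) produces such quantifier-free equivalents mechanically for arbitrary polynomial formulas, which is what makes the four problems above automatable on small instances.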
Databases and Finite-Model Theory
 In Descriptive Complexity and Finite Models
, 1997
Cited by 6 (0 self)
Databases provide one of the main concrete scenarios for finite-model theory within computer science. This paper presents an informal overview of database theory aimed at finite-model theorists, emphasizing the specificity of the database area. It is argued that the area of databases is a rich source of questions and vitality for finite-model theory.
The Relational Database Theory of Bayesian Networks
, 2000
Cited by 2 (1 self)
Based on the elegant theory of relational databases, the present investigation establishes a unified model for both relational databases and Bayesian networks. This is in contradiction to the argument that relational databases and Bayesian networks are different, where it was shown that the implication problem does not coincide for embedded multivalued dependency (EMVD) and probabilistic conditional independence (CI). The main result of this thesis, however, is that the implication problem coincides on the solvable subclasses of EMVD and CI, but differs on the unsolvable general classes of EMVD and CI. This means that there is no practical difference between relational databases and Bayesian networks, since only the solvable subclasses are useful in the design of both of these knowledge systems.
Translation Schemes and the Fundamental Problem of Database Design
, 1996
Cited by 2 (2 self)
We introduce a new point of view into database schemes by applying systematically an old logical technique: translation schemes, and their induced formula and structure transformations. This allows us to reexamine the notion of dependency preserving decomposition and its generalization, refinement. The most important aspect of this approach lies in laying the groundwork for a formulation of the Fundamental Problem of Database Design, namely to exhibit desirable differences between translation-equivalent presentations of data and to examine refinements of data presentations in a systematic way. The emphasis in this paper is not on results. The main line of thought is an exploration of the use of an old logical tool in addressing the Fundamental Problem. Translation schemes allow us to have a new look at normal forms of database schemes and to suggest a new line of search for other normal forms. Furthermore we give a characterization of the embedded implicational dependencies (EID'...
A Top-Down Proof Procedure for Generalized Data Dependencies
Cited by 1 (1 self)
Data dependencies are well known in the context of relational databases. They aim to specify constraints that the data must satisfy to model correctly the part of the world under consideration. The implication problem for dependencies is to decide whether a given dependency is logically implied by a given set of dependencies. A proof procedure for the implication problem, called the "chase", has already been studied in the generalized case of tuple-generating and equality-generating dependencies. The chase is a bottom-up procedure: from hypotheses to conclusion, and thus is not goal-directed. It also requires the dynamic creation of new symbols, which can turn out to be a costly operation. This paper introduces a new proof procedure which is top-down: from conclusion to hypothesis, that is, goal-directed. The originality of this procedure is that it does not act as classical theorem proving procedures, which require a special form of expressions, such as clausal form, obtained after skolemi...
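In the simplest special case — functional dependencies, a kind of equality-generating dependency — the bottom-up chase the abstract describes degenerates into the textbook attribute-closure algorithm. The sketch below shows that standard construction (not this paper's top-down procedure) deciding the implication problem for FDs:

```python
def attribute_closure(attrs, fds):
    """Close a set of attributes under functional dependencies,
    given as (lhs_set, rhs_set) pairs. This is the bottom-up chase
    specialized to FDs: keep firing dependencies whose left-hand
    side is already derived."""
    closure = set(attrs)
    changed = True
    while changed:
        changed = False
        for lhs, rhs in fds:
            if lhs <= closure and not rhs <= closure:
                closure |= rhs
                changed = True
    return closure

def fd_implies(fds, lhs, rhs):
    """Decide whether fds logically imply the dependency lhs -> rhs."""
    return rhs <= attribute_closure(lhs, fds)

# A -> B and B -> C together imply A -> C, but not C -> A.
fds = [({"A"}, {"B"}), ({"B"}, {"C"})]
assert fd_implies(fds, {"A"}, {"C"})
assert not fd_implies(fds, {"C"}, {"A"})
```

For general tuple-generating dependencies the chase instead manipulates tableaux and must invent fresh labeled nulls — the "dynamic creation of new symbols" the abstract calls costly, and what the paper's goal-directed procedure aims to avoid.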
Dependency Preserving Refinements and the Fundamental Problem of Database Design
, 1998
Cited by 1 (0 self)
We introduce a new point of view into database schemes by applying systematically an old logical technique: translation schemes, and their induced formula and structure transformations. This allows us to reexamine the notion of dependency preserving decomposition and its generalization, refinement. We demonstrate the usefulness of this approach by recasting the theory of vertical and horizontal decompositions in our terminology. The most important aspect of this approach, however, lies in laying the groundwork for a formulation of the Fundamental Problem of Database Design, namely to exhibit desirable differences between translation-equivalent presentations of data and to examine refinements of data presentations in a systematic way. The emphasis in this paper is not on results. The main line of thought is an exploration of the use of an old logical tool in addressing the Fundamental Problem. Translation schemes allow us also to have a new look at normal forms of database schemes an...