Results 1–10 of 58
Embedding Defaults into Terminological Knowledge Representation Formalisms
Journal of Automated Reasoning, 1995
Abstract
Cited by 125 (6 self)
We consider the problem of integrating Reiter's default logic into terminological representation systems. It turns out that such an integration is less straightforward than we expected, considering the fact that the terminological language is a decidable sublanguage of first-order logic. Semantically, one has the unpleasant effect that the consequences of a terminological default theory may be rather unintuitive, and may even vary with the syntactic structure of equivalent concept expressions. This is due to the unsatisfactory treatment of open defaults via Skolemization in Reiter's semantics. On the algorithmic side, we show that this treatment may lead to an undecidable default consequence relation, even though our base language is decidable, and we have only finitely many (open) defaults. Because of these problems, we then consider a restricted semantics for open defaults in our terminological default theories: default rules are only applied to individuals that are explicitly present ...
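The restricted semantics described above can be made concrete with a small sketch. Assuming normal defaults of the form "prereq(x) : cons(x) / cons(x)" and an ABox of named membership assertions, default rules fire only for individuals explicitly present in the ABox; all names below (`apply_defaults`, the `-C` complement encoding) are invented for illustration, not taken from the paper.

```python
# Minimal sketch of the restricted semantics: normal defaults are applied
# only to explicitly named individuals, never to implied anonymous ones.
# Concepts are plain strings; "-C" denotes the complement of concept C.

def complement(concept):
    return concept[1:] if concept.startswith("-") else "-" + concept

def apply_defaults(abox, defaults):
    """abox: set of (individual, concept); defaults: list of (prereq, cons)."""
    facts = set(abox)
    individuals = {ind for ind, _ in abox}   # only explicitly present ones
    changed = True
    while changed:
        changed = False
        for prereq, cons in defaults:
            for ind in individuals:
                if ((ind, prereq) in facts
                        and (ind, complement(cons)) not in facts  # justification consistent
                        and (ind, cons) not in facts):
                    facts.add((ind, cons))
                    changed = True
    return facts

abox = {("tweety", "Bird"), ("rocky", "Bird"), ("rocky", "-Flies")}
defaults = [("Bird", "Flies")]   # birds normally fly
ext = apply_defaults(abox, defaults)
```

Here the default adds `Flies` for `tweety` but is blocked for `rocky`, whose explicit `-Flies` assertion contradicts the justification.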
Equations and rewrite rules: a survey
In Formal Language Theory: Perspectives and Open Problems, 1980
Decision Problems for Propositional Linear Logic
1990
Abstract
Cited by 90 (17 self)
Linear logic, introduced by Girard, is a refinement of classical logic with a natural, intrinsic accounting of resources. We show that unlike most other propositional (quantifier-free) logics, full propositional linear logic is undecidable. Further, we prove that without the modal storage operator, which indicates unboundedness of resources, the decision problem becomes PSPACE-complete. We also establish membership in NP for the multiplicative fragment, NP-completeness for the multiplicative fragment extended with unrestricted weakening, and undecidability for certain fragments of noncommutative propositional linear logic. 1 Introduction Linear logic, introduced by Girard [14, 18, 17], is a refinement of classical logic which may be derived from a Gentzen-style sequent calculus axiomatization of classical logic in three steps. The resulting sequent system ...
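The resource accounting can be illustrated with a sketch that is not from the paper: in the multiplicative fragment (MLL), every axiom pairs an atom p with its dual p⊥, so any provable one-sided sequent must contain each atom positively exactly as often as negatively. This balance check is only a necessary condition, but it conveys why counting resources matters and why a polynomial-size proof certificate places MLL provability in NP. The formula encoding below is invented for illustration.

```python
from collections import Counter

def counts(formula, sign, acc):
    """Count signed atom occurrences. Formulas are: 'p' (atom),
    ('neg', 'p') (dual atom), ('tensor', A, B), ('par', A, B)."""
    if isinstance(formula, str):
        acc[(formula, sign)] += 1
    elif formula[0] == "neg":
        acc[(formula[1], -sign)] += 1
    else:
        counts(formula[1], sign, acc)
        counts(formula[2], sign, acc)

def balanced(sequent):
    """Necessary condition for MLL provability: atoms occur in dual pairs."""
    acc = Counter()
    for f in sequent:
        counts(f, +1, acc)
    atoms = {a for a, _ in acc}
    return all(acc[(a, +1)] == acc[(a, -1)] for a in atoms)

# p ⅋ p⊥ is provable and passes; p ⊗ p is unbalanced, hence unprovable.
```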
System F with type equality coercions
2007
Abstract
Cited by 75 (25 self)
We introduce System FC, which extends System F with support for non-syntactic type equality. There are two main extensions: (i) explicit witnesses for type equalities, and (ii) open, non-parametric type functions, given meaning by top-level equality axioms. Unlike System F, FC is expressive enough to serve as a target for several different source-language features, including Haskell’s newtype, generalised algebraic data types, associated types, functional dependencies, and perhaps more besides.
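Extension (i) can be sketched in untyped Python standing in for FC's typed setting: equalities are explicit values built from reflexivity, symmetry, transitivity, and top-level axioms, and a cast is legal only when a witness connects the two types. Every function name here is invented for illustration; FC's real coercion language is richer.

```python
# Type-equality witnesses as explicit, checkable values. A coercion is a
# pair (lhs, rhs) of type names; combinators mirror FC's coercion forms.

def refl(t):
    return (t, t)                     # t ~ t

def sym(c):
    lhs, rhs = c
    return (rhs, lhs)                 # from s ~ t derive t ~ s

def trans(c1, c2):
    l1, r1 = c1
    l2, r2 = c2
    assert r1 == l2, "coercions do not compose"
    return (l1, r2)                   # from s ~ t and t ~ u derive s ~ u

def axiom(lhs, rhs):
    # a top-level equality axiom, e.g. introduced by a newtype: Age ~ Int
    return (lhs, rhs)

def cast(value_type, coercion):
    lhs, rhs = coercion
    assert value_type == lhs, "coercion does not apply"
    return rhs   # the cast is free at runtime; only the type changes

age_is_int = axiom("Age", "Int")
```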
Computability and recursion
BULL. SYMBOLIC LOGIC, 1996
Abstract
Cited by 32 (0 self)
We consider the informal concept of “computability” or “effective calculability” and two of the formalisms commonly used to define it, “(Turing) computability” and “(general) recursiveness.” We consider their origin, exact technical definition, concepts, history, general English meanings, how they became fixed in their present roles, how they were first and are now used, their impact on nonspecialists, how their use will affect the future content of the subject of computability theory, and its connection to other related areas. After a careful historical and conceptual analysis of computability and recursion we make several recommendations in §7 about preserving the intensional differences between the concepts of “computability” and “recursion.” Specifically we recommend that: the term “recursive” should no longer carry the additional meaning of “computable” or “decidable;” functions defined using Turing machines, register machines, or their variants should be called “computable” rather than “recursive;” we should distinguish the intensional difference between Church’s Thesis and Turing’s Thesis, and use the latter particularly in dealing with mechanistic questions; the name of the subject should be “Computability Theory” or simply Computability rather than ...
Probabilistic data exchange
In Proc. ICDT, 2010
Abstract
Cited by 28 (5 self)
The work reported here lays the foundations of data exchange in the presence of probabilistic data. This requires rethinking the very basic concepts of traditional data exchange, such as solution, universal solution, and the certain answers of target queries. We develop a framework for data exchange over probabilistic databases, and make a case for its coherence and robustness. This framework applies to arbitrary schema mappings, and finite or countably infinite probability spaces on the source and target instances. After establishing this framework and formulating the key concepts, we study the application of the framework to a concrete and practical setting where probabilistic databases are compactly encoded by means of annotations formulated over random Boolean variables. In this setting, we study the problems of testing for the existence of solutions and universal solutions, materializing such solutions, and evaluating target queries (for unions of conjunctive queries) in both the exact sense and the approximate sense. For each of the problems, we carry out a complexity analysis based on properties of the annotation, in various classes of dependencies. Finally, we show that the framework and results easily and completely generalize to allow not only the data, but also the schema mapping itself to be probabilistic.
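The possible-worlds view underlying this framework can be sketched briefly (names invented here): a probabilistic database is a finite probability space over ordinary instances, and the probability of a Boolean query is the total weight of the worlds satisfying it.

```python
# A probabilistic database as an explicit finite probability space over
# instances; each instance is a set of tuples.

def query_probability(worlds, query):
    """worlds: list of (instance, probability); query: instance -> bool."""
    return sum(p for instance, p in worlds if query(instance))

worlds = [
    (frozenset({("alice", "paris")}), 0.5),
    (frozenset({("alice", "paris"), ("bob", "rome")}), 0.3),
    (frozenset(), 0.2),
]
q = lambda instance: ("alice", "paris") in instance   # "is Alice in Paris?"
```

Compact encodings via Boolean annotations, as studied in the paper, avoid enumerating the worlds, but define exactly this semantics.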
Combination Techniques for Non-Disjoint Equational Theories
Proceedings 12th International Conference on Automated Deduction, 1994
Abstract
Cited by 24 (4 self)
2. Abstraction variables, which are variables coming from an abstraction, either during preprocessing or during the algorithm itself. 3. Introduced variables, which are variables introduced by the unification algorithms for each theory. We make the very natural assumption that the unification algorithm for each theory may recognize initial, abstraction and introduced variables and never assigns an introduced variable to a non-introduced one or an abstraction variable to an initial one. With this assumption, our combination algorithm will always make an introduced variable appear in at most one Γ_i. We may thus also suppose that the domain of each solution does not contain an introduced variable. This does not compromise the soundness of our algorithm. The combination algorithm is described by the two rules given in figure 2. In the rule UnifSolve_i, σ_SF is obtained by abstracting aliens in the range of σ by fresh variables. σ_{F_i} is the substitution such that xσ = xσ_SF σ_{F_i} for all ...
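The abstraction step mentioned above can be sketched as follows, under a simplified term representation (all names invented here): maximal alien subterms, i.e. subterms whose head symbol belongs to the other theory, are replaced by fresh abstraction variables, leaving a term that is pure in one theory.

```python
import itertools

# Terms are ('f', [subterms]) applications or plain-string variables.
fresh = (f"v{i}" for i in itertools.count())

def purify(term, my_symbols, bindings):
    """Replace each maximal alien subterm by a fresh abstraction variable,
    recording the replacement in `bindings`."""
    if isinstance(term, str):                 # a variable: keep it
        return term
    head, args = term
    if head in my_symbols:                    # pure symbol: recurse
        return (head, [purify(a, my_symbols, bindings) for a in args])
    v = next(fresh)                           # alien subterm: abstract it
    bindings[v] = term
    return v

bindings = {}
# Suppose f belongs to theory 1 and g to theory 2:
pure = purify(("f", [("g", ["x"]), "y"]), {"f"}, bindings)
```

The recorded bindings become the pure equations handed to the other theory's unification algorithm.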
Axel Thue's work on repetitions in words
Invited Lecture at the 4th Conference on Formal Power Series and Algebraic Combinatorics, 1992
Abstract
Cited by 22 (3 self)
The purpose of this survey is to present, in contemporary terminology, the fundamental contributions of Axel Thue to the study of combinatorial properties of sequences of symbols, insofar as repetitions are concerned. The present state of the art is also sketched.
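To make Thue's best-known contribution concrete: the Thue–Morse word, generated by iterating the morphism 0 → 01, 1 → 10, is overlap-free, i.e. it contains no factor of the form awawa (equivalently, no factor of length 2p+1 with period p). The following short sketch (function names are ours) generates a prefix and checks that property.

```python
def thue_morse(iterations):
    """Iterate the morphism 0 -> 01, 1 -> 10 starting from '0'."""
    word = "0"
    for _ in range(iterations):
        word = word.translate(str.maketrans({"0": "01", "1": "10"}))
    return word

def has_overlap(word):
    """True iff some factor of length 2p+1 has period p (an overlap)."""
    n = len(word)
    for period in range(1, n // 2 + 1):
        for start in range(n - 2 * period):
            if all(word[start + k] == word[start + period + k]
                   for k in range(period + 1)):
                return True
    return False

prefix = thue_morse(6)   # a 64-symbol prefix: 0110100110010110...
```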
Consistency Checking in Complex Object Database Schemata with Integrity Constraints
1998
Abstract
Cited by 20 (13 self)
Integrity constraints are rules which should guarantee the integrity of a database. Provided that an adequate mechanism to express them is available, the following question arises: is there any way to populate a database so that it satisfies the constraints supplied by a database designer? That is, does the database schema, including constraints, admit at least a nonempty model? This work gives an answer to the above question in a complex object database environment, providing a theoretical framework including the following ingredients: two alternative formalisms, able to express a relevant set of state integrity constraints in a declarative style; two specialized reasoners, based on the tableaux calculus, able to check the consistency of complex object database schemata expressed in the two formalisms. The proposed formalisms share a common kernel, which supports complex objects and object identifiers, and allow the expression of acyclic descriptions of: classes, nested relations ...
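The tableaux idea behind the paper's reasoners can be illustrated, reduced to propositional logic for brevity (the encoding is ours): expand a set of formulas into branches, and declare it consistent exactly when some branch is free of a complementary pair of literals.

```python
def satisfiable(todo, literals=frozenset()):
    """Tiny tableau for formulas in negation normal form:
    'p' (atom), ('not', 'p'), ('and', A, B), ('or', A, B)."""
    if not todo:
        return True                            # open branch: consistent
    f, rest = todo[0], todo[1:]
    if isinstance(f, str) or f[0] == "not":
        neg = ("not", f) if isinstance(f, str) else f[1]
        if neg in literals:
            return False                       # branch closes: p and not-p
        return satisfiable(rest, literals | {f})
    if f[0] == "and":                          # conjunction: extend branch
        return satisfiable([f[1], f[2]] + rest, literals)
    if f[0] == "or":                           # disjunction: split branches
        return (satisfiable([f[1]] + rest, literals)
                or satisfiable([f[2]] + rest, literals))

# A "schema" whose constraints contradict each other has no model:
#   satisfiable([("and", "p", ("not", "p"))]) is False.
```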
Extensions of Simple Conceptual Graphs: the Complexity of Rules and Constraints
JOUR. OF ARTIF. INTELL. RES, 2002
Abstract
Cited by 19 (1 self)
Simple conceptual graphs are considered as the kernel of most knowledge representation formalisms built upon Sowa's model. Reasoning in this model can be expressed by a graph homomorphism called projection, whose semantics is usually given in terms of positive, conjunctive, existential FOL. We present here a family of extensions of this model, based on rules and constraints, keeping graph homomorphism as the basic operation. We focus on the formal definitions of the different models obtained, including their operational semantics and relationships with FOL, and we analyze the decidability and complexity of the associated problems (consistency and deduction). As soon as rules are involved in reasoning, these problems are not decidable, but we exhibit a condition under which they fall in the polynomial hierarchy. These results extend and complete the ones already published by the authors. Moreover, we systematically study the complexity of some particular cases obtained by restricting the form of constraints and/or rules.
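Since projection is a labelled graph homomorphism, a naive version is easy to sketch (names invented; we use exact label equality, ignoring the concept-type ordering that conceptual graphs allow): map every node of the query graph H into G so that labels agree and every edge of H lands on an edge of G.

```python
from itertools import product

def homomorphisms(h_nodes, h_edges, g_nodes, g_edges):
    """Enumerate label-preserving homomorphisms from H into G.
    Nodes: {name: label}; edges: set of (src, dst) pairs."""
    hs = list(h_nodes)
    for image in product(g_nodes, repeat=len(hs)):
        phi = dict(zip(hs, image))
        if (all(h_nodes[n] == g_nodes[phi[n]] for n in hs)
                and all((phi[a], phi[b]) in g_edges for a, b in h_edges)):
            yield phi

h = ({"x": "Person", "y": "City"}, {("x", "y")})          # query graph
g = ({"alice": "Person", "paris": "City", "bob": "Person"},
     {("alice", "paris")})
matches = list(homomorphisms(h[0], h[1], g[0], g[1]))
```

The brute-force enumeration reflects the problem's NP-hardness in general; the paper's interest lies in how rules and constraints move deduction beyond this base case.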