## The Complexity of Theory Revision (1998)

Venue: Proceedings of IJCAI-95

Citations: 16 (5 self)

### BibTeX

@INPROCEEDINGS{Greiner98thecomplexity,
  author    = {Russell Greiner},
  title     = {The Complexity of Theory Revision},
  booktitle = {Proceedings of IJCAI-95},
  year      = {1998},
  pages     = {175--217},
  publisher = {Morgan Kaufmann}
}

### Abstract

A knowledge-based system uses its database (a.k.a. its "theory") to produce answers to the queries it receives. Unfortunately, these answers may be incorrect if the underlying theory is faulty. Standard "theory revision" systems use a given set of "labeled queries" (each a query paired with its correct answer) to transform the given theory, by adding and/or deleting rules and/or antecedents, into a related theory that is as accurate as possible. After formally defining the theory revision task, this paper provides both sample and computational complexity bounds for this process. It first specifies the number of labeled queries necessary to identify, with high probability, a revised theory whose error is close to minimal. It then considers the computational complexity of finding this best theory, and proves that, unless P = NP, no polynomial-time algorithm can identify this near-optimal revision, even given the exact distribution of queries, except in certain simple situations. It ...
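The revision operators the abstract names (adding/deleting rules and antecedents, scored against labeled queries) can be sketched for the propositional Horn case. Everything below — the tuple encoding of rules, the greedy hill-climbing loop, all identifiers — is an illustrative toy, not the paper's algorithm; indeed the paper's point is that finding the *globally* optimal revision is intractable in general. Rule *addition* would need a pool of candidate rules and is omitted for brevity.

```python
def entails(theory, atom):
    """Forward-chain over propositional Horn rules (head, frozenset_of_antecedents)."""
    derived, changed = set(), True
    while changed:
        changed = False
        for head, body in theory:
            if head not in derived and body <= derived:
                derived.add(head)
                changed = True
    return atom in derived

def error(theory, labeled_queries):
    """Fraction of labeled queries (atom, expected_bool) the theory answers incorrectly."""
    wrong = sum(entails(theory, q) != label for q, label in labeled_queries)
    return wrong / len(labeled_queries)

def neighbors(theory, vocabulary):
    """One-step revisions: delete a rule, delete an antecedent, add an antecedent."""
    for i, (head, body) in enumerate(theory):
        yield theory[:i] + theory[i + 1:]                          # delete rule
        for a in body:                                             # delete antecedent
            yield theory[:i] + [(head, body - {a})] + theory[i + 1:]
        for a in vocabulary - body - {head}:                       # add antecedent
            yield theory[:i] + [(head, body | {a})] + theory[i + 1:]

def greedy_revise(theory, labeled_queries, vocabulary):
    """Hill-climb to a *locally* minimal empirical error (no optimality guarantee)."""
    best, best_err = theory, error(theory, labeled_queries)
    improved = True
    while improved:
        improved = False
        for t in neighbors(best, vocabulary):
            e = error(t, labeled_queries)
            if e < best_err:
                best, best_err, improved = t, e, True
    return best, best_err
```

For example, revising `[("bird", ∅), ("flies", {"bird"})]` against the labeled queries `[("flies", False), ("bird", True)]` deletes the faulty "flies" rule and reaches zero empirical error.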

### Citations

10903 | Computers and Intractability: A Guide to the Theory of NP-Completeness - Garey, Johnson - 1979

Citation Context: ..., attempting to approximate a more general target function, which here need not even correspond to a logical theory. (See also the comparison in Section 4.4.) Throughout, we will assume that P ≠ NP [GJ79], which implies that any NP-hard problem is intractable. This also implies certain approximation claims, presented below. There are several implemented theory revis...

4931 | C4.5: Programs for Machine Learning - Quinlan - 1993

Citation Context: ...ed Results: Our underlying task, of producing a theory that is as correct as possible, is the main objective of most research in inductive learning, including as notable instances cart [BFOS84], c4.5 [Qui92] and connectionist learning algorithms [Hin89]. While many of these systems learn descriptions based on bit vectors or simple hierarchies, our work deals with logical descriptions. Here too there is a...

3904 | Classification and Regression Trees - Breiman, Friedman, et al. - 1984

Citation Context: ...research. Related Results: Our underlying task, of producing a theory that is as correct as possible, is the main objective of most research in inductive learning, including as notable instances cart [BFOS84], c4.5 [Qui92] and connectionist learning algorithms [Hin89]. While many of these systems learn descriptions based on bit vectors or simple hierarchies, our work deals with logical descriptions. Here ...

1792 | Random graphs - Bollobás - 2001

1055 | Inductive logic programming - Muggleton - 1991

Citation Context: ... Here too there is a history, dating back (at least) to Plotkin [Plo71] and Shapiro [Sha83], and including the more contemporary foil [Qui90] and the body of work on inductive logic programming (ILP) [Mug92]. However, while most of these projects begin with an "empty theory" and attempt to learn a target logic program by adding new clauses, theory revision processes work by modifying a given initial theo...

960 | Negation as failure - Clark - 1978

Citation Context: ..., including the above special cases in the context where our underlying theories can use the not(·) operator to return Yes if the specified goal cannot be proven; i.e., using Negation-as-Failure [Cla78]. It also considers the effect of re-ordering the rules and the antecedents, in the context where such shufflings can affect the answers returned. In most of these cases, we show that the correspondin...

854 | A tutorial on learning with Bayesian networks - Heckerman - 1995

Citation Context: ...or example, many Bayesian systems use such observations to update their representations, often by adjusting the (continuous) parameters in a Dirichlet distribution within a given belief net structure [Hec95]. We, however, are making discrete changes to the structure of the Horn theory. Similarly, belief revision systems [AGM85, Dal88, Gar88, KM91] take as input an ... (1) However, we make no claims concern...

853 | Learning Logical Definitions from Relations - Quinlan - 1990

Citation Context: ...s or simple hierarchies, our work deals with logical descriptions. Here too there is a history, dating back (at least) to Plotkin [Plo71] and Shapiro [Sha83], and including the more contemporary foil [Qui90] and the body of work on inductive logic programming (ILP) [Mug92]. However, while most of these projects begin with an "empty theory" and attempt to learn a target logic program by adding new clauses...

802 | Estimation of Dependencies Based on Empirical Data - Vapnik - 1982

714 | On the Logic of Theory Change: Partial Meet Contraction and Revision Functions - Alchourrón, Gärdenfors, et al. - 1985

714 | A Measure of Asymptotic Efficiency of Tests for a Hypothesis Based on a Sum of Observations - Chernoff - 1952

624 | Learnability and the Vapnik-Chervonenkis dimension - Blumer, Ehrenfeucht, et al. - 1989

612 | Knowledge in Flux: Modeling the Dynamics of Epistemic States - Gärdenfors - 1988

470 | Algorithmic Program Debugging - Shapiro - 1983

Citation Context: ...ese systems learn descriptions based on bit vectors or simple hierarchies, our work deals with logical descriptions. Here too there is a history, dating back (at least) to Plotkin [Plo71] and Shapiro [Sha83], and including the more contemporary foil [Qui90] and the body of work on inductive logic programming (ILP) [Mug92]. However, while most of these projects begin with an "empty theory" and attempt to ...

402 | On the difference between updating a knowledge base and revising it - Katsuno, Mendelzon - 1991

381 | On the hardness of approximating minimization problems - Lund, Yannakakis - 1994

Citation Context: ...mated. Following [CP91, Kan92], we define Definition 2: A minimization problem MinP is PolyApprox if, for every γ > 0, there is a polynomial-time algorithm B_γ such that, for every instance x of MinP, MinPerf[MinP](B_γ, x) ≤ |x|^γ. Lund and Yannakakis [LY93] prove that (unless P = NP) the "MinGraphColor minimization problem" is not PolyApprox --- i.e., there is some γ > 0 such that no polynomial-time algorithm can always find a solution within |x|^γ...

339 | Connectionist Learning Procedures - Hinton - 1989

Citation Context: ...a theory that is as correct as possible, is the main objective of most research in inductive learning, including as notable instances cart [BFOS84], c4.5 [Qui92] and connectionist learning algorithms [Hin89]. While many of these systems learn descriptions based on bit vectors or simple hierarchies, our work deals with logical descriptions. Here too there is a history, dating back (at least) to Plotkin [P...

240 | Investigations into a Theory of Knowledge Base Revision: Preliminary Report - Dalal - 1988

Citation Context: ...initial theory T0 and a new assertion ⟨q, +⟩ (resp., a new retraction ⟨r, −⟩) and return a new consistent theory T′ that entails q (resp., does not entail r) but otherwise is "close" to T0 [Dal88]. In general, the resulting revised theory will not depend on the syntactic structure of the initial theory --- i.e., if T1 ≡ T2, then the theory obtained by revising T1 with the assertion ⟨q, +⟩ ...

223 | Quantifying inductive bias: AI learning algorithms and Valiant's learning framework - Haussler - 1988

Citation Context: ... the VC-dimension of the theory set Υ_+A[T_n], formed by applying add-antecedent transformations, is exponential in n; i.e., where VCdim_Q(Υ_+A[T_n]) ≥ 2^n. (Readers wishing to learn yet more about the "Vapnik-Chervonenkis Dimension" are referred to [Hau88].) This holds even if all of the queries are atomic, they all correspond to simple instantiations ...
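The context above invokes the VC dimension; the standard PAC bounds (Blumer et al. and Ehrenfeucht et al., both listed on this page) make the connection to sample complexity explicit. A rough sketch with constants suppressed (exact constants vary by source):

```latex
% Labeled queries sufficient to find, with probability at least 1 - \delta,
% a hypothesis whose error is within \epsilon of optimal, from a class of
% VC dimension d:
m(\epsilon, \delta) \;=\;
  O\!\left( \frac{1}{\epsilon^{2}}
     \left( d \,\log\frac{1}{\epsilon} \;+\; \log\frac{1}{\delta} \right) \right),
% together with the matching lower bound of \Omega(d/\epsilon) examples
% (Ehrenfeucht et al.).
```

Since both bounds grow with d, a theory class whose VC dimension is ≥ 2^n (as in the add-antecedent class above) forces a number of labeled queries exponential in n.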

197 | Efficient Distribution-free Learning of Probabilistic Concepts - Kearns, Schapire - 1994

Citation Context: ... set of observations, different repairs are appropriate at different times; this could happen, for example, if the correct repair depends on some unobserved variables as well as the observations; see [KS90]. Notice here that err(T, q) = 1 − O′(q, T(q)); and that our deterministic oracle is a special case of this, where O′(q, a_q) = 1 for a single a_q ∈ A and O′(q, a) = 0 for all a ≠ a_q. T...

191 | A general lower bound on the number of examples needed for learning - Ehrenfeucht, Haussler, et al. - 1989

186 | On the complexity of propositional knowledge base revision, updates and counterfactuals - Eiter, Gottlob - 1992

Citation Context: ...sion frameworks deal with arbitrary CNF formulae. (Of course, the standard belief revision tasks --- e.g., the "counterfactual problem" --- are complete for higher levels in the polynomial-time hierarchy [EG92].) Notice theory revision seeks a theory, from within the syntactically defined class of "all theories produced by applying certain syntactical modifications to an initial theory", whose performance i...

167 | On the logic of iterated belief revision - Darwiche, Pearl - 1997

151 | Foundations of a functional approach to knowledge representation - Levesque - 1984

122 | Two theses of knowledge representation: Language restrictions, taxonomic classification, and the utility of representation services - Doyle, Patil - 1991

113 | Theory refinement: Combining analytical and empirical methods - Ourston, Mooney - 1994

Citation Context: ... There are several implemented theory revision systems. Most use essentially the same set of transformations described here --- e.g., Audrey [WP93], Fonte [MB88], Either [OM94] and Delta [LDRG94] each consider adding or deleting antecedents or rules. Our analysis, and results, can easily be applied to many other types of modifications --- e.g., specializing or generalizing ...

106 | Knowledge compilation using Horn approximations - Selman, Kautz - 1991

79 | Automatic Methods of Inductive Inference - Plotkin - 1971

Citation Context: ...9]. While many of these systems learn descriptions based on bit vectors or simple hierarchies, our work deals with logical descriptions. Here too there is a history, dating back (at least) to Plotkin [Plo71] and Shapiro [Sha83], and including the more contemporary foil [Qui90] and the body of work on inductive logic programming (ILP) [Mug92]. However, while most of these projects begin with an "empty the...

79 | Symbolic knowledge and neural networks: insertion, refinement and extraction - Towell - 1991

Citation Context: ... the Horn theory. Similarly, belief revision systems [AGM85, Dal88, Gar88, KM91] take as input an ... (1) However, we make no claims concerning the applicability of our techniques to systems like KBANN [Tow91], which use a completely different means of modifying a theory. (2) The companion paper [Gre99] considers yet other ways of modifying a theory, viz., by rearranging the order of its component rules or...

78 | Reasoning with models - Khardon, Roth - 1996

75 | Structure identification in relational data - Dechter, Pearl - 1992

Citation Context: ...resent situations where the computational task is not just intractable, but is not even approximatable. Second, many works on "approximations" [BE89, SK91, DE92, GS92] and "structural identification" [DP92] seek a theory, of a specified syntactic form, that is semantically close to an explicitly given theory T_target (i.e., which entails essentially the same set of propositions that T_target entails). A...

74 | Completeness in approximation classes - Crescenzi, Panconesi - 1991

73 | Revision sequences and nested conditionals - Boutilier - 1993

73 | PAC-learnability of determinate logic programs - Dzeroski, Muggleton, et al. - 1992

68 | On the Approximability of NP-complete Optimization Problems - Kann - 1992

62 | Belief revision: A critique - Friedman, Halpern - 1996

Citation Context: ... deals with a sequence of assertions, where each new assertion must be incorporated, as it arrives. Afterwards, it is no longer distinguished from any other information in the current theory (but see [FH96]). We, however, consider the assertions as a set, which is seen at once, and whose elements need not all be incorporated. (i) A k-Horn theory is a Horn theory, defined below, whose clauses each con...

56 | Learning to reason - Khardon, Roth - 1997

40 | Machine invention of first order predicates by inverting resolution - Muggleton, Buntine - 1988

Citation Context: ...ed below. There are several implemented theory revision systems. Most use essentially the same set of transformations described here --- e.g., Audrey [WP93], Fonte [MB88], Either [OM94] and Delta [LDRG94] each consider adding or deleting antecedents or rules. Our analysis, and results, can easily be applied to many other types of modifications --- e.g., specializing o...

40 | On the hardness of approximating minimization problems - Lund, Yannakakis - 1994

39 | Learning from entailment: An application to propositional Horn sentences - Frazier, Pitt - 1993

Citation Context: ...results deal with a situation that differs from the standard ILP task. In fact, many of these tasks become easy if we consider only target functions that correspond to Horn theories. Frazier and Pitt [FP93], however, prove that learning a perfect Horn theory from Horn queries (which corresponds to ThRev_{Prop,Horn,Perf}[Υ_1] when the target oracle is in O_Horn) is as hard as learning arbitrary ...

38 | Symbolic revision of theories with M-of-N rules - Baffes, Mooney - 1993

Citation Context: ...or deleting antecedents or rules. Our analysis, and results, can easily be applied to many other types of modifications --- e.g., specializing or generalizing antecedents [OM94], using "n-of-m rules" [BM93], or merging rules and removing chains of rules that produced incorrect results [Coh90, Coh92]. While these projects provide empirical evidence for the effectiveness of their specific algorithms, an...

36 | Reasoning With Characteristic Models - Kautz, Kearns - 1993

35 | PAC-learning recursive logic programs: Negative results - Cohen - 1995

34 | PAC-learning recursive logic programs: Efficient algorithms - Cohen - 1995

32 | Belief revision and rational inference - Freund, Lehmann - 1994

29 | Polynomial-time inference of all valid implications for Horn and related formulae - Boros, Crama, et al. - 1990

Citation Context: ...lynomial-time oracle that performs these arbitrary derivations. Of course, as we are considering only Horn theories, these computations are guaranteed to be polynomial-time in the propositional case [BCH90]. T = a theory, i.e., a set of Horn clauses; L = the language used; Υ = a set of transformations that map a theory T to a set of new theories Υ(T); ...
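The context above states that derivations are "guaranteed to be polynomial-time in the propositional case". A minimal sketch of why: the classic counter-based forward-chaining scheme (Dowling–Gallier style) decides propositional Horn entailment in time linear in the total size of the rules. The encoding and identifiers here are my own, not the paper's.

```python
from collections import defaultdict, deque

def horn_entails(rules, goal):
    """Decide whether a propositional atom follows from Horn rules, in linear time.

    rules: list of (head, body) pairs, body a list of antecedent atoms
    (assumed free of duplicates within a body); a fact has an empty body.
    """
    remaining = [len(body) for _, body in rules]   # unsatisfied antecedents per rule
    watch = defaultdict(list)                      # atom -> indices of rules waiting on it
    for i, (_, body) in enumerate(rules):
        for atom in body:
            watch[atom].append(i)

    derived = set()
    queue = deque(head for head, body in rules if not body)   # start from the facts
    while queue:
        atom = queue.popleft()
        if atom in derived:
            continue
        derived.add(atom)
        for i in watch[atom]:                      # each antecedent is retired once,
            remaining[i] -= 1                      # hence the linear total work
            if remaining[i] == 0:
                queue.append(rules[i][0])
    return goal in derived
```

For example, from `[("p", []), ("q", ["p"]), ("r", ["q", "s"])]` the procedure derives `p` and `q` but not `r`, since `s` is never established.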

29 | Hierarchical knowledge bases and efficient disjunctive reasoning - Borgida, Etherington - 1989

29 | Optimizing existential Datalog queries - Ramakrishnan, Beeri, et al. - 1988

Citation Context: ...1]. We will use Yes[X = ?] to indicate that there is an instantiation that is satisfied, but the particular value of that instantiation is not important. (This corresponds to an "existential question" [RBK88].) All of the results in this paper hold even when considering only nonrecursive theories; and all computational results hold even for Datalog (i.e., "function-free") theories. As a related extension, ...

28 | Results on learnability and the Vapnik-Chervonenkis dimension - Linial, Mansour, et al. - 1988

Citation Context: ...ng sequences of transformations with cost at most two), and return any perfect T_2 ∈ Υ_2[T_0]; and so forth. (Notice this may involve using successively more samples on each iteration, à la [LMR88].) 4.2 Approximatability: Many decision problems correspond immediately to optimization problems; for example, the MinGraphColor decision problem: Given a graph G = ⟨N, E⟩ and a positive integer K, can ...

25 | Learning from textbook knowledge: A case study - Cohen - 1990

Citation Context: ...r than general derivation, our work formally addresses the complexities inherent in finding the best theory, for handling arbitrary queries. There are several related complexity results: First, Cohen [Coh90] observed that the challenge of computing the smallest modification was intractable in a particular context; this relates to our Corollary 4.1. Second, Wilkins and Ma [WM94] show the intractability of...