Results 1 - 5 of 5
Theory Refinement Combining Analytical and Empirical Methods
 Artificial Intelligence
, 1994
Abstract

Cited by 113 (7 self)
This article describes a comprehensive approach to automatic theory revision. Given an imperfect theory, the approach combines explanation attempts for incorrectly classified examples in order to identify the failing portions of the theory. For each theory fault, correlated subsets of the examples are used to inductively generate a correction. Because the corrections are focused, they tend to preserve the structure of the original theory. Because the system starts with an approximate domain theory, in general fewer training examples are required to attain a given level of performance (classification accuracy) compared to a purely empirical system. The approach applies to classification systems employing a propositional Horn-clause theory. The system has been tested in a variety of application domains, and results are presented for problems in the domains of molecular biology and plant disease diagnosis. 1 Introduction One of the most difficult problems in the develo...
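The classification setting the abstract describes can be sketched concretely. Below is a minimal, hypothetical illustration of propositional Horn-clause classification via forward chaining; the rule format and the toy diagnosis rules are assumptions for illustration, not the article's actual theory.

```python
# Minimal sketch: propositional Horn-clause classification by forward chaining.
# A theory maps each head proposition to a list of clause bodies (conjunctions).
def forward_chain(rules, facts):
    """Derive every proposition provable from the observed facts."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for head, bodies in rules.items():
            for body in bodies:
                if head not in derived and all(b in derived for b in body):
                    derived.add(head)   # clause body satisfied: assert the head
                    changed = True
    return derived

# Hypothetical toy theory (invented for illustration).
rules = {
    "infected": [["lesions", "wilting"]],
    "diagnosis": [["infected"]],
}
example = {"lesions", "wilting"}
print("diagnosis" in forward_chain(rules, example))  # True
```

A misclassified example in this setting is one where the derived set disagrees with the example's label; the revision approach described above would then trace which clauses fired (or failed to fire) along the way.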
Incremental Non-Backtracking Focusing: A Polynomially Bounded Generalization Algorithm for Version Spaces
 In Proceedings of the Eighth National Conference on Artificial Intelligence (AAAI-90)
, 1990
Abstract

Cited by 20 (1 self)
The candidate elimination algorithm for inductive learning with version spaces can require both exponential time and space. This article describes the Incremental Non-Backtracking Focusing (INBF) algorithm, which learns strictly tree-structured concepts in polynomial space and time. Specifically, it learns in time O(pnk) and space O(nk), where p is the number of positives, n the number of negatives, and k the number of features. INBF is an extension of an existing batch algorithm, Avoidance Focusing (AF). Although AF also learns in polynomial time, it assumes a convergent set of positive examples and handles additional examples inefficiently; INBF has neither of these restrictions. Both the AF and INBF algorithms assume that the positive examples plus the near misses will be sufficient for convergence if the initial set of examples is convergent. This article formally proves that for tree-structured concepts this assumption does in fact hold. Introduction The candidate elimination (CE)...
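The specific-to-general half of this process can be sketched in a few lines. The sketch below assumes flat value hierarchies, where each feature generalizes directly to a wildcard "?"; this is a simplification of the tree-structured feature hierarchies INBF actually handles, and the fruit-like examples are invented for illustration.

```python
# Sketch: incremental specific-to-general learning over k features,
# in the spirit of version-space focusing (flat hierarchies assumed).
def generalize(hypothesis, positive):
    """Minimally generalize the most-specific hypothesis to cover a positive."""
    if hypothesis is None:          # first positive becomes the hypothesis
        return list(positive)
    return [h if h == v else "?" for h, v in zip(hypothesis, positive)]

def covers(hypothesis, example):
    return all(h == "?" or h == v for h, v in zip(hypothesis, example))

h = None
for pos in [("red", "round", "small"), ("red", "square", "small")]:
    h = generalize(h, pos)          # one cheap update per positive example
print(h)                                      # ['red', '?', 'small']
print(covers(h, ("blue", "round", "small")))  # False: a near miss
```

Each positive costs O(k) work, which is how a per-example polynomial bound like the O(pnk) above can arise; the negatives (near misses) constrain the general boundary, which this sketch omits.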
A Preliminary PAC Analysis of Theory Revision
 Computational Learning Theory and Natural Learning Systems
, 1993
Abstract

Cited by 15 (2 self)
This paper presents a preliminary analysis of the sample complexity of theory revision within the framework of PAC (Probably Approximately Correct) learnability theory. By formalizing the notion that the initial theory is "close" to the correct theory, we show that the sample complexity of an optimal propositional Horn-clause theory revision algorithm is O((ln(1/δ) + d ln(s_0 + d + n))/ε), where d is the syntactic distance between the initial and correct theories, s_0 is the size of the initial theory, n is the number of observable features, and ε and δ are the standard PAC error and probability bounds. The paper also discusses the problems raised by the computational complexity of theory revision. This research was supported by the National Science Foundation under grant IRI-9102926, the NASA Ames Research Center under grant NCC 2629, and the Texas Advanced Research Program under grant 003658114. 1 Introduction Although there has recently been a great deal of empirical work on co...
Induction over the unexplained: Using overly-general domain theories to aid concept learning
, 1993
Abstract

Cited by 14 (0 self)
This paper describes and evaluates an approach to combining empirical and explanation-based learning called Induction Over the Unexplained (IOU). IOU is intended for learning concepts that can be partially explained by an overly-general domain theory. An eclectic evaluation of the method is presented which includes results from all three major approaches: empirical, theoretical, and psychological. Empirical results show that IOU is effective at refining overly-general domain theories and that it learns more accurate concepts from fewer examples than a purely empirical approach. The application of theoretical results from PAC learnability theory explains why IOU requires fewer examples. IOU is also shown to be able to model psychological data demonstrating the effect of background knowledge on human learning.
Rule Generation and Compaction in the
, 1998
Abstract
In this paper we discuss our approach to learning classification rules from data. We sketch out two modules of our architecture, LINNEO+ and GAR. LINNEO+ is a knowledge acquisition tool for ill-structured domains that incrementally generates classes from examples using an unsupervised strategy. LINNEO+'s output, a representation of the conceptual structure of the domain in terms of classes, is the input to GAR, which is used to generate a set of classification rules for the original training set. GAR can generate both conjunctive and disjunctive rules. Herein we present an application of these techniques to data obtained from a real wastewater treatment plant in order to help construct a rule base. This rule base will be used by a knowledge-based system that aims to supervise the whole process. 1 Introduction For a long time, engineers and scientists have been developing complex models to describe the time-varying nature of environmenta...
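The class-to-rule step can be illustrated generically. The abstract does not describe GAR's algorithm in detail, so the sketch below shows only one plausible form of conjunctive rule compaction: keeping the attribute-value pairs shared by every member of a discovered class. The attribute names are invented for illustration.

```python
# Hedged sketch of conjunctive rule generation from a discovered class:
# retain only the attribute values common to all class members.
def conjunctive_rule(members):
    """Return the {attribute: value} pairs shared by every class member."""
    rule = dict(members[0])
    for m in members[1:]:
        rule = {a: v for a, v in rule.items() if m.get(a) == v}
    return rule

# Hypothetical cluster of wastewater-plant observations.
cluster = [
    {"pH": "high", "turbidity": "low", "flow": "steady"},
    {"pH": "high", "turbidity": "low", "flow": "variable"},
]
print(conjunctive_rule(cluster))  # {'pH': 'high', 'turbidity': 'low'}
```

A disjunctive rule, which GAR also supports, would instead combine several such conjunctions, one per subgroup of the class.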