Results 1-10 of 62
The Extraction of Refined Rules from Knowledge-Based Neural Networks
Machine Learning, 1993
Cited by 198 (4 self)

Abstract:
Neural networks, despite their empirically proven abilities, have been little used for the refinement of existing knowledge because this task requires a three-step process. First, knowledge in some form must be inserted into a neural network. Second, the network must be refined. Third, knowledge must be extracted from the network. We have previously described a method for the first step of this process. Standard neural learning techniques can accomplish the second step. In this paper, we propose and empirically evaluate a method for the final, and possibly most difficult, step. This method efficiently extracts symbolic rules from trained neural networks. The four major results of empirical tests of this method are that the extracted rules: (1) closely reproduce (and can even exceed) the accuracy of the network from which they are extracted; (2) are superior to the rules produced by methods that directly refine symbolic rules; (3) are superior to those produced by previous techniques fo...
Knowledge-Based Artificial Neural Networks
1994
Cited by 145 (13 self)

Abstract:
Hybrid learning methods use theoretical knowledge of a domain and a set of classified examples to develop a method for accurately classifying examples not seen during training. The challenge of hybrid learning systems is to use the information provided by one source to offset information missing from the other. By so doing, a hybrid learning system should learn more effectively than systems that use only one of the information sources. KBANN (Knowledge-Based Artificial Neural Networks) is a hybrid learning system built on top of connectionist learning techniques. It maps problem-specific "domain theories", represented in propositional logic, into neural networks and then refines this reformulated knowledge using backpropagation. KBANN is evaluated by extensive empirical tests on two problems from molecular biology. Among other results, these tests show that the networks created by KBANN generalize better than a wide variety of learning systems, as well as several t...
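The rules-to-network mapping this abstract describes can be illustrated with a minimal sketch. This is a simplification of the published KBANN algorithm (which also adds low-weight links to every unit so backpropagation can revise the theory); the rule and feature names are hypothetical:

```python
import math

W = 4.0  # weight magnitude assigned to links that encode rule antecedents

def rule_to_unit(positive, negated):
    """Map one conjunctive rule into a unit: +W links for positive
    antecedents, -W links for negated ones, and a bias chosen so the
    unit is active only when the whole conjunction is satisfied."""
    weights = {a: W for a in positive}
    weights.update({a: -W for a in negated})
    bias = (len(positive) - 0.5) * W
    return weights, bias

def activate(weights, bias, inputs):
    net = sum(w * inputs.get(a, 0.0) for a, w in weights.items()) - bias
    return 1.0 / (1.0 + math.exp(-net))  # sigmoid unit

# Hypothetical domain-theory rule: promoter :- contact, conformation.
w, b = rule_to_unit(positive=["contact", "conformation"], negated=[])
print(activate(w, b, {"contact": 1, "conformation": 1}))  # high (~0.88)
print(activate(w, b, {"contact": 1, "conformation": 0}))  # low  (~0.12)
```

With both antecedents active the net input clears the bias and the unit fires; with one missing it stays off, mimicking the propositional rule before any refinement by backpropagation.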
Extracting Comprehensible Models from Trained Neural Networks
1996
Cited by 69 (4 self)

Abstract:
To Mom, Dad, and Susan, for their support and encouragement.
A framework for combining symbolic and neural learning
1992
Cited by 56 (1 self)

Abstract:
This article describes an approach to combining symbolic and connectionist approaches to machine learning. A three-stage framework is presented and the research of several groups is reviewed with respect to this framework. The first stage involves the insertion of symbolic knowledge into neural networks, the second addresses the refinement of this prior knowledge in its neural representation, while the third concerns the extraction of the refined symbolic knowledge. Experimental results and open research issues are discussed.
Induction of Recursive Bayesian Classifiers
In P. B. Brazdil (ed.), Machine Learning: ECML-93, 1993
Cited by 49 (3 self)

Abstract:
In this paper, we review the induction of simple Bayesian classifiers, note some of their drawbacks, and describe a recursive algorithm that constructs a hierarchy of probabilistic concept descriptions. We posit that this approach should outperform the simpler scheme in domains that involve disjunctive concepts, since such concepts violate the independence assumption on which the latter relies. To test this hypothesis, we report experimental studies with both natural and artificial domains. The results are mixed, but they are encouraging enough to recommend closer examination of recursive Bayesian classifiers in future work.
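The "simple Bayesian classifier" this entry builds on can be sketched as follows, using toy data with hypothetical attribute names; the recursive variant the paper proposes would re-apply the same induction within subsets of the training cases rather than stopping at one flat classifier:

```python
import math
from collections import Counter, defaultdict

def train(examples):
    """Estimate class priors and per-class attribute-value counts."""
    class_counts = Counter(label for _, label in examples)
    value_counts = defaultdict(Counter)   # (attr, label) -> value frequencies
    values = defaultdict(set)             # attr -> set of observed values
    for features, label in examples:
        for attr, val in features.items():
            value_counts[(attr, label)][val] += 1
            values[attr].add(val)
    return class_counts, value_counts, values, len(examples)

def classify(model, features):
    """Pick the class maximizing log P(class) + sum of log P(value | class),
    with Laplace smoothing to avoid zero probabilities."""
    class_counts, value_counts, values, n = model
    best, best_score = None, float("-inf")
    for label, count in class_counts.items():
        score = math.log(count / n)
        for attr, val in features.items():
            num = value_counts[(attr, label)][val] + 1
            den = count + len(values[attr])
            score += math.log(num / den)
        if score > best_score:
            best, best_score = label, score
    return best

# Toy training set with hypothetical attributes.
data = [({"shape": "round", "color": "red"}, "apple"),
        ({"shape": "round", "color": "green"}, "apple"),
        ({"shape": "long", "color": "yellow"}, "banana")]
model = train(data)
print(classify(model, {"shape": "long", "color": "yellow"}))  # banana
```

The independence assumption is visible in the per-attribute sum of log-probabilities; disjunctive concepts break it, which is the paper's motivation for recursing.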
Symbolic knowledge extraction from trained neural networks: A sound approach
2001
Cited by 44 (6 self)

Abstract:
Although neural networks have shown very good performance in many application domains, one of their main drawbacks lies in the incapacity to provide an explanation for the underlying reasoning mechanisms. The "explanation capability" of neural networks can be achieved by the extraction of symbolic knowledge. In this paper, we present a new method of extraction that captures nonmonotonic rules encoded in the network, and prove that such a method is sound. We start by discussing some of the main problems of knowledge extraction methods. We then discuss how these problems may be ameliorated. To this end, a partial ordering on the set of input vectors of a network is defined, as well as a number of pruning and simplification rules. The pruning rules are then used to reduce the search space of the extraction algorithm during a pedagogical extraction, whereas the simplification rules are used to reduce the size of the extracted set of rules. We show that, in the case of regular networks, the extraction algorithm is sound and complete. We proceed to extend the extraction algorithm to the class of non-regular networks, the general case. We show that non-regular networks always contain regularities in their subnetworks. As a result, the underlying extraction method for regular networks can be applied, but now in a decompositional fashion. In order to combine the sets of rules extracted from each subnetwork into the final set of rules, we use a method whereby we are able to keep the soundness of the extraction algorithm. Finally, we present the results of an empirical analysis of the extraction system, using traditional examples and real-world application problems. The results have shown that a very high fidelity between the extracted set of rules and the network can be achieved....
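A pedagogical extraction, in the sense used above, treats the trained network purely as a black-box oracle. A minimal sketch, assuming a small Boolean input set and a monotone (positive-weight) network so that minimal true-sets suffice as rule bodies; the partial ordering and pruning rules in the paper exist precisely to avoid this kind of exhaustive search:

```python
from itertools import product

def pedagogical_extract(oracle, inputs):
    """Query the oracle on every Boolean input vector and return the
    minimal sets of true inputs classified as positive; each set reads
    as the body of a conjunctive rule."""
    positives = [vec for vec in product([0, 1], repeat=len(inputs))
                 if oracle(dict(zip(inputs, vec)))]
    true_sets = [{a for a, v in zip(inputs, vec) if v} for vec in positives]
    # Keep only minimal positive sets: no other positive set is a proper subset.
    return [s for s in true_sets if not any(t < s for t in true_sets)]

# Hypothetical trained network standing in as a black-box oracle:
# positive when x is active and at least one of y, z is.
net = lambda i: bool(i["x"] and (i["y"] or i["z"]))
rules = pedagogical_extract(net, ["x", "y", "z"])
print(rules)  # two rules: {x, y} and {x, z}
```

The cost is exponential in the number of inputs, which is why pedagogical methods need pruning and why decompositional extraction over subnetworks (as the paper develops) scales better.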
Learning Symbolic Rules Using Artificial Neural Networks
In Proceedings of the Tenth International Conference on Machine Learning, 1993
Cited by 41 (6 self)

Abstract:
A distinct advantage of symbolic learning algorithms over artificial neural networks is that typically the concept representations they form are more easily understood by humans. One approach to understanding the representations formed by neural networks is to extract symbolic rules from trained networks. In this paper we describe and investigate an approach for extracting rules from networks that uses (1) the N-of-M extraction algorithm, and (2) the network training method of soft weight-sharing. Previously, the N-of-M algorithm had been successfully applied only to knowledge-based neural networks. Our experiments demonstrate that our extracted rules generalize better than rules learned using the C4.5 system. In addition to being accurate, our extracted rules are also reasonably comprehensible.

1 INTRODUCTION

Artificial neural networks (ANNs) have been successfully applied to real-world problems as varied as steering a motor vehicle (Pomerleau, 1991) and learning to pronounce English tex...
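An N-of-M rule of the kind named above fires when at least N of its M antecedents hold. A minimal sketch with hypothetical antecedent names; the actual extraction algorithm derives such rules by clustering a trained unit's incoming weights into groups of similar magnitude (which soft weight-sharing encourages during training), a step not shown here:

```python
def n_of_m(n, antecedents, facts):
    """An N-of-M rule: true when at least n of the m antecedents hold."""
    return sum(bool(facts.get(a, False)) for a in antecedents) >= n

# Hypothetical extracted rule: promoter :- 2 of {minus35, minus10, conformation}.
antecedents = ["minus35", "minus10", "conformation"]
print(n_of_m(2, antecedents, {"minus35": True, "minus10": True}))  # True
print(n_of_m(2, antecedents, {"minus35": True}))                   # False
```

N-of-M rules are a natural fit for neural units because a threshold unit with equal weights computes exactly this count-and-compare, yet the rule stays far more compact than its expansion into plain conjunctions.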
Symbolic Revision of Theories with M-of-N Rules
In Proceedings of the Thirteenth International Joint Conference on Artificial Intelligence, 1993
Cited by 38 (5 self)

Abstract:
This paper presents a major revision of the Either propositional theory refinement system. Two issues are discussed. First, we show how run-time efficiency can be greatly improved by changing from an exhaustive scheme for computing repairs to an iterative greedy method. Second, we show how to extend Either to refine M-of-N rules. The resulting algorithm, Neither (New Either), is more than an order of magnitude faster and produces significantly more accurate results with theories that fit the M-of-N format. To demonstrate the advantages of Neither, we present preliminary experimental results comparing it to Either and various other systems on refining the DNA promoter domain theory.

1 Introduction

Recently, a number of machine learning systems have been developed that use examples to revise an approximate (incomplete and/or incorrect) domain theory [Ginsberg, 1990; Ourston and Mooney, 1990; Towell and Shavlik, 1991; Danyluk, 1991; Whitehall et al., 1991; Matwin and Plante, 1991]. ...
Using Knowledge-Based Neural Networks to Improve Algorithms: Refining the Chou-Fasman Algorithm for Protein Folding
Machine Learning, 1993
Cited by 35 (6 self)

Abstract:
We describe a method for using machine learning to refine algorithms represented as generalized finite-state automata. The knowledge in an automaton is translated into an artificial neural network, and then refined with backpropagation on a set of examples. Our technique for translating an automaton into a network extends KBANN, a system that translates a set of propositional rules into a corresponding neural network. The extended system, FSKBANN, allows one to refine the large class of algorithms that can be represented as state-based processes. As a test, we use FSKBANN to refine the Chou-Fasman algorithm, a method for predicting how globular proteins fold. Empirical evidence shows that the refined algorithm FSKBANN produces is statistically significantly more accurate than both the original Chou-Fasman algorithm and a neural network trained using the standard approach.

Introduction

As machine learning has been increasingly applied to complex real-world problems, many researchers have...
An Overview of Strategies for Neurosymbolic Integration
1995
Cited by 33 (1 self)

Abstract:
This paper will give an overview of the various approaches to neurosymbolic integration. Roughly, these can be divided into two strategies: unified strategies aim at attaining neural and symbolic capabilities using neural networks alone, while hybrid strategies combine neural networks with symbolic models such as expert systems, case-based reasoning systems, and decision trees. These two approaches form the main subtrees of the classification hierarchy depicted in Figure 1.

[Figure 1: Classification of integrated neurosymbolic systems.]