Results 1–10 of 27
A New Evolutionary System for Evolving Artificial Neural Networks
 IEEE Transactions on Neural Networks
, 1996
"... This paper presents a new evolutionary system, i.e., EPNet, for evolving artificial neural networks (ANNs). The evolutionary algorithm used in EPNet is based on Fogel's evolutionary programming (EP) [1], [2], [3]. Unlike most previous studies on evolving ANNs, this paper puts its emphasis on ev ..."
Abstract

Cited by 185 (35 self)
This paper presents a new evolutionary system, EPNet, for evolving artificial neural networks (ANNs). The evolutionary algorithm used in EPNet is based on Fogel's evolutionary programming (EP) [1], [2], [3]. Unlike most previous studies on evolving ANNs, this paper puts its emphasis on evolving ANN behaviours. This is one of the primary reasons why EP is adopted. Five mutation operators proposed in EPNet reflect such an emphasis on evolving behaviours. Close behavioural links between parents and their offspring are maintained by various mutations, such as partial training and node splitting. EPNet evolves ANN architectures and connection weights (including biases) simultaneously in order to reduce the noise in fitness evaluation. The parsimony of evolved ANNs is encouraged by preferring node/connection deletion to addition. EPNet has been tested on a number of benchmark problems in machine learning and ANNs, such as the parity problem and the medical diagnosis problems (bre...
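The parsimony bias this abstract describes (deletion tried before addition) can be sketched as an ordered attempt over mutation operators. This is a minimal illustration, not EPNet's actual implementation: the operator names and the caller-supplied `improves` predicate (standing in for the partial-training-based fitness evaluation) are assumptions.

```python
# Hypothetical sketch of EPNet-style mutation ordering: deletions are
# attempted before additions, so accepted mutations tend to shrink the
# network. Operator names and the `improves` predicate are assumptions.
MUTATIONS = ["partial_training", "node_deletion", "connection_deletion",
             "node_addition", "connection_addition"]

def mutate(network, improves):
    """Apply the first mutation (in parsimony-biased order) that helps.

    `improves(network, op)` stands in for the fitness check the paper
    performs via partial training of the offspring.
    """
    for op in MUTATIONS:
        if improves(network, op):
            return op
    return None  # no mutation accepted; parent survives unchanged
```

Because additions sit at the end of the list, a network only grows when no behaviour-preserving deletion or retraining already improves fitness.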
Making Use of Population Information in Evolutionary Artificial Neural Networks
, 1998
"... This paper is concerned with the simultaneous evolution of artificial neural network (ANN) architectures and weights. The current practice in evolving ANNs is to choose the best ANN in the last generation as the final result. This paper proposes a different approach to form the final result by combi ..."
Abstract

Cited by 86 (25 self)
This paper is concerned with the simultaneous evolution of artificial neural network (ANN) architectures and weights. The current practice in evolving ANNs is to choose the best ANN in the last generation as the final result. This paper proposes a different approach to form the final result by combining all the individuals in the last generation in order to make best use of all the information contained in the whole population. This approach regards a population of ANNs as an ensemble and uses a combination method to integrate them. Although there has been some work on integrating ANN modules [2], [3], little has been done in evolutionary learning to make best use of its population information. Four linear combination methods have been investigated in this paper to illustrate our ideas. Three real world data sets have been used in our experimental studies, which show that the recursive least square (RLS) algorithm always produces an integrated system that outperforms the best individua...
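The recursive least squares (RLS) combination the abstract singles out can be sketched as a generic pure-Python RLS fit of linear combination weights over the ensemble members' outputs. This is a textbook RLS recursion, not the authors' code; the initialisation constant `delta` is an assumption.

```python
# Sketch: learn linear weights for combining ensemble outputs with the
# standard RLS recursion (forgetting factor 1). Each row of `outputs`
# holds one output per ensemble member for one training example.
def rls_fit(outputs, targets, delta=100.0):
    n = len(outputs[0])                       # number of ensemble members
    w = [0.0] * n                             # combination weights
    P = [[delta if i == j else 0.0 for j in range(n)] for i in range(n)]
    for x, y in zip(outputs, targets):
        Px = [sum(P[i][j] * x[j] for j in range(n)) for i in range(n)]
        denom = 1.0 + sum(x[i] * Px[i] for i in range(n))
        k = [Px[i] / denom for i in range(n)]  # gain vector
        err = y - sum(w[i] * x[i] for i in range(n))
        for i in range(n):
            w[i] += k[i] * err
        xP = [sum(x[i] * P[i][j] for i in range(n)) for j in range(n)]
        for i in range(n):
            for j in range(n):
                P[i][j] -= k[i] * xP[j]
    return w
```

On data whose target is an exact linear blend of the members' outputs, the recursion recovers the blending weights (up to the small regularisation implied by `delta`).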
Multicategory Classification by Support Vector Machines
 Computational Optimizations and Applications
, 1999
"... We examine the problem of how to discriminate between objects of three or more classes. Specifically, we investigate how twoclass discrimination methods can be extended to the multiclass case. We show how the linear programming (LP) approaches based on the work of Mangasarian and quadratic programm ..."
Abstract

Cited by 73 (0 self)
We examine the problem of how to discriminate between objects of three or more classes. Specifically, we investigate how two-class discrimination methods can be extended to the multiclass case. We show how the linear programming (LP) approaches based on the work of Mangasarian and quadratic programming (QP) approaches based on Vapnik's Support Vector Machines (SVM) can be combined to yield two new approaches to the multiclass problem. In LP multiclass discrimination, a single linear program is used to construct a piecewise-linear classification function. In our proposed multiclass SVM method, a single quadratic program is used to construct a piecewise-nonlinear classification function. Each piece of this function can take the form of a polynomial, radial basis function, or even a neural network. For k > 2 class problems, the SVM method as originally proposed required the construction of a two-class SVM to separate each class from the remaining classes. Similarly, k two-class linear programs can be used for the multiclass problem. We performed an empirical study of the original LP method, the proposed k-LP method, the proposed single-QP method and the original k-QP methods. We discuss the advantages and disadvantages of each approach.
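At prediction time, the "one class versus the rest" decomposition described for the original k-machine approach reduces to taking the most confident binary machine. A minimal sketch follows; the 1-D decision functions are toy stand-ins for trained SVM or LP classifiers, not anything from the paper.

```python
# Sketch of one-vs-rest multiclass prediction over k two-class machines.
def one_vs_rest_predict(scorers, x):
    """`scorers` maps each class label to its class-vs-rest decision
    function; the label with the largest decision value wins."""
    return max(scorers, key=lambda label: scorers[label](x))

# Toy 1-D decision functions (illustrative only, not trained SVMs).
scorers = {
    "low":  lambda x: 1.0 - x,             # confident for small x
    "mid":  lambda x: 1.0 - abs(x - 2.0),  # confident near x = 2
    "high": lambda x: x - 3.0,             # confident for large x
}
```

The single-QP method in the abstract avoids this decomposition by solving for all k pieces in one optimisation problem.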
An Efficient Method to Construct a Radial Basis Function Neural Network Classifier
, 1997
"... Radial basis function neural network(RBFN) has the power of the universal function approximation. But it is usually not straightforward how to construct an RBFN to solve a given problem. This paper describes a method to construct an RBFN classifier efficiently and effectively. The method determines ..."
Abstract

Cited by 27 (1 self)
Radial basis function neural networks (RBFNs) have the power of universal function approximation, but it is usually not straightforward to construct an RBFN to solve a given problem. This paper describes a method to construct an RBFN classifier efficiently and effectively. The method determines the middle-layer neurons by a fast clustering algorithm and computes the optimal weights between the middle and output layers statistically. We applied the proposed method to construct an RBFN classifier for unconstrained handwritten digit recognition. The experiment showed that the method could construct an RBFN classifier quickly, and the performance of the classifier was better than the best result previously reported. Keywords: Radial Basis Function, Linear Discriminant Function, Classification, APCIII, Clustering, GRBF, LMS, Handwritten Digit Recognition
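The two-stage construction described here (clustering picks the hidden units, then the output weights are solved for directly) can be sketched as follows. The centers are assumed already chosen by the clustering step; the 1-D inputs, width, and 2-center setup are all illustrative assumptions, not the paper's configuration.

```python
import math

# Sketch of two-stage RBF construction: Gaussian hidden units at given
# centers, then output weights from the normal equations (Phi^T Phi) w
# = Phi^T y, solved here for 2 basis functions via Cramer's rule.
def design_matrix(xs, centers, width):
    return [[math.exp(-((x - c) ** 2) / (2.0 * width ** 2))
             for c in centers] for x in xs]

def fit_output_weights(phi, y):
    g = [[sum(phi[r][i] * phi[r][j] for r in range(len(phi)))
          for j in range(2)] for i in range(2)]
    b = [sum(phi[r][i] * y[r] for r in range(len(phi))) for i in range(2)]
    det = g[0][0] * g[1][1] - g[0][1] * g[1][0]
    return [(b[0] * g[1][1] - b[1] * g[0][1]) / det,
            (b[1] * g[0][0] - b[0] * g[1][0]) / det]
```

With as many well-placed centers as training points, the network interpolates the targets exactly; in practice the clustering step uses far fewer centers than points.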
Ensemble Structure of Evolutionary Artificial Neural Networks
 in Proc. of the 1996 IEEE Int'l Conf. on Evolutionary Computation (ICEC'96)
, 1996
"... Evolutionary artificial neural networks (EANNs) refer to a special class of artificial neural networks (ANNs) in which evolution is another fundamental form of adaptation in addition to learning. Evolution can be introduced at various levels of ANNs. It can be used to evolve weights, architectures, ..."
Abstract

Cited by 25 (13 self)
Evolutionary artificial neural networks (EANNs) refer to a special class of artificial neural networks (ANNs) in which evolution is another fundamental form of adaptation in addition to learning. Evolution can be introduced at various levels of ANNs. It can be used to evolve weights, architectures, and learning parameters and rules. This paper is concerned with the evolution of ANN architectures, where an evolutionary algorithm is used to evolve a population of ANNs. The current practice in evolving ANNs is to choose the best ANN in the last population as the final result. This paper proposes a novel approach to form the final result by combining all the individuals in the last generation in order to make best use of all the information contained in the whole population. This approach regards a population of ANNs as an ensemble of ANNs and uses a method to combine them. We have used four simple methods in our computational studies. The first is the majority voting method. The second and...
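The first combination method the abstract names, majority voting, is simple enough to sketch in a few lines; the per-member predictions here are placeholders for the outputs of the evolved networks.

```python
from collections import Counter

# Sketch of majority-voting combination: each evolved network votes a
# class label and the commonest label wins. Tie-breaking follows
# Counter's insertion order, which is an implementation choice here.
def majority_vote(predictions):
    """`predictions` is a list of class labels, one per ensemble member."""
    return Counter(predictions).most_common(1)[0][0]
```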
Geometry in Learning
 In Geometry at Work
, 1997
"... One of the fundamental problems in learning is identifying members of two different classes. For example, to diagnose cancer, one must learn to discriminate between benign and malignant tumors. Through examination of tumors with previously determined diagnosis, one learns some function for distingui ..."
Abstract

Cited by 19 (6 self)
One of the fundamental problems in learning is identifying members of two different classes. For example, to diagnose cancer, one must learn to discriminate between benign and malignant tumors. Through examination of tumors with previously determined diagnosis, one learns some function for distinguishing the benign and malignant tumors. Then the acquired knowledge is used to diagnose new tumors. The perceptron is a simple biologically inspired model for this two-class learning problem. The perceptron is trained or constructed using examples from the two classes. Then the perceptron is used to classify new examples. We describe geometrically what a perceptron is capable of learning. Using duality, we develop a framework for investigating different methods of training a perceptron. Depending on how we define the "best" perceptron, different minimization problems are developed for training the perceptron. The effectiveness of these methods is evaluated empirically on four practical applic...
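The classical perceptron training rule behind this discussion can be sketched as follows; the 2-D data and epoch count are illustrative assumptions, not drawn from the paper's four applications.

```python
# Sketch of classical perceptron training for the two-class problem.
# Labels are in {-1, +1}; on a misclassified example the weight vector
# is nudged toward (or away from) that example.
def train_perceptron(data, epochs=20, lr=1.0):
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, y in data:
            if y * (w[0] * x[0] + w[1] * x[1] + b) <= 0:  # misclassified
                w = [w[i] + lr * y * x[i] for i in range(2)]
                b += lr * y
    return w, b
```

On linearly separable data this rule converges to a separating hyperplane; the paper's point is that different definitions of the "best" separator lead to different minimization problems than this simple update.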
An Efficient MDL-Based Construction of RBF Networks
, 1998
"... We propose a method for optimizing the complexity of Radial Basis Function (RBF) networks. The method involves two procedures: adaptation (training) and selection. The first procedure adaptively changes the locations and the width of the basis functions and trains the linear weights. The selectio ..."
Abstract

Cited by 14 (2 self)
We propose a method for optimizing the complexity of Radial Basis Function (RBF) networks. The method involves two procedures: adaptation (training) and selection. The first procedure adaptively changes the locations and the width of the basis functions and trains the linear weights. The selection procedure performs the elimination of the redundant basis functions using an objective function based on the Minimum Description Length (MDL) principle. By iteratively combining these two procedures we achieve a controlled way of training and modifying RBF networks, which balances accuracy, training time, and complexity of the resulting network. We test the proposed method on function approximation and classification tasks, and compare it to some other recently proposed methods.
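The selection procedure's idea, that a basis function is eliminated when the shorter model description pays for any growth in the data description, can be sketched with a generic MDL-style cost. The specific code lengths below (Gaussian residual coding, fixed bits per parameter) are assumptions, not the paper's formulas.

```python
import math

# Sketch of an MDL-style pruning criterion: total description length is
# a data-fit term plus a model-complexity term. The exact cost terms
# here are assumed stand-ins for the paper's actual code lengths.
def mdl_cost(sse, n_params, n_samples, bits_per_param=8.0):
    data_cost = 0.5 * n_samples * math.log2(sse / n_samples + 1e-12)
    model_cost = bits_per_param * n_params
    return data_cost + model_cost

def should_prune(sse_full, sse_pruned, n_params, n_samples):
    """Drop one basis function iff total description length does not grow."""
    return (mdl_cost(sse_pruned, n_params - 1, n_samples)
            <= mdl_cost(sse_full, n_params, n_samples))
```

Iterating this test over candidate basis functions, interleaved with retraining, gives the controlled accuracy/complexity trade-off the abstract describes.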
Evolving Artificial Neural Networks for Medical Applications
 in Proc. of 1995 Australia-Korea Joint Workshop on Evolutionary Computation
, 1995
"... Artificial neural network (ANN) architecture design has been one of the most tedious and difficult tasks in ANN applications due to the lack of satisfactory and systematic methods of designing a near optimal architecture. Evolutionary algorithms have been shown to be very effective in evolving novel ..."
Abstract

Cited by 9 (8 self)
Artificial neural network (ANN) architecture design has been one of the most tedious and difficult tasks in ANN applications due to the lack of satisfactory and systematic methods of designing a near-optimal architecture. Evolutionary algorithms have been shown to be very effective in evolving novel ANN architectures for various problems. This paper proposes a new method for evolving ANN architectures and weights at the same time. The new method has been applied to four real-world data sets in the medical domain and achieved very good results. The traditional trial-and-error approach to designing ANNs has been replaced by an automatic evolutionary system which can find a near-optimal architecture and connection weights for a problem. 1 Introduction Artificial neural networks (ANNs) have been used widely in many application areas in recent years. Most applications use feedforward ANNs and the backpropagation (BP) training algorithm. There are numerous variants of the classical BP alg...
Artificial Neural Networks – A Science in Trouble
 SIGKDD Explorations, ACM
, 2000
"... This article points out some very serious misconceptions about the brain in connectionism and artificial neural networks. Some of the connectionist ideas have been shown to have logical flaws, while others are inconsistent with some commonly observed human learning processes and behavior. For exampl ..."
Abstract

Cited by 7 (0 self)
This article points out some very serious misconceptions about the brain in connectionism and artificial neural networks. Some of the connectionist ideas have been shown to have logical flaws, while others are inconsistent with some commonly observed human learning processes and behavior. For example, the connectionist ideas have absolutely no provision for learning from stored information, something that humans do all the time. The article also argues that there is definitely a need for some new ideas about the internal mechanisms of the brain. It points out that a very convincing argument can be made for a "control theoretic" approach to understanding the brain. A "control theoretic" approach is actually used in all connectionist and neural network algorithms and it can also be justified from recent neurobiological evidence. A control theoretic approach proposes that there are subsystems within the brain that control other subsystems. Hence a similar approach can be taken in constructing learning algorithms and other intelligent systems.
The Extraction And Comparison Of Knowledge From Local Function Networks
"... this paper is that local function networks such as radial basis function (RBF) networks have a suitable architecture based on Gaussian functions that is amenable to rule extraction ..."
Abstract

Cited by 4 (2 self)
this paper is that local function networks such as radial basis function (RBF) networks have a suitable architecture based on Gaussian functions that is amenable to rule extraction