Results 1–10 of 48
A New Evolutionary System for Evolving Artificial Neural Networks
IEEE Transactions on Neural Networks, 1996
Cited by 185 (35 self)
This paper presents a new evolutionary system, EPNet, for evolving artificial neural networks (ANNs). The evolutionary algorithm used in EPNet is based on Fogel's evolutionary programming (EP) [1], [2], [3]. Unlike most previous studies on evolving ANNs, this paper puts its emphasis on evolving ANNs' behaviours. This is one of the primary reasons why EP is adopted. The five mutation operators proposed in EPNet reflect this emphasis on evolving behaviours. Close behavioural links between parents and their offspring are maintained by various mutations, such as partial training and node splitting. EPNet evolves ANN architectures and connection weights (including biases) simultaneously in order to reduce the noise in fitness evaluation. The parsimony of evolved ANNs is encouraged by preferring node/connection deletion to addition. EPNet has been tested on a number of benchmark problems in machine learning and ANNs, such as the parity problem and medical diagnosis problems (bre...
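The parsimony bias described in the abstract (preferring deletion to addition) can be sketched as follows. This is a hypothetical simplification in Python, not EPNet's actual operator set; the network encoding, the `evaluate` function, and the acceptance rule are all assumptions for illustration:

```python
import random

def mutate_parsimonious(network, evaluate, rng=random.Random(0)):
    """Toy structural mutation that tries a deletion first and only
    falls back to adding a connection when deletion hurts fitness
    (lower evaluate() value = better)."""
    baseline = evaluate(network)
    if network["connections"]:
        # Try removing a connection first to encourage smaller networks.
        candidate = dict(network, connections=network["connections"][:-1])
        if evaluate(candidate) <= baseline:  # no worse -> keep the smaller net
            return candidate
    # Deletion was impossible or harmful: add a random connection instead.
    new_conn = (rng.randrange(10), rng.randrange(10))
    return dict(network, connections=network["connections"] + [new_conn])
```

With a fitness that simply counts connections, deletion is always accepted, so the mutated network shrinks by one connection.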
On the computation of all global minimizers through particle swarm optimization
IEEE Transactions on Evolutionary Computation, 2004
Cited by 63 (17 self)
Abstract—This paper presents approaches for effectively computing all global minimizers of an objective function. The approaches include transformations of the objective function through the recently proposed deflection and stretching techniques, as well as a repulsion source at each detected minimizer. These techniques are incorporated in the context of the particle swarm optimization (PSO) method, resulting in an efficient algorithm that is able to avoid previously detected solutions and thus detect all global minimizers of a function. Experimental results on benchmark problems originating from the fields of global optimization, dynamical systems, and game theory are reported, and conclusions are derived. Index Terms—Deflection technique, detecting all minimizers, dynamical systems, Nash equilibria, particle swarm optimization (PSO), periodic orbits, stretching technique.
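The deflection technique named in the abstract can be illustrated with a small sketch: dividing the objective by tanh of the distance to each detected minimizer inflates the landscape around already-found solutions so the swarm is repelled from them. The single shared parameter `lam` and the assumption that f is positive near its minimizers are simplifications for illustration:

```python
import math

def deflect(f, minimizers, lam=1.0):
    """Return T(x) = f(x) / prod_i tanh(lam * ||x - x_i||).

    Each factor tends to 1 far from a detected minimizer x_i (leaving f
    unchanged) and to 0 near it (inflating T); T is undefined exactly at
    a detected minimizer, which is the intended repulsion behaviour.
    """
    def T(x):
        val = f(x)
        for m in minimizers:
            dist = math.sqrt(sum((a - b) ** 2 for a, b in zip(x, m)))
            val /= math.tanh(lam * dist)
        return val
    return T
```

A swarm minimizing the deflected function T instead of f will then avoid the recorded minimizers and move toward undetected ones.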
Statistical Control of RBF-like Networks for Classification
In 7th International Conference on Artificial Neural Networks, 1997
Cited by 30 (14 self)
Incremental Net Pro (IncNet Pro), with local learning and statistically controlled growing and pruning of the network, is introduced. The architecture of the net is based on RBF networks. The Extended Kalman Filter algorithm and a new fast version of it are proposed and used as the learning algorithm. IncNet Pro is similar to the Resource-Allocating Network described by Platt in its main idea of expanding the network. A novel statistical criterion is used to determine the growing point. Biradial functions are used instead of radial basis functions to obtain a more flexible network.

1 Introduction

The Radial Basis Function (RBF) networks [13,12] were designed as a solution to an approximation problem in multidimensional spaces. The typical form of the RBF network can be written as

f(x; w, p) = Σ_{i=1}^{M} w_i G_i(||x||_i; p_i)    (1)

where M is the number of neurons in the hidden layer, G_i(||x||_i; p_i) is the i-th radial basis function, and p_i are adjustable parameters such as...
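Equation (1) can be evaluated directly. The sketch below assumes Gaussian basis functions G_i(r) = exp(−r²/2σ_i²) centred at points c_i, whereas the paper substitutes biradial functions; the basis choice and the parameter names are assumptions for illustration:

```python
import math

def rbf_output(x, weights, centers, widths):
    """Evaluate f(x) = sum_i w_i * G_i(||x - c_i||; sigma_i) for a
    network with Gaussian basis functions (an illustrative choice)."""
    total = 0.0
    for w, c, s in zip(weights, centers, widths):
        # Squared Euclidean distance from the input to the i-th center.
        r2 = sum((xi - ci) ** 2 for xi, ci in zip(x, c))
        total += w * math.exp(-r2 / (2.0 * s * s))
    return total
```

At a basis-function center the Gaussian contributes its full weight; the contribution decays with distance, which is what makes the representation local.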
Trace-Based Methods for Solving Nonlinear Global Optimization and Satisfiability Problems
J. of Global Optimization, 1996
Cited by 17 (5 self)
In this paper we present a method called NOVEL (Nonlinear Optimization via External Lead) for solving continuous and discrete global optimization problems. NOVEL addresses the balance between global search and local search, using a trace to aid in identifying promising regions before committing to local searches. We discuss NOVEL for solving continuous constrained optimization problems and show how it can be extended to solve constraint satisfaction and discrete satisfiability problems. We first transform the problem using Lagrange multipliers into an unconstrained version. Since a stable solution in a Lagrangian formulation only guarantees a local optimum satisfying the constraints, we propose a global search phase in which an aperiodic and bounded trace function is added to the search to first identify promising regions for local search. The trace generates an information-bearing trajectory from which good starting points are identified for further local searches. Taking only a sm...
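The trace-based global search phase can be caricatured as: follow a bounded, aperiodic trajectory, evaluate the objective along it, and hand the best points to a local search. The trajectory below (sums of sines with incommensurate frequencies) is an illustrative stand-in, not NOVEL's actual trace function, and all parameter names are assumptions:

```python
import math

def trace_starting_points(f, n_steps=200, n_starts=3, dim=2):
    """Sample a bounded, aperiodic trajectory and return the n_starts
    trace points with the lowest objective value as local-search seeds."""
    pts = []
    for k in range(n_steps):
        t = 0.1 * k
        # Incommensurate frequencies (1 vs sqrt(2)) keep the path aperiodic;
        # sines keep every coordinate bounded.
        x = tuple(math.sin((i + 1) * t) + 0.5 * math.sin(math.sqrt(2) * (i + 1) * t)
                  for i in range(dim))
        pts.append(x)
    pts.sort(key=f)          # best objective values first
    return pts[:n_starts]
```

Each returned point would then seed an ordinary local optimizer, so the expensive local searches are spent only on regions the trace found promising.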
Global Search Methods For Solving Nonlinear Optimization Problems
1997
Cited by 17 (1 self)
... these new methods, we develop a prototype, called Novel (Nonlinear Optimization Via External Lead), that solves nonlinear constrained and unconstrained problems in a unified framework. We show experimental results in applying Novel to solve nonlinear optimization problems, including (a) the learning of feedforward neural networks, (b) the design of quadrature-mirror-filter digital filter banks, (c) the satisfiability problem, (d) the maximum satisfiability problem, and (e) the design of multiplierless quadrature-mirror-filter digital filter banks. Our method achieves better solutions than existing methods, or achieves solutions of the same quality at a lower cost.
A Study of the Lamarckian Evolution of Recurrent Neural Networks
1999
Cited by 14 (1 self)
Many frustrating experiences have been encountered when the training of neural networks by local search methods becomes stagnant at local optima. This calls for the development of more satisfactory search methods such as evolutionary search. However, training by evolutionary search can require a long computation time. In certain situations, using Lamarckian evolution, local search and evolutionary search can complement each other to yield a better training algorithm. This paper demonstrates the potential of this evolutionary-learning synergy by applying it to train recurrent neural networks in an attempt to resolve a long-term dependency problem and the inverted pendulum problem. This work also aims at investigating the interaction between local search and evolutionary search when they are combined. It is found that the combinations are particularly efficient when the local search is simple. In the case where no teacher signal is available for the local search to learn the desired task...
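The Lamarckian combination of local and evolutionary search can be sketched in toy form: refine each individual by local search, write the refinement back into the genome, then select and mutate. The real-valued genome, the halving "local search" used in the usage example, and the truncation-selection scheme are illustrative assumptions, not the paper's setup:

```python
import random

def lamarckian_step(population, fitness, local_search, rng=random.Random(1)):
    """One generation of Lamarckian evolution on a list of real-valued
    genomes: improvements acquired by local search are inherited."""
    refined = [local_search(ind) for ind in population]  # write-back step
    refined.sort(key=fitness)                            # lower fitness = better
    survivors = refined[: len(refined) // 2]
    # Refill the population by Gaussian mutation of the survivors.
    children = [s + rng.gauss(0.0, 0.1) for s in survivors]
    return survivors + children
```

For example, with `fitness=abs` and a local search that halves each genome, one step both shrinks the best value and keeps that improvement heritable.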
Unsupervised and Supervised Data Classification via Nonsmooth and Global Optimization
Top, 2003
Cited by 13 (5 self)
We examine various methods for data clustering and data classification that are based on the minimization of the so-called cluster function and its modifications. These functions are nonsmooth and nonconvex. We use Discrete Gradient methods for their local minimization. We consider also a...
Computer-aided diagnostic scheme for distinction between benign and malignant nodules in thoracic low-dose CT by use of massive training artificial neural network
IEEE Transactions on Medical Imaging, 2005
Cited by 12 (3 self)
Abstract—Low-dose helical computed tomography (LDCT) is being applied as a modality for lung cancer screening. It may be difficult, however, for radiologists to distinguish malignant from benign nodules in LDCT. Our purpose in this study was to develop a computer-aided diagnostic (CAD) scheme for distinction between benign and malignant nodules in LDCT scans by use of a massive training artificial neural network (MTANN). The MTANN is a trainable, highly nonlinear filter based on an artificial neural network. To distinguish malignant nodules from six different types of benign nodules, we developed multiple MTANNs (multi-MTANN) consisting of six expert MTANNs that are arranged in parallel. Each of the MTANNs was trained by use of input CT images and teaching images containing the estimate of the distribution for the "likelihood of being a malignant nodule," i.e., the teaching image for a malignant nodule contains a two-di...
Non-uniform Small-gain Theorems for Systems with Unstable Invariant Sets
2006
Cited by 12 (9 self)
We consider the problem of asymptotic convergence to invariant sets in interconnected nonlinear dynamic systems. Standard approaches often require that the invariant sets be uniformly attracting, e.g. stable in the Lyapunov sense. This, however, is neither a necessary requirement, nor is it always useful. Systems may, for instance, be inherently unstable (e.g. intermittent, itinerant, metastable), or the problem statement may include requirements that cannot be satisfied with stable solutions. This is often the case in general optimization problems and in nonlinear parameter identification or adaptation. Conventional techniques for these cases rely either on detailed knowledge of the system's vector fields or require boundedness of its states. The presently proposed method relies only on estimates of the input-output maps and steady-state characteristics. The method requires the possibility of representing the system as an interconnection of a stable, contracting part and an unstable, exploratory part. We illustrate with examples how the method can be applied to problems of analyzing the asymptotic behavior of locally unstable systems, as well as to problems of parameter identification and adaptation in the presence of nonlinear parametrizations. The relation of our results to conventional small-gain theorems is discussed.
Evolving Modular Neural Networks Which Generalise Well
Proceedings of the IEEE International Conference on Artificial Life and Robotics (AROB III), 1997
Cited by 11 (0 self)
In dealing with complex problems, a monolithic neural network often becomes too large and complex to design and manage. The only practical way is to design modular neural network systems consisting of simple modules. While there has been a lot of work on combining different modules in a modular system in the fields of neural networks, statistics, and machine learning, little work has been done on how to design those modules automatically and how to exploit the interaction between individual module design and module combination. This paper proposes an evolutionary approach to designing modular neural networks. The approach addresses the issue of automatic determination of the number of individual modules and the exploitation of the interaction between individual module design and module combination. The relationship among different modules is considered during the module design. This is quite different from the conventional approach where the module design is separated from the module c...