Results 1–10 of 25
Growing Cell Structures - A Self-Organizing Network for Unsupervised and Supervised Learning
Neural Networks, 1993
Abstract

Cited by 275 (11 self)
We present a new self-organizing neural network model having two variants. The first variant performs unsupervised learning and can be used for data visualization, clustering, and vector quantization. The main advantage over existing approaches, e.g., the Kohonen feature map, is the ability of the model to automatically find a suitable network structure and size. This is achieved through a controlled growth process which also includes occasional removal of units. The second variant of the model is a supervised learning method which results from the combination of the above-mentioned self-organizing network with the radial basis function (RBF) approach. In this model it is possible, in contrast to earlier approaches, to perform the positioning of the RBF units and the supervised training of the weights in parallel. Therefore, the current classification error can be used to determine where to insert new RBF units. This leads to small networks which generalize very well. Results on the t...
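The growth idea sketched in this abstract (let units accumulate local error, then insert a new unit where the accumulated error is largest) can be illustrated in a few lines. This is a hedged toy sketch, not the paper's algorithm: Growing Cell Structures also maintains an explicit cell topology, adapts the winner's topological neighbors, and occasionally removes units; all parameter values below are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)

def grow_net(data, n_steps=2000, insert_every=200, eps=0.1):
    # Start with two random units; each unit tracks accumulated error.
    units = data[rng.choice(len(data), 2, replace=False)].astype(float)
    error = np.zeros(len(units))
    for t in range(1, n_steps + 1):
        x = data[rng.integers(len(data))]
        d = np.linalg.norm(units - x, axis=1)
        win = int(np.argmin(d))
        error[win] += d[win] ** 2             # accumulate quantization error
        units[win] += eps * (x - units[win])  # move winner toward the input
        if t % insert_every == 0:
            q = int(np.argmax(error))         # worst-performing unit
            nb = int(np.argsort(np.linalg.norm(units - units[q], axis=1))[1])
            units = np.vstack([units, 0.5 * (units[q] + units[nb])])
            error[q] *= 0.5                   # split error with the new unit
            error = np.append(error, error[q])
    return units

data = rng.normal(size=(500, 2))
units = grow_net(data)   # 2 initial units + 10 insertions = 12 units
```

Because insertion is driven by accumulated error rather than a fixed schedule of map sizes, the network ends up denser where the data are harder to quantize, which is the property the abstract highlights.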
Self-Organizing Maps: Ordering, Convergence Properties and Energy Functions
Biological Cybernetics, 1992
Abstract

Cited by 104 (2 self)
We investigate the convergence properties of the self-organizing feature map algorithm for a simple, but very instructive case: the formation of a topographic representation of the unit interval [0, 1] by a linear chain of neurons. We extend the proofs of convergence of Kohonen and of Cottrell and Fort to hold in any case where the neighborhood function, which is used to scale the change in the weight values at each neuron, is a monotonically decreasing function of distance from the winner neuron. We prove that the learning dynamics cannot be described by a gradient descent on a single energy function, but may be described using a set of potential functions, one for each neuron, which are independently minimized following a stochastic gradient descent. We derive the correct potential functions for the one- and multi-dimensional case, and show that the energy functions given by Tolat (1990) are an approximation which is no longer valid in the case of highly disordered maps or steep neig...
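The setting analyzed above, a linear chain of neurons mapping the unit interval with a neighborhood function that decreases monotonically with lattice distance from the winner, can be sketched as follows. This is an illustration only; the learning rate, Gaussian neighborhood, and iteration count are arbitrary choices, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def chain_som(n_neurons=10, n_steps=5000, eps=0.05, sigma=1.0):
    """1-D SOM mapping [0, 1] onto a linear chain of neurons. The Gaussian
    neighborhood h is a monotonically decreasing function of lattice distance
    from the winner, the case covered by the convergence proofs above."""
    w = rng.uniform(0.0, 1.0, n_neurons)       # random initial weights
    idx = np.arange(n_neurons)
    for _ in range(n_steps):
        x = rng.uniform(0.0, 1.0)              # input drawn from [0, 1]
        win = int(np.argmin(np.abs(w - x)))    # winner neuron
        h = np.exp(-((idx - win) ** 2) / (2 * sigma ** 2))
        w += eps * h * (x - w)                 # neighborhood-scaled update
    return w

w = chain_som()
# After training the weights typically become monotonically ordered along
# the chain, i.e. a topographic map of [0, 1] has formed.
```

Note that the per-neuron update `eps * h * (x - w)` is exactly the stochastic-gradient step on that neuron's own potential function; there is no single global energy being descended, which is the paper's main point.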
Neural Networks for Combinatorial Optimization: A Review of More Than a Decade of Research
1999
Abstract

Cited by 34 (0 self)
This article briefly summarizes the work that has been done and presents the current standing of neural networks for combinatorial optimization by considering each of the major classes of combinatorial optimization problems. Areas which have not yet been studied are identified for future research.
The Traveling Salesman Problem: A Neural Network Perspective
Abstract

Cited by 20 (0 self)
This paper surveys the "neurally" inspired problem-solving approaches to the traveling salesman problem, namely, the Hopfield-Tank network, the elastic net, and the self-organizing map. The latest achievements in the neural network domain are reported and numerical comparisons are provided with the classical solution approaches of operations research. An extensive bibliography with more than one hundred references is also included.
Fast Self-Organizing Feature Map Algorithm
2000
Abstract

Cited by 18 (0 self)
We present an efficient approach to forming feature maps. The method involves three stages. In the first stage, we use the k-means algorithm to select k (i.e., the size of the feature map to be formed) cluster centers from a data set. Then a heuristic assignment strategy is employed to organize the selected data points into a neural array so as to form an initial feature map. If the initial map is not good enough, it will be fine-tuned by the traditional Kohonen self-organizing feature map (SOM) algorithm under a fast cooling regime in the third stage. With this three-stage method, a topologically ordered feature map is formed very quickly, instead of requiring a huge number of iterations to fine-tune the weights toward the density distribution of the data points, as usually happens with the conventional SOM algorithm. Three data sets are utilized to illustrate the proposed method.
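The three stages can be sketched roughly as follows. This is a hedged illustration: the grid-assignment heuristic used here (sorting centers by their first coordinate) is a crude stand-in for the paper's assignment strategy, and the cooling schedule and all parameter values are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(2)

def fast_som(data, grid=(4, 4), km_iters=20, som_iters=500):
    k = grid[0] * grid[1]
    # Stage 1: plain k-means picks k cluster centers.
    centers = data[rng.choice(len(data), k, replace=False)].astype(float)
    for _ in range(km_iters):
        lab = np.argmin(((data[:, None] - centers) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(lab == j):
                centers[j] = data[lab == j].mean(axis=0)
    # Stage 2: arrange the centers on the grid (crude stand-in heuristic).
    order = np.argsort(centers[:, 0])
    w = centers[order].reshape(grid[0], grid[1], -1)
    coords = np.stack(np.meshgrid(np.arange(grid[0]), np.arange(grid[1]),
                                  indexing="ij"), axis=-1)
    # Stage 3: Kohonen fine-tuning under a fast cooling regime.
    for t in range(som_iters):
        x = data[rng.integers(len(data))]
        win = np.unravel_index(np.argmin(((w - x) ** 2).sum(-1)), grid)
        sigma = 2.0 * (1.0 - t / som_iters) + 0.1   # rapidly shrinking width
        h = np.exp(-((coords - np.array(win)) ** 2).sum(-1) / (2 * sigma**2))
        w += 0.1 * h[..., None] * (x - w)
    return w

data = rng.normal(size=(300, 2))
w = fast_som(data)   # a grid-shaped map of weight vectors
```

The point of the design is that stage 3 starts from weights already near the data density, so only a short, fast-cooling SOM run is needed rather than the long annealing of a cold-started map.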
Neural techniques for combinatorial optimization with applications
IEEE Transactions on Neural Networks, 1998
Abstract

Cited by 18 (2 self)
After more than a decade of research, there now exist several neural-network techniques for solving NP-hard combinatorial optimization problems. Hopfield networks and self-organizing maps are the two main categories into which most of the approaches can be divided. Criticism of these approaches includes the tendency of the Hopfield network to produce infeasible solutions, and the lack of generalizability of the self-organizing approaches (being only applicable to Euclidean problems). This paper proposes two new techniques which overcome these pitfalls: a Hopfield network which ensures feasibility of the solutions and improves solution quality through escape from local minima, and a self-organizing neural network which generalizes to solve a broad class of combinatorial optimization problems. Two sample practical optimization problems from Australian industry are then used to test the performance of the neural techniques against more traditional heuristic solutions. Index Terms: assembly line, combinatorial optimization, Hopfield networks, hub location, NP-hard, self-organization, sequencing,
Neural Networks in Business: Techniques and Applications for the Operations Researcher
2000
Abstract

Cited by 17 (0 self)
This paper presents an overview of the different types of neural network models which are applicable when solving business problems. The history of neural networks in business is outlined, leading to a discussion of the current applications in business including data mining, as well as the current research directions. The role of neural networks as a modern operations research tool is discussed. Scope and purpose: Neural networks are becoming increasingly popular in business. Many organisations are investing in neural network and data mining solutions to problems which have traditionally fallen under the responsibility of operations research. This article provides an overview for the operations research reader of the basic neural network techniques, as well as their historical and current use in business. The paper is intended as an introductory article for the remainder of this special issue on neural networks in business. © 2000 Elsevier Science Ltd. All rights reserved. Keywords: N...
Applications of the self-organising map to reinforcement learning
Neural Networks, 2002
Abstract

Cited by 17 (0 self)
Running Title: Applying the SOM to reinforcement learning
A self-organizing neural network for the traveling salesman problem that is competitive with simulated annealing
Neural Computation, 1996
Abstract

Cited by 14 (1 self)
We present and analyze a Self-Organizing Feature Map (SOFM) for the NP-complete problem of the travelling salesman (TSP): finding the shortest closed path joining N cities. Since the SOFM has discrete input patterns (the cities of the TSP), one can examine its dynamics analytically. We show that, with a particular choice of the distance function for the net, the energy associated with the SOFM has its absolute minimum at the shortest TSP path. Numerical simulations confirm that this distance improves performance. It is curious that the distance function having this property combines the distances of the neuron and of the weight spaces. Solving difficult problems is a natural arena for a would-be new calculus paradigm like that of neural networks. One can delineate a sharper image of their potential with respect to the blurred image obtained in simpler problems. Here we tackle the Travelling Salesman Problem (TSP, see [Lawler 1985],
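A minimal ring-SOM for the TSP illustrates the general approach this abstract builds on: a closed ring of neurons is pulled toward the cities, and the tour is read off from the ring ordering. This sketch does not reproduce the paper's specific distance function combining neuron- and weight-space distances; the neuron count, schedules, and learning rate are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(3)

def som_tsp(cities, n_iters=3000):
    n = 3 * len(cities)                        # over-provision ring neurons
    ring = cities.mean(0) + 0.1 * rng.normal(size=(n, 2))
    idx = np.arange(n)
    for t in range(n_iters):
        x = cities[rng.integers(len(cities))]
        win = int(np.argmin(np.linalg.norm(ring - x, axis=1)))
        # Neighborhood distance is measured along the closed ring.
        d = np.minimum(np.abs(idx - win), n - np.abs(idx - win))
        sigma = max(n / 8 * (1 - t / n_iters), 1.0)   # shrinking width
        h = np.exp(-(d ** 2) / (2 * sigma ** 2))
        ring += 0.2 * h[:, None] * (x - ring)
    # Decode: order the cities by the ring position of their nearest neuron.
    near = [int(np.argmin(np.linalg.norm(ring - c, axis=1))) for c in cities]
    return np.argsort(near)

cities = rng.uniform(size=(20, 2))
tour = som_tsp(cities)   # a permutation of the city indices
```

Because the input patterns are the discrete city set, every quantity in the update (winner, ring distance, neighborhood) is available in closed form, which is what makes the analytical treatment in the paper possible.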
Problem Solving with Optimization Networks
1993
Abstract

Cited by 9 (0 self)
... previously seemed, since they can be successfully applied to only a limited number of problems exhibiting special, amenable properties.
Key words: combinatorial optimization, neural networks, mean field annealing, optimization networks.
I am greatly indebted to my supervisor, Richard Prager, for initially allowing me the freedom to explore various research areas, and subsequently providing invaluable support as my work progressed. Members of the Speech, Vision and Robotics Group at the Cambridge University Department of Engineering have provided a stimulating and friendly environment to work in: special thanks must go to Patrick Gosling and Tony Robinson for maintaining a superb computing service, and to Sree Aiyer for both setting me on the right course and for numerous helpful discussions since then. I would like to thank the Science and Engineering Research Council of Great Britain, the Cambridge University Department of Engineering and Queen