Results

### Hopfield Network as Static Optimizer: Learning the Weights and Eliminating the Guesswork


This article presents a simulation study validating an adaptation methodology for learning the weights of a Hopfield neural network configured as a static optimizer. The quadratic Lyapunov function associated with the Hopfield network dynamics is leveraged to map the set of constraints associated with a static optimization problem. This approach leads to a set of constraint-specific penalty or weighting coefficients whose values need to be defined. The methodology uses a learning-based approach to set the values of these constraint weighting coefficients through adaptation. These values are in turn used to compute the network weights, effectively eliminating the guesswork in defining weight values for a given static optimization problem, which has been a long-standing challenge in artificial neural networks. The simulation study is performed using the traveling salesman problem from the domain of combinatorial optimization. Simulation results indicate that the adaptation procedure is able to guide the Hopfield network towards solutions of the problem starting from random values for the weights and constraint weighting coefficients. At the conclusion of the adaptation phase, the Hopfield network acquires weight values that readily position the network to search for local minimum solutions. The demonstrated successful application of the adaptation procedure eliminates the need to guess or predetermine the values of the Hopfield network's weights.
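The abstract does not spell out the energy function, but the standard way to map the traveling salesman problem onto a quadratic Lyapunov (energy) function is the Hopfield-Tank formulation, where each constraint contributes a penalty term scaled by its own weighting coefficient. The sketch below is illustrative: the coefficient names `A`-`D` stand in for the constraint weighting coefficients the article's adaptation procedure is meant to learn, and are not taken from the paper itself.

```python
import numpy as np

def tsp_energy(v, d, A=1.0, B=1.0, C=1.0, D=1.0):
    """Quadratic Lyapunov-style energy for the Hopfield-Tank TSP mapping.

    v : (n, n) array; v[x, i] ~ degree to which city x occupies tour position i
    d : (n, n) symmetric distance matrix
    A, B, C, D : constraint weighting (penalty) coefficients -- the quantities
        the adaptation procedure would tune (names are illustrative).
    """
    n = v.shape[0]
    # Row constraint: each city should occupy at most one tour position.
    row = sum(v[x, i] * v[x, j]
              for x in range(n) for i in range(n) for j in range(n) if i != j)
    # Column constraint: each tour position should hold at most one city.
    col = sum(v[x, i] * v[y, i]
              for i in range(n) for x in range(n) for y in range(n) if x != y)
    # Global constraint: exactly n entries should be active overall.
    glob = (v.sum() - n) ** 2
    # Tour-length term: distance between cities in adjacent tour positions.
    tour = sum(d[x, y] * v[x, i] * (v[y, (i + 1) % n] + v[y, (i - 1) % n])
               for x in range(n) for y in range(n) if x != y for i in range(n))
    return 0.5 * (A * row + B * col + C * glob + D * tour)
```

For a valid tour (a permutation matrix `v`) the three penalty terms vanish and the energy reduces to the tour length scaled by `D`, which is why learning good values for the coefficients matters: they determine whether the network's local minima coincide with feasible tours.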

### Adaptation in Weight Space Through Gradient Descent for Hopfield Net as Static Optimizer: Is It Feasible?


This paper reports the results of an empirical simulation study on adapting the weights of a Hopfield neural network, configured as a static optimizer and tested on the traveling salesman problem, through gradient descent. Adaptation through gradient descent, within the context of recurrent and non-recurrent back-propagation training, was attempted in the weight space, which is high-dimensional: on the order of 1,000,000,000,000 weights for the two-dimensional node array of a Hopfield network configured for a 1000-city problem instance. Despite substantial empirical work, practically no noteworthy progress was recorded towards realizing adaptation in the weight space: the adaptation algorithm failed to locate a set of weight values that established the solutions of the traveling salesman problem instances as local minima in the Lyapunov space. Accordingly, the findings in this paper suggest that alternative adaptation schemes with a small number of freely adjustable parameters should be considered.
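The weight count quoted in the abstract follows directly from the Hopfield-Tank encoding: an n-city instance uses an n-by-n array of neurons (one per city-position pair), and a fully connected network over those neurons has a weight for every ordered neuron pair. A quick arithmetic check (the function name is ours, chosen for illustration):

```python
def hopfield_weight_count(n_cities):
    """Number of pairwise weights in a fully connected Hopfield network
    whose nodes form an n x n city-by-position array (Hopfield-Tank TSP mapping)."""
    n_neurons = n_cities ** 2   # one neuron per (city, tour-position) pair
    return n_neurons ** 2       # full weight matrix over all neuron pairs

# A 1000-city instance yields 10^6 neurons and hence 10^12 weights,
# matching the order of magnitude quoted in the abstract.
print(hopfield_weight_count(1000))
```

This fourth-power growth in the number of cities is what makes direct gradient descent in the weight space impractical and motivates the paper's conclusion in favor of schemes with few free parameters.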