Results 1–10 of 14
Neural Optimization
The Handbook of Brain Research and Neural Networks, Bradford Books/The, 1998
Abstract

Cited by 31 (2 self)
Introduction Many combinatorial optimization problems require a more or less exhaustive search to achieve exact solutions, with the computational effort growing exponentially or worse with system size. Various kinds of heuristic methods are therefore often used to find reasonably good solutions. The artificial neural network (ANN) approach falls within this category. In contrast to most other methods, the ANN approach does not fully or partly explore the discrete state space. Rather, it "feels" its way in a fuzzy manner through an interpolating, continuous space towards good solutions, and allows for a probabilistic interpretation. Key elements in this approach are the mean-field (MF) approximation (Hopfield and Tank, 1985; Peterson and Söderberg, 1989), annealing, and for many problems the Potts formulation (Peterson and Söderberg, 1989). Recently, propagator methods have also proven most valuable for handling
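The mean-field/Potts machinery described above can be sketched as follows (a minimal illustration, not the papers' implementation; the toy graph, cooling schedule, and parameters are my own choices): annealed Potts mean-field updates for 2-coloring a 4-cycle.

```python
import numpy as np

def potts_meanfield_coloring(adj, K, T0=2.0, cooling=0.95, sweeps=200, seed=0):
    """Mean-field annealing with Potts neurons for graph K-coloring.

    v[i, a] is the mean-field probability that node i takes color a.
    The energy penalizes equal colors on adjacent nodes; each update is
    the standard Potts softmax equation, and the temperature T is
    slowly lowered (annealing) to sharpen the solution.
    """
    rng = np.random.default_rng(seed)
    n = adj.shape[0]
    v = rng.random((n, K))
    v /= v.sum(axis=1, keepdims=True)        # valid Potts start: rows sum to 1
    T = T0
    for _ in range(sweeps):
        for i in range(n):                   # asynchronous sweep over nodes
            h = -adj[i] @ v                  # local field: -dE/dv[i]
            e = np.exp((h - h.max()) / T)    # numerically stabilized softmax
            v[i] = e / e.sum()
        T *= cooling                         # annealing schedule
    return v.argmax(axis=1)                  # discretize only at the end

# toy 4-cycle: two colors suffice, alternating around the cycle
adj = np.array([[0, 1, 0, 1],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [1, 0, 1, 0]], dtype=float)
colors = potts_meanfield_coloring(adj, K=2)
```

Note how the interpolating variables v stay in a continuous space throughout; the discrete state space is never searched directly, matching the abstract's description.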
Algebraic Transformations of Objective Functions
Neural Networks, 1994
Abstract

Cited by 26 (11 self)
Many neural networks can be derived as optimization dynamics for suitable objective functions. We show that such networks can be designed by repeated transformations of one objective into another with the same fixpoints. We exhibit a collection of algebraic transformations which reduce network cost and increase the set of objective functions that are neurally implementable. The transformations include simplification of products of expressions, functions of one or two expressions, and sparse matrix products (all of which may be interpreted as Legendre transformations); also the minimum and maximum of a set of expressions. These transformations introduce new interneurons which force the network to seek a saddle point rather than a minimum. Other transformations allow control of the network dynamics, by reconciling the Lagrangian formalism with the need for fixpoints. We apply the transformations to simplify a number of structured neural networks, beginning with the standard reduction of...
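One concrete instance of such a Legendre transformation (my illustration in generic notation, not necessarily the paper's): a quadratic term in an objective can be traded for a linear coupling by introducing an auxiliary "interneuron" variable σ,

```latex
\frac{x^2}{2} \;=\; \max_{\sigma}\Bigl(\sigma x - \frac{\sigma^2}{2}\Bigr),
\qquad\text{so}\qquad
\min_x \Bigl[E_0(x) + \tfrac{x^2}{2}\Bigr]
\;=\; \min_x \max_{\sigma} \Bigl[E_0(x) + \sigma x - \tfrac{\sigma^2}{2}\Bigr].
```

The transformed network must then seek a saddle point (minimum over x, maximum over σ) rather than a plain minimum, which is exactly the behavior the abstract attributes to the new interneurons.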
Toward 3D Vision from Range Images: An Optimization Framework and Parallel Networks
Abstract

Cited by 15 (10 self)
We propose a unified approach to solving low-, intermediate- and high-level computer vision problems for 3D object recognition from range images. All three levels of computation are cast in an optimization framework and can be implemented on a neural-network-style architecture. In the low-level computation, the tasks are to estimate curvature images from the input range data. Subsequent processing at the intermediate level is concerned with segmenting these curvature images into coherent curvature sign maps. At the high level, image features are matched against model features based on an object description called an attributed relational graph (ARG). We show that the above computational tasks at each of the three different levels can all be formulated as optimizing a two-term energy function: the first term encodes unary constraints while the second encodes binary ones. These energy functions are minimized using parallel and distributed relaxation-based algorithms which are well suited for neural...
Improving Convergence and Solution Quality of Hopfield-Type Neural Networks with Augmented Lagrange Multipliers
IEEE Transactions on Neural Networks, 1996
Abstract

Cited by 13 (3 self)
Hopfield-type networks convert a combinatorial optimization problem into a constrained real optimization problem and solve the latter using the penalty method. There is a dilemma with such networks: when tuned to produce good-quality solutions, they can fail to converge to valid solutions; when tuned to converge, they tend to give low-quality solutions. This paper proposes a new method, called the Augmented Lagrange-Hopfield (ALH) method, to improve Hopfield-type neural networks in both convergence and solution quality in solving combinatorial optimization. It uses the augmented Lagrange method, which combines the Lagrange and penalty methods, to effectively resolve the dilemma. Experimental results on the TSP show the superiority of the ALH method over existing Hopfield-type neural networks in convergence and solution quality. For 10-city TSPs, ALH finds the known optimal tour with a 100% success rate over 1000 runs with different random initializations. For larger si...
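The augmented Lagrangian idea that underlies ALH, combining a multiplier term with a quadratic penalty, can be sketched on a toy continuous problem (a generic illustration of the method of multipliers, not the paper's ALH network; problem and step sizes are my own):

```python
import numpy as np

def augmented_lagrangian(x0, f_grad, g, g_grad, c=1.0, outer=20, inner=500, lr=0.05):
    """Augmented Lagrangian for min f(x) s.t. g(x) = 0 (generic sketch).

    Inner loop: gradient descent on
        L_c(x, lam) = f(x) + lam * g(x) + (c/2) * g(x)**2
    Outer loop: multiplier update lam <- lam + c * g(x).
    The penalty term stabilizes the plain Lagrangian; the multiplier
    term removes the need to drive the penalty weight c to infinity.
    """
    x, lam = np.asarray(x0, dtype=float), 0.0
    for _ in range(outer):
        for _ in range(inner):
            grad = f_grad(x) + (lam + c * g(x)) * g_grad(x)
            x = x - lr * grad                  # inner (descent) step
        lam += c * g(x)                        # Lagrange multiplier step
    return x, lam

# toy problem: min x^2 + y^2  s.t.  x + y = 1  (optimum (0.5, 0.5), lam = -1)
f_grad = lambda x: 2.0 * x
g      = lambda x: x[0] + x[1] - 1.0
g_grad = lambda x: np.array([1.0, 1.0])
x, lam = augmented_lagrangian([0.0, 0.0], f_grad, g, g_grad)
```

With the penalty method alone, c must grow without bound to enforce the constraint (the instability the abstract mentions); here a moderate fixed c suffices because lam converges to the true multiplier.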
A Lagrangian Reconstruction of a Class of Local Search Methods
In Proc. 10th Int'l Conf. on Artificial Intelligence Tools, IEEE Computer Society, 1998
Abstract

Cited by 10 (1 self)
Heuristic repair algorithms, a class of local search methods, demonstrate impressive efficiency in solving some large-scale and hard instances of constraint satisfaction problems (CSPs). In this paper, we draw a surprising connection between heuristic repair techniques and discrete Lagrange multiplier methods by transforming CSPs into zero-one constrained optimization problems. A Lagrangian-based search scheme, LSDL, is proposed. We show how GENET, a representative heuristic repair algorithm, can be reconstructed from LSDL. The dual viewpoint of GENET as a heuristic repair method and a Lagrange multiplier method allows us to investigate variants of GENET from both perspectives. Benchmarking results confirm that, first, our reconstructed GENET has the same fast convergence behavior as other GENET implementations reported in the literature, competing favourably with other state-of-the-art methods on a set of hard graph colouring problems. Second, our best variant, which combines technique...
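The discrete Lagrange multiplier view can be sketched on a tiny SAT instance (a toy in the spirit of Wah and Shang's discrete Lagrangian methods, not the paper's exact LSDL scheme; the clauses and update rules are my own illustration): greedy flips descend on a weighted violation count, and when the search is stuck, the multipliers of violated constraints are increased to reshape the landscape.

```python
def dlm_sat(clauses, n_vars, max_steps=1000):
    """Discrete Lagrangian-style local search for SAT (illustrative sketch).

    Minimizes L(x, lam) = sum_i lam[i] * unsat_i(x) over 0-1 assignments.
    A positive literal l means variable l is True; negative means False.
    """
    def unsat(clause, x):
        # clause is unsatisfied iff every literal evaluates to False
        return all(x[abs(l) - 1] != (l > 0) for l in clause)

    x = [False] * n_vars
    lam = [1] * len(clauses)
    L = lambda a: sum(w for w, c in zip(lam, clauses) if unsat(c, a))
    for _ in range(max_steps):
        if not any(unsat(c, x) for c in clauses):
            return x                           # satisfying assignment found
        best_var, best_val = None, L(x)
        for v in range(n_vars):                # try single-variable flips
            x[v] = not x[v]
            if L(x) < best_val:
                best_var, best_val = v, L(x)
            x[v] = not x[v]
        if best_var is not None:
            x[best_var] = not x[best_var]      # discrete descent step
        else:                                  # stuck: discrete multiplier update
            for i, c in enumerate(clauses):
                if unsat(c, x):
                    lam[i] += 1
    return None

# (x1 or x2) and (not x1 or x3) and (not x2 or not x3)
clauses = [(1, 2), (-1, 3), (-2, -3)]
sol = dlm_sat(clauses, 3)
```

The multiplier update plays the role that GENET's penalty-learning step plays in the heuristic repair reading, which is the duality the abstract highlights.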
Bayesian Image Restoration And Segmentation By Constrained Optimization
IEEE Transactions on Image Processing, 1996
Abstract

Cited by 9 (3 self)
A constrained optimization method, called the Lagrange-Hopfield (LH) method, is presented for solving Markov random field (MRF) based Bayesian image estimation problems for restoration and segmentation. The method combines the augmented Lagrangian multiplier technique with the Hopfield network to solve the constrained optimization problem into which the original Bayesian estimation problem is reformulated. The LH method effectively overcomes instabilities that are inherent in the penalty method (e.g. the Hopfield network) or the Lagrange multiplier method in constrained optimization. An additional advantage of the LH method is its suitability for neural-like analog implementation. Experimental results are presented which show that LH yields good-quality solutions at reasonable computational cost. 1. INTRODUCTION Image restoration aims to recover a degraded image, and segmentation partitions an image into regions of similar image properties. Both can be posed generally as image estimation...
Minimax and Hamiltonian dynamics of excitatory-inhibitory networks
Advances in Neural Information Processing Systems 10, 1998
Abstract

Cited by 8 (0 self)
A Lyapunov function for excitatory-inhibitory networks is constructed. The construction assumes symmetric interactions within excitatory and inhibitory populations of neurons, and antisymmetric interactions between populations. The Lyapunov function yields sufficient conditions for the global asymptotic stability of fixed points. If these conditions are violated, limit cycles may be stable. The relations of the Lyapunov function to optimization theory and classical mechanics are revealed by minimax and dissipative Hamiltonian forms of the network dynamics. The dynamics of a neural network with symmetric interactions provably converges to fixed points under very general assumptions [1, 2]. This mathematical result helped to establish the paradigm of neural computation with fixed-point attractors [3]. But in reality, interactions between neurons in the brain are asymmetric. Furthermore, the dynamical behaviors seen in the brain are not confined to fixed-point attractors, but also include oscillations and complex nonperiodic behavior. These other types
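A two-unit toy analogue of such minimax dynamics (my illustration, not the paper's network; parameters are arbitrary): one variable descends on a saddle function while the other ascends, with an antisymmetric coupling providing the Hamiltonian rotation and the symmetric self-terms providing dissipation.

```python
def descent_ascent(a=1.0, b=2.0, c=1.0, dt=0.01, steps=5000, u0=1.0, v0=1.0):
    """Euler integration of gradient descent-ascent on the saddle function
        L(u, v) = (a/2) u^2 + b u v - (c/2) v^2.
    u (excitatory analogue) descends on L; v (inhibitory analogue) ascends.
    The b-coupling is antisymmetric (a conservative, Hamiltonian rotation);
    the a and c self-terms are dissipative, so the saddle (0, 0) attracts.
    """
    u, v = u0, v0
    for _ in range(steps):
        du = -(a * u + b * v)   # u moves downhill: -dL/du
        dv =  (b * u - c * v)   # v moves uphill:  +dL/dv
        u, v = u + dt * du, v + dt * dv
    return u, v

u, v = descent_ascent()   # trajectory spirals into the saddle point
```

Removing the dissipative terms (a = c = 0) leaves pure rotation about the saddle, a simple picture of how violating the stability conditions can leave oscillations rather than fixed points.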
A Lagrangian reconstruction of GENET
2000
Abstract

Cited by 7 (3 self)
GENET is a heuristic repair algorithm which demonstrates impressive efficiency in solving some large-scale and hard instances of constraint satisfaction problems (CSPs). In this paper, we draw a surprising connection between GENET and discrete Lagrange multiplier methods. Based on the work of Wah and Shang, we propose a discrete Lagrangian-based search scheme, LSDL, defining a class of search algorithms for solving CSPs. We show how GENET can be reconstructed from LSDL. The dual viewpoint of GENET as a heuristic repair method and a discrete Lagrange multiplier method allows us to investigate variants of GENET from both perspectives. Benchmarking results confirm that, first, our reconstructed GENET has the same fast convergence behavior as the original GENET implementation, and has competitive performance with other local search solvers DLM, WalkSAT, and WSAT(OIP) on a set of difficult benchmark problems. Second, our improved variant, which combines techniques from heuristic repair an...
A neural network approach to routing without interference in multi-hop radio networks
IEEE Trans. on Communications Vol, 1994
Relaxation Labeling Using the Augmented Lagrange-Hopfield Method
 Pattern Recognition
, 1998
Abstract

Cited by 3 (0 self)
This paper presents a novel relaxation labeling (RL) method called the Augmented Lagrangian-Hopfield (ALH) method, based on augmented Lagrangian multipliers and the graded Hopfield neural network. In the ALH method, RL is formulated as a problem of constrained real optimization. The augmented Lagrange multiplier method [13,14] is used for optimization under the constraints, and the Hopfield method [15,16] for bridging the gap between discrete and continuous optimization. The ALH needs neither gradient projection nor other normalization operations in its update equations to maintain the labeling constraints. It is therefore more amenable to a neural network implementation than existing RL algorithms. Experiments show that the ALH produces good-quality solutions in terms of the optimized objective values within a reasonable number of iterations. A recent result shows that the ALH method significantly improves Hopfield-type networks in solving the traveling salesman problem [17]. The ALH has also been used for image restoration and segmentation [18]. The rest of the paper is organized as follows: Section 2 introduces the continuous RL method. Section 3 poses RL as a constrained optimization problem and presents the ALH method for solving it. Section 4 discusses constrained optimization methods in connection with RL. Section 5 gives a neural network structure for the ALH computation. Section 6 presents the experimental results.