Results 1 - 10 of 10
A Graduated Assignment Algorithm for Graph Matching
, 1996
Abstract

Cited by 283 (15 self)
A graduated assignment algorithm for graph matching is presented which is fast and accurate even in the presence of high noise. By combining graduated nonconvexity, two-way (assignment) constraints, and sparsity, large improvements in accuracy and speed are achieved. Its low-order computational complexity [O(lm), where l and m are the number of links in the two graphs] and robustness in the presence of noise offer advantages over traditional combinatorial approaches. The algorithm, not restricted to any special class of graph, is applied to subgraph isomorphism, weighted graph matching, and attributed relational graph matching. To illustrate the performance of the algorithm, attributed relational graphs derived from objects are matched. Then, results from twenty-five thousand experiments conducted on 100-node random graphs of varying types (graphs with only zero-one links, weighted graphs, and graphs with node attributes and multiple link types) are reported. No comparable results have...
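The loop this abstract outlines (a softassign step inside a slowly rising annealing schedule) can be sketched in a few lines. This is a minimal illustration rather than the paper's exact formulation; the compatibility gradient A @ M @ B, the schedule constants, and the iteration counts below are all assumptions:

```python
import numpy as np

def sinkhorn(M, iters=30):
    """Alternately normalize rows and columns so M approaches a
    doubly stochastic matrix (the two-way assignment constraints)."""
    for _ in range(iters):
        M = M / M.sum(axis=1, keepdims=True)
        M = M / M.sum(axis=0, keepdims=True)
    return M

def graduated_assignment(A, B, beta=0.5, beta_max=10.0, rate=1.075, inner=4):
    """Illustrative graduated assignment for weighted graph matching.
    A (n x n) and B (m x m) are symmetric, nonnegative adjacency matrices.
    All parameter defaults are assumed values for this sketch."""
    n, m = A.shape[0], B.shape[0]
    M = np.full((n, m), 1.0 / m)            # start from a uniform soft match
    while beta < beta_max:                  # graduated (annealing) schedule
        for _ in range(inner):
            Q = A @ M @ B                   # assumed compatibility gradient
            M = sinkhorn(np.exp(beta * Q))  # softassign step
        beta *= rate
    return M                                # near-permutation at high beta
```

As beta grows, the doubly stochastic match matrix M hardens toward a permutation, which is the "graduated" part of the method.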
New Algorithms for 2D and 3D Point Matching: Pose Estimation and Correspondence
Abstract

Cited by 85 (19 self)
A fundamental open problem in computer vision, determining pose and correspondence between two sets of points in space, is solved with a novel, fast [O(nm)], robust and easily implementable algorithm. The technique works on noisy 2D or 3D point sets that may be of unequal sizes and may differ by non-rigid transformations. Using a combination of optimization techniques such as deterministic annealing and the softassign, which have recently emerged out of the recurrent neural network/statistical physics framework, analog objective functions describing the problems are minimized. Over thirty thousand experiments, on randomly generated point sets with varying amounts of noise and missing and spurious points, and on handwritten character sets, demonstrate the robustness of the algorithm.

Keywords: point matching, pose estimation, correspondence, neural networks, optimization, softassign, deterministic annealing, affine.

1 Introduction

Matching the representations of two images has long...
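The alternation the abstract describes can be illustrated with a translation-only toy version: softassign produces soft correspondences, and a weighted least-squares update re-estimates the pose. The real method handles affine and non-rigid maps, outliers, and a deterministic-annealing schedule, none of which appears in this sketch; every parameter value here is an assumption:

```python
import numpy as np

def softassign(benefit, beta, iters=50):
    """Exponentiate, then Sinkhorn-balance rows and columns to obtain a
    (near) doubly stochastic correspondence matrix."""
    M = np.exp(beta * (benefit - benefit.max()))
    for _ in range(iters):
        M = M / M.sum(axis=1, keepdims=True)
        M = M / M.sum(axis=0, keepdims=True)
    return M

def match_translation(X, Y, beta=1.0, outer=10):
    """Toy alternation: soft correspondences via softassign, then a
    weighted least-squares translation update.  X, Y: (n, 2) arrays."""
    t = np.zeros(2)
    for _ in range(outer):
        # Squared distances between translated X points and Y points.
        D = ((X[:, None, :] + t - Y[None, :, :]) ** 2).sum(axis=2)
        M = softassign(-D, beta)            # small distance -> high match
        # Weighted average displacement re-estimates the translation.
        t = (M[:, :, None] * (Y[None, :, :] - X[:, None, :])).sum(axis=(0, 1)) / M.sum()
    return t, M
```

Replacing the translation update with a weighted affine fit, and annealing beta upward between outer iterations, moves this toy toward the kind of method the abstract describes.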
A Lagrangian Relaxation Network for Graph Matching
 IEEE Trans. Neural Networks
, 1996
Abstract

Cited by 26 (7 self)
A Lagrangian relaxation network for graph matching is presented. The problem is formulated as follows: given graphs G and g, find a permutation matrix M that brings the two sets of vertices into correspondence. Permutation matrix constraints are formulated in the framework of deterministic annealing. Our approach is in the same spirit as a Lagrangian decomposition approach in that the row and column constraints are satisfied separately, with a Lagrange multiplier used to equate the two "solutions." Due to the unavoidable symmetries in graph isomorphism (resulting in multiple global minima), we add a symmetry-breaking self-amplification term in order to obtain a permutation matrix. With the application of a fixed-point preserving algebraic transformation to both the distance measure and self-amplification terms, we obtain a Lagrangian relaxation network. The network performs minimization with respect to the Lagrange parameters and maximization with respect to the permutation matrix variable...
Bayesian inference on visual grammars by neural nets that optimize
 YALE COMPUTER SCIENCE DEPARTMENT
, 1991
Abstract

Cited by 15 (3 self)
We exhibit a systematic way to derive neural nets for vision problems. It involves formulating a vision problem as Bayesian inference or decision on a comprehensive model of the visual domain given by a probabilistic grammar. A key feature of this grammar is the way in which it eliminates model information, such as object labels, as it produces an image; correspondence problems and other noise removal tasks result. The neural nets that arise most directly are generalized assignment networks. Also there are transformations which naturally yield improved algorithms, such as correlation matching in scale space and the Frameville neural nets for high-level vision. Networks derived this way generally have objective functions with spurious local minima; such minima may commonly be avoided by dynamics that include deterministic annealing, for example recent improvements to Mean Field Theory dynamics. The grammatical method of neural net design allows domain knowledge to enter from all levels of the grammar, including "abstract" levels remote from the final image data...
Softmax to Softassign: Neural Network Algorithms for Combinatorial Optimization
 Journal of Artificial Neural Networks
, 1995
Abstract

Cited by 13 (3 self)
A new technique termed softassign is applied to three combinatorial optimization problems: weighted graph matching, the travelling salesman problem and graph partitioning. Softassign, which has emerged from the recurrent neural network/statistical physics framework, enforces two-way (assignment) constraints without the use of penalty terms in the energy functions. The softassign can also be generalised from two-way winner-take-all constraints to multiple membership constraints, which are required for graph partitioning. The softassign technique is compared to softmax (Potts glass) dynamics. Within the statistical physics framework, softmax with a penalty term has been a widely used method for enforcing the two-way constraints common to many combinatorial optimization problems. The benchmarks present evidence that softassign has clear advantages in accuracy, speed, parallelizability and algorithmic simplicity over softmax with a penalty term in optimization problems with two-way constraints.
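The contrast drawn here is easy to make concrete: a row-wise softmax satisfies only the one-way (row) constraint, while softassign's exponentiate-then-Sinkhorn step drives both row and column sums toward one with no penalty term. The following sketch uses assumed parameter values and an arbitrary benefit matrix:

```python
import numpy as np

def softmax_rows(benefit, beta):
    """Potts-glass style softmax: each row is normalized independently,
    so only the one-way (row) constraint is enforced."""
    E = np.exp(beta * (benefit - benefit.max(axis=1, keepdims=True)))
    return E / E.sum(axis=1, keepdims=True)

def softassign(benefit, beta, iters=50):
    """Softassign: exponentiate, then alternate row and column
    normalization (Sinkhorn balancing) until the matrix is near
    doubly stochastic, enforcing both two-way constraints."""
    M = np.exp(beta * (benefit - benefit.max()))
    for _ in range(iters):
        M = M / M.sum(axis=1, keepdims=True)
        M = M / M.sum(axis=0, keepdims=True)
    return M
```

For a square benefit matrix, the softassign output has row and column sums near one, whereas the softmax output generally violates the column constraint; that violation is exactly what the Potts approach must repair with an added penalty term.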
Convergence Properties of the Softassign Quadratic Assignment Algorithm
 Neural Computation
, 1999
Abstract

Cited by 5 (0 self)
The softassign quadratic assignment algorithm is a discrete-time, continuous-state, synchronous-updating optimizing neural network. While its effectiveness has been shown in the traveling salesman problem, graph matching and graph partitioning in thousands of simulations, there was no associated study of its convergence properties. Here, we construct discrete-time Lyapunov functions for the cases of exact and approximate doubly stochastic constraint satisfaction which can be used to show convergence to a fixed point. The combination of good convergence properties and experimental success makes the softassign algorithm the technique of choice for neural QAP optimization.

1 Introduction

Discrete-time optimizing neural networks are a well-honed topic in neural computation. Beginning with the discrete-state Hopfield model (Hopfield, 1982), considerable effort has been spent in analyzing the convergence properties of discrete-time networks, especially along the dimensions of continuous versus...
Self Annealing: Unifying deterministic annealing and relaxation labeling
 In Energy Minimization Methods in Computer Vision and Pattern Recognition (EMMCVPR '97)
, 1997
Abstract

Cited by 4 (1 self)
Deterministic annealing and relaxation labeling algorithms for classification and matching are presented and discussed. A new approach, self annealing, is introduced to bring deterministic annealing and relaxation labeling into accord. Self annealing results in an emergent linear schedule for winner-take-all and assignment problems. Also, the relaxation labeling algorithm can be seen as an approximation to the self annealing algorithm for matching and labeling problems.

1 Introduction

Labeling and matching problems abound in computer vision and pattern recognition (CVPR). It is not an exaggeration to state that some form or another of the basic problems of template matching or data clustering has remained central to the CVPR and neural network communities for about three decades. Due to the somewhat disparate natures of these communities, different frameworks for formulating and solving these two problems have emerged, and it is not immediately obvious how to go about reconciling...
Symbolic neural networks derived from stochastic grammar domain models
 Connectionist Symbolic Integration. Lawrence Erlbaum Associates
, 1995
Abstract

Cited by 4 (4 self)
Starting with a statistical domain model in the form of a stochastic grammar, one can derive neural network architectures with some of the expressive power of a semantic network and also some of the pattern recognition and learning capabilities of more conventional neural networks. For example, in this paper a version of the “Frameville” architecture, and in particular its objective function and constraints, is derived from a stochastic grammar schema. Possible optimization dynamics for this architecture, and relationships to other recent architectures such as Bayesian networks and variable-binding networks, are also discussed. This paper outlines a statistical approach to unifying certain symbolic and neural net architectures, by deriving them from a stochastic domain model with sufficient structure. The domain model is a stochastic L-system grammar, whose rules for generating objects and their parts each include a Boltzmann probability distribution. Using such a domain model in high-level vision, it is possible to formulate object recognition and visual learning problems as constrained optimization problems [16].
Self annealing and self annihilation: Unifying deterministic annealing and relaxation labeling
 In Pattern Recognition
, 2000
Abstract

Cited by 3 (1 self)
Deterministic annealing and relaxation labeling algorithms for classification and matching are presented and discussed. A new approach, self annealing, is introduced to bring deterministic annealing and relaxation labeling into accord. Self annealing results in an emergent linear schedule for winner-take-all and linear assignment problems. Self annihilation, a generalization of self annealing, is capable of performing the useful function of symmetry breaking. The original relaxation labeling algorithm is then shown to arise from an approximation to either the self annealing energy function or the corresponding dynamical system. With this relationship in place, self annihilation can be introduced into the relaxation labeling framework. Experimental results on synthetic matching and labeling problems clearly demonstrate the three-way relationship between deterministic annealing, relaxation labeling and self annealing. Keywords: Deterministic annealing, relaxation labeling, self anneal...
A Lagrange Multiplier and Hopfield-Type Barrier Function Method for the Traveling Salesman Problem
, 2001
Abstract

Cited by 3 (0 self)
A Lagrange multiplier and Hopfield-type barrier function method is proposed for approximating a solution...