Results 1–10 of 23
A Graduated Assignment Algorithm for Graph Matching, 1996
Abstract
Cited by 285 (15 self)
A graduated assignment algorithm for graph matching is presented which is fast and accurate even in the presence of high noise. By combining graduated nonconvexity, two-way (assignment) constraints, and sparsity, large improvements in accuracy and speed are achieved. Its low-order computational complexity [O(lm), where l and m are the number of links in the two graphs] and robustness in the presence of noise offer advantages over traditional combinatorial approaches. The algorithm, not restricted to any special class of graph, is applied to subgraph isomorphism, weighted graph matching, and attributed relational graph matching. To illustrate the performance of the algorithm, attributed relational graphs derived from objects are matched. Then, results from twenty-five thousand experiments conducted on 100-node random graphs of varying types (graphs with only zero-one links, weighted graphs, and graphs with node attributes and multiple link types) are reported. No comparable results have...
Mean Field Theory for Sigmoid Belief Networks
Journal of Artificial Intelligence Research, 1996
Abstract
Cited by 116 (12 self)
We develop a mean field theory for sigmoid belief networks based on ideas from statistical mechanics.
New Algorithms for 2D and 3D Point Matching: Pose Estimation and Correspondence
Abstract
Cited by 85 (19 self)
A fundamental open problem in computer vision, determining pose and correspondence between two sets of points in space, is solved with a novel, fast [O(nm)], robust and easily implementable algorithm. The technique works on noisy 2D or 3D point sets that may be of unequal sizes and may differ by non-rigid transformations. Using a combination of optimization techniques such as deterministic annealing and the softassign, which have recently emerged out of the recurrent neural network/statistical physics framework, analog objective functions describing the problems are minimized. Over thirty thousand experiments, on randomly generated point sets with varying amounts of noise and missing and spurious points, and on handwritten character sets, demonstrate the robustness of the algorithm. Keywords: point matching, pose estimation, correspondence, neural networks, optimization, softassign, deterministic annealing, affine.
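The alternation the abstract describes can be illustrated in miniature: softassign correspondences under an annealing schedule, interleaved with a least-squares pose update. The sketch below is our own simplification restricted to a pure translation (the paper's method also recovers rotation and affine parameters); all names and constants are hypothetical.

```python
import numpy as np

def estimate_translation(X, Y):
    # Alternate (i) softassign correspondences under deterministic
    # annealing with (ii) a weighted least-squares pose update.
    # Translation-only for brevity (our simplification, not the
    # paper's full pose model).
    t = np.zeros(X.shape[1])
    for beta in (0.1, 1.0, 10.0):            # annealing schedule on beta
        for _ in range(5):
            # Pairwise squared distances between transformed X and Y.
            D = (((X + t)[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
            M = np.exp(-beta * D)
            for _ in range(20):              # Sinkhorn row/col normalization
                M /= M.sum(axis=1, keepdims=True)
                M /= M.sum(axis=0, keepdims=True)
            # Least-squares translation under the soft matches M.
            t = (M[:, :, None] * (Y[None, :, :] - X[:, None, :])).sum(axis=(0, 1)) / M.sum()
    return t, M

rng = np.random.default_rng(1)
X = rng.normal(size=(5, 2))
Y = X + np.array([2.0, -1.0])                # Y is X shifted by a known translation
t, M = estimate_translation(X, Y)
```

At low `beta` the match matrix is nearly uniform and the pose update is driven by coarse structure; as `beta` grows, `M` sharpens toward a hard correspondence and the pose estimate is refined.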
The Concave-Convex Procedure (CCCP), 2003
Abstract
Cited by 46 (5 self)
The Concave-Convex Procedure (CCCP) is a way to construct discrete-time iterative dynamical systems which are guaranteed to monotonically decrease global optimization/energy functions. This procedure can be applied to almost any optimization problem, and many existing algorithms can be interpreted in terms of it. In particular, we prove that all EM algorithms, and classes of Legendre minimization and variational bounding algorithms, can be re-expressed in terms of CCCP. We show that many existing neural network and mean field theory algorithms are also examples of CCCP. The Generalized Iterative Scaling (GIS) algorithm and Sinkhorn's algorithm can also be expressed as CCCP by changing variables. CCCP can be used both as a new way to understand, and prove convergence of, existing optimization algorithms and as a procedure for generating new algorithms.
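For intuition, here is a toy instance of our own (not an example from the paper): E(x) = x^4 - 2x^2 splits into a convex part x^4 and a concave part -2x^2, and the CCCP update equates the gradient of the convex part at the new point with minus the gradient of the concave part at the old point.

```python
import numpy as np

# Toy CCCP instance: E(x) = x**4 - 2*x**2 = E_vex(x) + E_cave(x),
# with E_vex(x) = x**4 (convex) and E_cave(x) = -2*x**2 (concave).

def cccp_step(x):
    # CCCP update: solve grad E_vex(x_new) = -grad E_cave(x_old),
    # i.e. 4*x_new**3 = 4*x_old, so x_new = cbrt(x_old).
    return np.cbrt(x)

E = lambda x: x**4 - 2 * x**2

x = 3.0
energies = [E(x)]
for _ in range(30):
    x = cccp_step(x)
    energies.append(E(x))
# The energy sequence is non-increasing and x approaches the
# local minimizer x = 1, where E(1) = -1.
```

The monotone decrease of the energy sequence is exactly the guarantee the procedure provides, independent of step sizes or line searches.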
A Robust Point Matching Algorithm for Autoradiograph Alignment, 1997
Abstract
Cited by 38 (12 self)
We present a novel method for the geometric alignment of autoradiographs of the brain. The method is based on finding the spatial mapping and the one-to-one correspondences (or homologies) between point features extracted from the images while rejecting non-homologies as outliers. In this way, we attempt to account for the local natural and artifactual differences between the autoradiograph slices. We have executed the resulting automated algorithm on a set of left prefrontal cortex autoradiograph slices, specifically demonstrated its ability to perform point outlier rejection, validated it using synthetically generated spatial mappings, and provided a visual comparison against the well-known iterated closest point (ICP) algorithm. A visualization of a stack of aligned left prefrontal cortex autoradiograph slices is also provided.
A Novel Optimizing Network Architecture with Applications
Neural Computation, 1996
Abstract
Cited by 35 (16 self)
We present a novel optimizing network architecture with applications in vision, learning, pattern recognition and combinatorial optimization. This architecture is constructed by combining the following techniques: (i) deterministic annealing, (ii) self-amplification, (iii) algebraic transformations, (iv) clocked objectives and (v) softassign. Deterministic annealing in conjunction with self-amplification avoids poor local minima and ensures that a vertex of the hypercube is reached. Algebraic transformations and clocked objectives help partition the relaxation into distinct phases. The problems considered have doubly stochastic matrix constraints or minor variations thereof. We introduce a new technique, softassign, which is used to satisfy this constraint. Experimental results on different problems are presented and discussed.
Rigid Point Feature Registration Using Mutual Information, 1999
Abstract
Cited by 27 (2 self)
We have developed a new mutual information-based registration method for matching unlabeled point features. In contrast to earlier mutual information-based registration methods, which estimate the mutual information using image intensity information, our approach uses the point feature location information. A novel aspect of our approach is the emergence of correspondence (between the two sets of features) as a natural by-product of joint density estimation. We have applied this algorithm to the problem of geometric alignment of primate autoradiographs. We also present preliminary results on 3D robust matching of sulci derived from anatomical MR. Finally, we present an experimental comparison between the mutual information approach and other recent approaches which explicitly parameterize feature correspondence. Keywords: point feature registration, rigid alignment, mutual information, similarity transformation, spatial mapping, correspondence, joint probability, softassign
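The flavor of "correspondence as a by-product of joint density estimation" can be sketched as follows. This is a toy Gaussian-kernel construction of ours, not the paper's exact estimator: build a joint probability table over feature pairs from point distances, compute its mutual information, and read soft correspondences off the joint matrix itself.

```python
import numpy as np

def point_mi(X, Y, sigma=1.0):
    # Build a joint probability over feature pairs (i, j) from a
    # Gaussian kernel on point distances (a toy construction, not
    # the paper's exact density estimator).
    D = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    P = np.exp(-D / (2.0 * sigma**2))
    P /= P.sum()                              # joint probability table
    pi, pj = P.sum(axis=1), P.sum(axis=0)     # marginals
    mi = (P * np.log(P / (pi[:, None] * pj[None, :]))).sum()
    return mi, P

rng = np.random.default_rng(2)
X = rng.normal(size=(10, 2))
mi, P = point_mi(X, X, sigma=0.5)
# With the two sets aligned, P concentrates on the diagonal, so the
# row-wise argmax of P reads out the (identity) correspondence.
```

Registration would then search over rigid transformations of one set to maximize `mi`; here we only show the density-to-correspondence step.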
A Lagrangian Relaxation Network for Graph Matching
IEEE Trans. Neural Networks, 1996
Abstract
Cited by 26 (7 self)
A Lagrangian relaxation network for graph matching is presented. The problem is formulated as follows: given graphs G and g, find a permutation matrix M that brings the two sets of vertices into correspondence. Permutation matrix constraints are formulated in the framework of deterministic annealing. Our approach is in the same spirit as a Lagrangian decomposition approach in that the row and column constraints are satisfied separately, with a Lagrange multiplier used to equate the two "solutions." Due to the unavoidable symmetries in graph isomorphism (resulting in multiple global minima), we add a symmetry-breaking self-amplification term in order to obtain a permutation matrix. With the application of a fixed-point preserving algebraic transformation to both the distance measure and self-amplification terms, we obtain a Lagrangian relaxation network. The network performs minimization with respect to the Lagrange parameters and maximization with respect to the permutation matrix variable...
Learning With Preknowledge: Clustering With Point and Graph Matching Distance Measures
Neural Computation, 1996
Abstract
Cited by 26 (9 self)
Prior knowledge constraints are imposed upon a learning problem in the form of distance measures. Prototypical 2D point sets and graphs are learned by clustering with point matching and graph matching distance measures. The point matching distance measure is approximately invariant under affine transformations (translation, rotation, scale, and shear) and under permutations. It operates between noisy images with missing and spurious points. The graph matching distance measure operates on weighted graphs and is invariant under permutations. Learning is formulated as an optimization problem. Large objectives so formulated (~ a million variables) are efficiently minimized using a combination of optimization techniques: softassign, algebraic transformations, clocked objectives, and deterministic annealing.
Bayesian inference on visual grammars by neural nets that optimize
Yale Computer Science Department, 1991
Abstract
Cited by 15 (3 self)
We exhibit a systematic way to derive neural nets for vision problems. It involves formulating a vision problem as Bayesian inference or decision on a comprehensive model of the visual domain given by a probabilistic grammar. A key feature of this grammar is the way in which it eliminates model information, such as object labels, as it produces an image; correspondence problems and other noise removal tasks result. The neural nets that arise most directly are generalized assignment networks. There are also transformations which naturally yield improved algorithms, such as correlation matching in scale space and the Frameville neural nets for high-level vision. Networks derived this way generally have objective functions with spurious local minima; such minima may commonly be avoided by dynamics that include deterministic annealing, for example recent improvements to Mean Field Theory dynamics. The grammatical method of neural net design allows domain knowledge to enter from all levels of the grammar, including "abstract" levels remote from the final image data, and...