A Graduated Assignment Algorithm for Graph Matching
, 1996
Cited by 282 (15 self)
Abstract:
A graduated assignment algorithm for graph matching is presented which is fast and accurate even in the presence of high noise. By combining graduated nonconvexity, two-way (assignment) constraints, and sparsity, large improvements in accuracy and speed are achieved. Its low-order computational complexity [O(lm), where l and m are the number of links in the two graphs] and robustness in the presence of noise offer advantages over traditional combinatorial approaches. The algorithm, not restricted to any special class of graph, is applied to subgraph isomorphism, weighted graph matching, and attributed relational graph matching. To illustrate the performance of the algorithm, attributed relational graphs derived from objects are matched. Then, results from twenty-five thousand experiments conducted on 100-node random graphs of varying types (graphs with only zero-one links, weighted graphs, and graphs with node attributes and multiple link types) are reported. No comparable results have...
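The loop described in this abstract can be sketched in a few lines: anneal an inverse temperature, take a softmax step on a link-compatibility matrix, and enforce the two-way constraints by alternating row and column normalization. This is an illustrative toy version only; the variable names, schedule constants, and the simple compatibility function Q = A M B are assumptions, not the paper's exact formulation:

```python
import numpy as np

def graduated_assignment(A, B, beta=0.5, beta_max=10.0, rate=1.075,
                         relax_iters=4, sinkhorn_iters=30):
    """Toy graduated-assignment sketch for adjacency matrices A (n x n)
    and B (m x m).  Returns a near-doubly-stochastic match matrix M."""
    n, m = A.shape[0], B.shape[0]
    M = np.full((n, m), 1.0 / m)               # uniform soft start
    while beta < beta_max:                     # annealing: slowly raise beta
        for _ in range(relax_iters):
            Q = A @ M @ B                      # link-compatibility "benefit"
            M = np.exp(beta * (Q - Q.max()))   # softmax step (shift avoids overflow)
            for _ in range(sinkhorn_iters):    # enforce two-way constraints
                M /= M.sum(axis=1, keepdims=True)   # rows sum to 1
                M /= M.sum(axis=0, keepdims=True)   # columns sum to 1
        beta *= rate
    return M
```

As beta grows the doubly stochastic M sharpens toward a hard correspondence, which is what lets the method avoid an explicit combinatorial search.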
Shock Graphs and Shape Matching
, 1998
Cited by 203 (32 self)
Abstract:
We have been developing a theory for the generic representation of 2D shape, where structural descriptions are derived from the shocks (singularities) of a curve evolution process, acting on bounding contours. We now apply the theory to the problem of shape matching. The shocks are organized into a directed, acyclic shock graph, and complexity is managed by attending to the most significant (central) shape components first. The space of all such graphs is highly structured and can be characterized by the rules of a shock graph grammar. The grammar permits a reduction of a shock graph to a unique rooted shock tree. We introduce a novel tree matching algorithm which finds the best set of corresponding nodes between two shock trees in polynomial time. Using a diverse database of shapes, we demonstrate our system's performance under articulation, occlusion, and changes in viewpoint. Keywords: shape representation; shape matching; shock graph; shock graph grammar; subgraph isomorphism.
Replicator Equations, Maximal Cliques, and Graph Isomorphism
, 1999
Cited by 52 (11 self)
Abstract:
We present a new energy-minimization framework for the graph isomorphism problem that is based on an equivalent maximum clique formulation. The approach is centered around a fundamental result proved by Motzkin and Straus in the mid-1960s, and recently expanded in various ways, which allows us to formulate the maximum clique problem in terms of a standard quadratic program. The attractive feature of this formulation is that a clear one-to-one correspondence exists between the solutions of the quadratic program and those in the original, combinatorial problem. To solve the program we use the so-called replicator equations, a class of straightforward continuous- and discrete-time dynamical systems developed in various branches of theoretical biology. We show how, despite their inherent inability to escape from local solutions, they nevertheless provide experimental results that are competitive with those obtained using more elaborate mean-field annealing heuristics.
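The discrete-time replicator update on the Motzkin-Straus program is simple enough to sketch. The following is an illustrative version (the example graph, starting point, and iteration count are assumptions for the demonstration, not the paper's experimental setup):

```python
import numpy as np

def replicator(A, x, iters=2000):
    """Discrete-time replicator dynamics x_i <- x_i (Ax)_i / (x'Ax),
    applied to the Motzkin-Straus quadratic program max x'Ax over the
    simplex, where A is the graph's 0/1 adjacency matrix.  The support
    of the limit point corresponds to a maximal clique."""
    for _ in range(iters):
        Ax = A @ x
        fitness = x @ Ax              # current objective value x'Ax
        if fitness == 0:
            break
        x = x * Ax / fitness          # multiplicative, simplex-preserving update
    return x
```

For a clique of size k, the limit value of x'Ax is 1 - 1/k (the Motzkin-Straus bound), so the attained objective also estimates the clique number.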
View-Based Object Recognition Using Saliency Maps
, 1998
Cited by 48 (8 self)
Abstract:
We introduce a novel view-based object representation, called the saliency map graph (SMG), which captures the salient regions of an object view at multiple scales using a wavelet transform. This compact representation is highly invariant to translation, rotation (image and depth), and scaling, and offers the locality of representation required for occluded object recognition. To compare two saliency map graphs, we introduce two graph similarity algorithms. The first computes the topological similarity between two SMGs, providing a coarse-level matching of two graphs. The second computes the geometrical similarity between two SMGs, providing a fine-level matching of two graphs. We test and compare these two algorithms on a large database of model object views.
A Novel Optimizing Network Architecture with Applications
 Neural Computation
, 1996
Cited by 35 (16 self)
Abstract:
We present a novel optimizing network architecture with applications in vision, learning, pattern recognition and combinatorial optimization. This architecture is constructed by combining the following techniques: (i) deterministic annealing, (ii) self-amplification, (iii) algebraic transformations, (iv) clocked objectives and (v) softassign. Deterministic annealing in conjunction with self-amplification avoids poor local minima and ensures that a vertex of the hypercube is reached. Algebraic transformations and clocked objectives help partition the relaxation into distinct phases. The problems considered have doubly stochastic matrix constraints or minor variations thereof. We introduce a new technique, softassign, which is used to satisfy this constraint. Experimental results on different problems are presented and discussed.
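The softassign step itself reduces to exponentiation followed by alternating row and column normalization (the Sinkhorn procedure). A minimal sketch, with the benefit matrix Q and the temperature schedule left abstract:

```python
import numpy as np

def softassign(Q, beta, iters=50):
    """Map a benefit matrix Q to a (near) doubly stochastic matrix at
    inverse temperature beta.  As beta grows, the result approaches the
    permutation matrix maximizing sum_i Q[i, perm(i)]."""
    M = np.exp(beta * (Q - Q.max()))        # positive entries; shift for stability
    for _ in range(iters):
        M /= M.sum(axis=1, keepdims=True)   # row constraint
        M /= M.sum(axis=0, keepdims=True)   # column constraint
    return M
```

Because every entry stays strictly positive, the alternating normalizations converge to a doubly stochastic matrix, which is how the two-way constraints are satisfied without Lagrange multipliers for each row and column.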
A Lagrangian Relaxation Network for Graph Matching
 IEEE Trans. Neural Networks
, 1996
Cited by 26 (7 self)
Abstract:
A Lagrangian relaxation network for graph matching is presented. The problem is formulated as follows: given graphs G and g, find a permutation matrix M that brings the two sets of vertices into correspondence. Permutation matrix constraints are formulated in the framework of deterministic annealing. Our approach is in the same spirit as a Lagrangian decomposition approach in that the row and column constraints are satisfied separately, with a Lagrange multiplier used to equate the two "solutions." Due to the unavoidable symmetries in graph isomorphism (resulting in multiple global minima), we add a symmetry-breaking self-amplification term in order to obtain a permutation matrix. With the application of a fixpoint-preserving algebraic transformation to both the distance measure and self-amplification terms, we obtain a Lagrangian relaxation network. The network performs minimization with respect to the Lagrange parameters and maximization with respect to the permutation matrix variable...
Algebraic Transformations of Objective Functions
 Neural Networks
, 1994
Cited by 26 (11 self)
Abstract:
Many neural networks can be derived as optimization dynamics for suitable objective functions. We show that such networks can be designed by repeated transformations of one objective into another with the same fixpoints. We exhibit a collection of algebraic transformations which reduce network cost and increase the set of objective functions that are neurally implementable. The transformations include simplification of products of expressions, functions of one or two expressions, and sparse matrix products (all of which may be interpreted as Legendre transformations); also the minimum and maximum of a set of expressions. These transformations introduce new interneurons which force the network to seek a saddle point rather than a minimum. Other transformations allow control of the network dynamics, by reconciling the Lagrangian formalism with the need for fixpoints. We apply the transformations to simplify a number of structured neural networks, beginning with the standard reduction of...
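The simplest example of such a fixpoint-preserving move is the Legendre transformation of a quadratic term; the following is standard textbook material rather than the paper's own notation:

```latex
% Legendre representation of a quadratic cost:
\frac{1}{2}x^{2} \;=\; \max_{y}\left(xy - \frac{1}{2}y^{2}\right)
```

Replacing E(x) = (1/2)x^2 by the bilinear form E'(x, y) = xy - (1/2)y^2 introduces a new interneuron variable y and turns a pure minimization into a saddle-point problem; setting the y-derivative x - y to zero recovers y = x and the original objective, so the fixpoints coincide while the network's update in each variable becomes simpler.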
Connectionist Inference Systems
, 1991
Cited by 24 (6 self)
Abstract:
This paper presents a survey of connectionist inference systems.
Unification as constraint satisfaction in structured connectionist networks
 Neural Computation
, 1989
Cited by 12 (1 self)
Abstract:
Unification is a basic concept in several traditional symbolic formalisms that should be well-suited for a connectionist implementation due to the intuitive nature of the notions it formalizes. It is shown that by approaching unification from a graph matching and constraint satisfaction perspective a natural and efficient realization in a structured connectionist network can be found.
A Comparative Study of Three Paradigms for Object Recognition: Bayesian Statistics, Neural Networks and Expert Systems.
 In Image Understanding: A Festschrift for Azriel Rosenfeld
, 1996
Cited by 8 (2 self)
Abstract:
Object recognition, which involves the classification of objects into one of many a priori known object types, and determining object characteristics such as pose, is a difficult problem. A wide range of approaches have been proposed and applied to this problem with limited success. This paper presents a brief comparative study of methods from three different paradigms for object recognition: Bayesian, Neural Network and Expert Systems.
1 Introduction
Recognizing 3-dimensional (3D) objects from 2-dimensional (2D) images is an important part of computer vision [1]. The success of most computer vision applications (robotics, automatic target recognition, surveillance, etc.) is closely tied with the reliability of the recognition of 3D objects or surfaces. The study of object recognition and the development of experimental object recognition systems has a great impact on the direction and content of research pursued by the computer vision community. Thus, it is not surprising that a plet...