Results 1–10 of 132
A fast and high quality multilevel scheme for partitioning irregular graphs
 SIAM JOURNAL ON SCIENTIFIC COMPUTING
, 1998
Abstract

Cited by 797 (12 self)
Recently, a number of researchers have investigated a class of graph partitioning algorithms that reduce the size of the graph by collapsing vertices and edges, partition the smaller graph, and then uncoarsen it to construct a partition for the original graph [Bui and Jones, Proc.
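As a rough illustration of the coarsening step this abstract describes, the sketch below collapses matched pairs of vertices into coarse vertices. The random matching is a simplified stand-in for the heavy-edge heuristics studied in this literature; all names are illustrative:

```python
import random

def coarsen(edges, n, seed=0):
    """One illustrative coarsening level: collapse a matching of vertices.
    A random matching stands in for heavy-edge matching; unmatchable
    vertices are self-matched and carried down unchanged."""
    rng = random.Random(seed)
    adj = {v: set() for v in range(n)}
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    match = {}
    order = list(range(n))
    rng.shuffle(order)
    for v in order:
        if v in match:
            continue
        free = [u for u in adj[v] if u not in match]
        u = rng.choice(free) if free else v   # self-match if no free neighbor
        match[v] = u
        match[u] = v
    # assign one coarse vertex id per matched pair
    cmap, next_id = {}, 0
    for v in range(n):
        if v not in cmap:
            cmap[v] = cmap[match[v]] = next_id
            next_id += 1
    # keep only edges that cross coarse vertices
    coarse = {(min(cmap[u], cmap[v]), max(cmap[u], cmap[v]))
              for u, v in edges if cmap[u] != cmap[v]}
    return sorted(coarse), next_id, cmap

# a 4-vertex path collapses to 2 or 3 coarse vertices
coarse_edges, n_coarse, cmap = coarsen([(0, 1), (1, 2), (2, 3)], 4)
```

Applying this step repeatedly yields the sequence of smaller graphs on which the initial partition is computed before uncoarsening and refinement.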
A New Method for Solving Hard Satisfiability Problems
 AAAI
, 1992
Abstract

Cited by 683 (21 self)
We introduce a greedy local search procedure called GSAT for solving propositional satisfiability problems. Our experiments show that this procedure can be used to solve hard, randomly generated problems that are an order of magnitude larger than those that can be handled by more traditional approaches such as the Davis–Putnam procedure or resolution. We also show that GSAT can solve structured satisfiability problems quickly. In particular, we solve encodings of graph coloring problems, N-queens, and Boolean induction. General application strategies and limitations of the approach are also discussed. GSAT is best viewed as a model-finding procedure. Its good performance suggests that it may be advantageous to reformulate reasoning tasks that have traditionally been viewed as theorem-proving problems as model-finding tasks.
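A minimal sketch of the greedy local search loop this abstract describes; the restart/flip limits and tie-breaking are illustrative choices, not the paper's exact procedure:

```python
import random

def gsat(clauses, n_vars, max_flips=1000, max_tries=10, seed=0):
    """Illustrative GSAT-style greedy local search. Each clause is a list
    of signed variable indices (e.g. -2 means "x2 negated")."""
    rng = random.Random(seed)

    def n_sat(assign):
        return sum(any(assign[abs(l)] == (l > 0) for l in c) for c in clauses)

    for _ in range(max_tries):
        # random restart
        assign = {v: rng.random() < 0.5 for v in range(1, n_vars + 1)}
        for _ in range(max_flips):
            if n_sat(assign) == len(clauses):
                return assign
            # greedily flip the variable yielding the most satisfied clauses
            def gain(v):
                assign[v] = not assign[v]
                s = n_sat(assign)
                assign[v] = not assign[v]
                return s
            best = max(range(1, n_vars + 1), key=gain)
            assign[best] = not assign[best]
    return None

# (x1 or x2) and (not x1 or x2) and (x1 or not x2): only x1 = x2 = True works
model = gsat([[1, 2], [-1, 2], [1, -2]], n_vars=2)
```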
A Graduated Assignment Algorithm for Graph Matching
, 1996
Abstract

Cited by 285 (15 self)
A graduated assignment algorithm for graph matching is presented which is fast and accurate even in the presence of high noise. By combining graduated nonconvexity, two-way (assignment) constraints, and sparsity, large improvements in accuracy and speed are achieved. Its low-order computational complexity [O(lm), where l and m are the number of links in the two graphs] and robustness in the presence of noise offer advantages over traditional combinatorial approaches. The algorithm, not restricted to any special class of graph, is applied to subgraph isomorphism, weighted graph matching, and attributed relational graph matching. To illustrate the performance of the algorithm, attributed relational graphs derived from objects are matched. Then, results from twenty-five thousand experiments conducted on 100-node random graphs of varying types (graphs with only zero-one links, weighted graphs, and graphs with node attributes and multiple link types) are reported. No comparable results have...
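The two-way assignment constraints mentioned here are typically enforced with softassign-style alternating normalization. The following toy sketch (parameter values and names are illustrative, not the paper's implementation) shows that idea on a small benefit matrix:

```python
import math

def softassign(benefit, beta=5.0, iters=50):
    """Illustrative softassign step: exponentiate a square benefit matrix,
    then alternately normalize rows and columns (Sinkhorn iterations) so
    the result approaches a doubly stochastic "soft" match matrix."""
    n = len(benefit)
    m = [[math.exp(beta * b) for b in row] for row in benefit]
    for _ in range(iters):
        m = [[x / sum(row) for x in row] for row in m]                # rows sum to 1
        col = [sum(m[i][j] for i in range(n)) for j in range(n)]
        m = [[m[i][j] / col[j] for j in range(n)] for i in range(n)]  # columns sum to 1
    return m

# a benefit matrix that clearly prefers the identity matching
m = softassign([[1.0, 0.0], [0.0, 1.0]])
```

In the full graduated assignment scheme, beta is increased gradually (the "graduated" annealing), so the soft match matrix hardens toward a permutation.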
Local Search Strategies for Satisfiability Testing
 DIMACS SERIES IN DISCRETE MATHEMATICS AND THEORETICAL COMPUTER SCIENCE
, 1995
Abstract

Cited by 270 (25 self)
It has recently been shown that local search is surprisingly good at finding satisfying assignments for certain classes of CNF formulas [24]. In this paper we demonstrate that the power of local search for satisfiability testing can be further enhanced by employing a new strategy, called "mixed random walk", for escaping from local minima. We present experimental results showing how this strategy allows us to handle formulas that are substantially larger than those that can be solved with basic local search. We also present a detailed comparison of our random walk strategy with simulated annealing. Our results show that mixed random walk is the superior strategy on several classes of computationally difficult problem instances. Finally, we present results demonstrating the effectiveness of local search with walk for solving circuit synthesis and diagnosis problems.
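A hedged sketch of the mixed random walk idea: with some probability take a walk move, flipping a random variable from an unsatisfied clause, otherwise take a greedy move. The noise parameter and limits below are illustrative:

```python
import random

def walk_sat(clauses, n_vars, p=0.5, max_flips=2000, seed=1):
    """Illustrative local search with a mixed random walk escape move."""
    rng = random.Random(seed)
    assign = {v: rng.random() < 0.5 for v in range(1, n_vars + 1)}

    def sat(clause):
        return any(assign[abs(l)] == (l > 0) for l in clause)

    for _ in range(max_flips):
        unsat = [c for c in clauses if not sat(c)]
        if not unsat:
            return assign
        if rng.random() < p:
            # walk move: random variable from a random unsatisfied clause
            v = abs(rng.choice(rng.choice(unsat)))
        else:
            # greedy move: the variable whose flip satisfies the most clauses
            def gain(v):
                assign[v] = not assign[v]
                s = sum(sat(c) for c in clauses)
                assign[v] = not assign[v]
                return s
            v = max(range(1, n_vars + 1), key=gain)
        assign[v] = not assign[v]
    return None

model = walk_sat([[1, 2], [-1, 2], [1, -2]], n_vars=2)
```

The walk move is what lets the search leave a local minimum that pure greedy flipping cannot escape.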
Multilevel hypergraph partitioning: Application in VLSI domain
 IEEE TRANS. VERY LARGE SCALE INTEGRATION (VLSI) SYSTEMS
, 1999
Abstract

Cited by 241 (21 self)
In this paper, we present a new hypergraph-partitioning algorithm that is based on the multilevel paradigm. In the multilevel paradigm, a sequence of successively coarser hypergraphs is constructed. A bisection of the smallest hypergraph is computed and it is used to obtain a bisection of the original hypergraph by successively projecting and refining the bisection to the next level finer hypergraph. We have developed new hypergraph coarsening strategies within the multilevel framework. We evaluate their performance both in terms of the size of the hyperedge cut on the bisection, as well as on the run time for a number of very large scale integration circuits. Our experiments show that our multilevel hypergraph-partitioning algorithm produces high-quality partitionings in a relatively small amount of time. The quality of the partitionings produced by our scheme is on average 6%–23% better than those produced by other state-of-the-art schemes. Furthermore, our partitioning algorithm is significantly faster, often requiring 4–10 times less time than that required by the other schemes. Our multilevel hypergraph-partitioning algorithm scales very well for large hypergraphs. Hypergraphs with over 100 000 vertices can be bisected in a few minutes on today's workstations. Also, on the large hypergraphs, our scheme outperforms other schemes (in hyperedge cut) quite consistently with larger margins (9%–30%).
A New Point Matching Algorithm for Non-Rigid Registration
, 2002
Abstract

Cited by 235 (2 self)
Feature-based methods for non-rigid registration frequently encounter the correspondence problem. Regardless of whether points, lines, curves or surface parameterizations are used, feature-based non-rigid matching requires us to automatically solve for correspondences between two sets of features. In addition, there could be many features in either set that have no counterparts in the other. This outlier rejection problem further complicates an already difficult correspondence problem. We formulate feature-based non-rigid registration as a non-rigid point matching problem. After a careful review of the problem and an in-depth examination of two types of methods previously designed for rigid robust point matching (RPM), we propose a new general framework for non-rigid point matching. We consider it a general framework because it does not depend on any particular form of spatial mapping. We have also developed an algorithm, the TPS-RPM algorithm, with the thin-plate spline (TPS) as the parameterization of the non-rigid spatial mapping and the softassign for the correspondence. The performance of the TPS-RPM algorithm is demonstrated and validated in a series of carefully designed synthetic experiments. In each of these experiments, an empirical comparison with the popular iterated closest point (ICP) algorithm is also provided. Finally, we apply the algorithm to the problem of non-rigid registration of cortical anatomical structures which is required in brain mapping. While these results are somewhat preliminary, they clearly demonstrate the applicability of our approach to real world tasks involving feature-based non-rigid registration.
Domain-Independent Extensions to GSAT: Solving Large Structured Satisfiability Problems
 PROC. IJCAI93
, 1993
Abstract

Cited by 216 (12 self)
GSAT is a randomized local search procedure for solving propositional satisfiability problems (Selman et al. 1992). GSAT can solve hard, randomly generated problems that are an order of magnitude larger than those that can be handled by more traditional approaches such as the Davis–Putnam procedure. GSAT also efficiently solves encodings of graph coloring problems, N-queens, and Boolean induction. However, GSAT does not perform as well on hand-crafted encodings of blocks-world planning problems and formulas with a high degree of asymmetry. We present three strategies that dramatically improve GSAT's performance on such formulas. These strategies, in effect, manage to uncover hidden structure in the formula under consideration, thereby significantly extending the applicability of the GSAT algorithm.
A maximum likelihood stereo algorithm
 Computer Vision and Image Understanding
, 1996
Abstract

Cited by 197 (2 self)
A stereo algorithm is presented that optimizes a maximum likelihood cost function. The maximum likelihood cost function assumes that corresponding features in the left and right images are Normally distributed about a common true value and consists of a weighted squared error term if two features are matched or a (fixed) cost if a feature is determined to be occluded. The stereo algorithm finds the set of correspondences that maximize the cost function subject to ordering and uniqueness constraints. The stereo algorithm is independent of the matching primitives. However, for the experiments described in this paper, matching is performed on the individual pixel intensities. Contrary to popular belief, the pixel-based stereo appears to be robust for a variety of images. It also has the advantages of (i) providing a dense disparity map, (ii) requiring no feature extraction and (iii) avoiding the adaptive windowing problem of area-based correlation methods. Because feature extraction and windowing are unnecessary, a very fast implementation is possible. Experimental results reveal that good stereo correspondences can be found using only ordering and uniqueness constraints, i.e. without local smoothness constraints. However, it is shown that the original maximum likelihood stereo algorithm exhibits multiple global minima. The dynamic programming algorithm is guaranteed to find one, but not necessarily the same one for each epipolar scanline, causing erroneous...
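The per-scanline optimization described here can be illustrated with a small dynamic program over one pair of scanlines. The cost model (squared intensity error for a match, a fixed penalty for an occluded pixel) follows the abstract, expressed as minimization of a negative log-likelihood; the function name and values are illustrative:

```python
def scanline_cost(left, right, occ=2.0):
    """Illustrative per-scanline dynamic program. Matching two pixels
    costs their squared intensity difference; leaving a pixel unmatched
    (occluded) costs a fixed penalty. The DP structure itself enforces
    the ordering and uniqueness constraints."""
    n, m = len(left), len(right)
    INF = float("inf")
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(n + 1):
        for j in range(m + 1):
            if i and j:  # match left[i-1] with right[j-1]
                cost[i][j] = min(cost[i][j],
                                 cost[i - 1][j - 1] + (left[i - 1] - right[j - 1]) ** 2)
            if i:        # left pixel occluded
                cost[i][j] = min(cost[i][j], cost[i - 1][j] + occ)
            if j:        # right pixel occluded
                cost[i][j] = min(cost[i][j], cost[i][j - 1] + occ)
    return cost[n][m]
```

Backtracking through the same table would recover the actual correspondences and hence the disparity map; only the optimal cost is returned here for brevity.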
A New Algorithm for Non-Rigid Point Matching
 IN CVPR
, 2000
Abstract

Cited by 157 (7 self)
We present a new robust point matching algorithm (RPM) that can jointly estimate the correspondence and non-rigid transformations between two point sets that may be of different sizes. The algorithm utilizes the softassign for the correspondence and the thin-plate spline for the non-rigid mapping. Embedded within a deterministic annealing framework, the algorithm can automatically reject a fraction of the points as outliers. Experiments on both 2D synthetic point sets with varying degrees of deformation, noise and outliers, and on real 3D sulcal point sets (extracted from brain MRI) demonstrate the robustness of the algorithm.
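A toy 1-D sketch of the alternation inside a deterministic annealing loop that this abstract describes: soft correspondences at the current "temperature", then a closed-form transformation update, with a plain translation standing in for the thin-plate spline mapping (the schedule and names are illustrative):

```python
import math

def rpm_1d(x, y, betas=(0.01, 0.1, 1.0, 10.0)):
    """Toy 1-D alternation under an annealing schedule: compute soft
    correspondence weights, then update a translation t by weighted
    least squares. Illustrative only; no outlier handling."""
    t = 0.0
    for beta in betas:            # annealing: correspondences harden as beta grows
        for _ in range(3):
            # soft correspondence weights between x[i] + t and y[j]
            m = []
            for xi in x:
                w = [math.exp(-beta * (xi + t - yj) ** 2) for yj in y]
                s = sum(w)
                m.append([wi / s for wi in w])
            # weighted least-squares update of the translation
            num = sum(m[i][j] * (y[j] - x[i])
                      for i in range(len(x)) for j in range(len(y)))
            den = sum(sum(row) for row in m)
            t = num / den
    return t

# y is x shifted by 5; the annealed alternation recovers the shift
t = rpm_1d([0.0, 1.0, 2.0], [5.0, 6.0, 7.0])
```

Starting at a low beta makes the early correspondences nearly uniform, which pulls the transformation toward a globally sensible alignment before the matches are committed.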
New Algorithms for 2D and 3D Point Matching: Pose Estimation and Correspondence
Abstract

Cited by 85 (19 self)
A fundamental open problem in computer vision, determining pose and correspondence between two sets of points in space, is solved with a novel, fast [O(nm)], robust and easily implementable algorithm. The technique works on noisy 2D or 3D point sets that may be of unequal sizes and may differ by non-rigid transformations. Using a combination of optimization techniques such as deterministic annealing and the softassign, which have recently emerged out of the recurrent neural network/statistical physics framework, analog objective functions describing the problems are minimized. Over thirty thousand experiments, on randomly generated point sets with varying amounts of noise and missing and spurious points, and on handwritten character sets, demonstrate the robustness of the algorithm. Keywords: point matching, pose estimation, correspondence, neural networks, optimization, softassign, deterministic annealing, affine.