Results 1–10 of 13
Graph partitioning for high performance scientific simulations
 Computing Reviews 45(2)
, 2004
A New Graph-Theoretic Approach to Clustering, with Applications to Computer Vision
, 2004
Cited by 44 (4 self)
This work applies cluster analysis as a unified approach for a wide range of vision applications, thereby combining the research domain of computer vision and that of machine learning. Cluster analysis is the formal study of algorithms and methods for recovering the inherent structure within a given dataset. Many problems of computer vision have precisely this goal, namely to find which visual entities belong to an inherent structure, e.g. in an image or in a database of images. For example, a meaningful structure in the context of image segmentation is a set of pixels which correspond to the same object in a scene. Clustering algorithms can be used to partition the pixels of an image into meaningful parts, which may correspond to different objects. In this work we focus on the problems of image segmentation and image database organization. The visual entities to consider are pixels and images, respectively. Our first contribution in this work is a novel partitional (flat) clustering algorithm. The algorithm uses pairwise representation, where the visual objects (pixels,
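The pairwise (affinity-based) representation described in this abstract is the setting of spectral clustering. As an illustrative sketch only, and not the thesis's actual algorithm, a set of visual entities with pairwise similarities can be bipartitioned by the sign pattern of the graph Laplacian's Fiedler vector:

```python
import numpy as np

def fiedler_bipartition(W):
    """Split items into two clusters by the sign of the Fiedler vector.

    W: symmetric nonnegative affinity matrix (pairwise similarities).
    Returns a boolean array marking cluster membership.
    """
    d = W.sum(axis=1)
    L = np.diag(d) - W                # unnormalized graph Laplacian
    vals, vecs = np.linalg.eigh(L)    # eigenvalues in ascending order
    fiedler = vecs[:, 1]              # eigenvector of 2nd-smallest eigenvalue
    return fiedler >= 0

# Two tight groups {0,1,2} and {3,4,5} joined by one weak edge.
W = np.zeros((6, 6))
for i, j in [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5)]:
    W[i, j] = W[j, i] = 1.0
W[2, 3] = W[3, 2] = 0.1               # weak bridge between the groups
labels = fiedler_bipartition(W)
```

The sign of the Fiedler vector is a classical relaxation of the min-cut indicator; for two weakly coupled groups it recovers the intuitive partition.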
Minimizing a Quadratic Over a Sphere
 SIAM J. Optim
, 2000
Cited by 24 (2 self)
A new method, the sequential subspace method (SSM), is developed for the problem of minimizing a quadratic over a sphere. In our scheme, the quadratic is minimized over a subspace which is adjusted in successive iterations to ensure convergence to an optimum. When a sequential quadratic programming iterate is included in the subspace, convergence is locally quadratic. Numerical comparisons with other recent methods are given.
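For intuition about the underlying problem, a dense reference solver (not SSM itself) can find the minimizer by bisection on the Lagrange multiplier in the secular equation; the sketch below assumes the "hard case" does not occur and is purely illustrative:

```python
import numpy as np

def min_quadratic_on_sphere(A, b, r):
    """Minimize q(x) = 0.5*x'Ax + b'x over the sphere ||x|| = r.

    Dense baseline: bisect on lam in (A + lam*I)x = -b until ||x(lam)|| = r,
    where ||x(lam)|| is decreasing for lam > -lambda_min(A).
    Assumes the hard case does not occur; this is NOT the paper's SSM.
    """
    n = len(b)
    lam_min = np.linalg.eigvalsh(A)[0]
    nrm = lambda lam: np.linalg.norm(np.linalg.solve(A + lam * np.eye(n), -b))
    lo = -lam_min + 1e-12             # A + lam*I positive definite beyond here
    hi = lo + 1.0
    while nrm(hi) > r:                # grow until the step falls inside
        hi *= 2.0
    for _ in range(100):              # bisection on the multiplier
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if nrm(mid) > r else (lo, mid)
    return np.linalg.solve(A + 0.5 * (lo + hi) * np.eye(n), -b)

# Indefinite 2-D example: minimize over the unit circle.
A = np.diag([-1.0, 2.0])
b = np.array([1.0, 1.0])
x = min_quadratic_on_sphere(A, b, 1.0)
```

Because the multiplier found satisfies lam > -lambda_min(A), the matrix A + lam*I is positive semidefinite and the returned point is a global minimizer on the sphere.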
Global convergence of SSM for minimizing a quadratic over a sphere
 Math. Comp
, 2004
Cited by 6 (0 self)
In an earlier paper [Minimizing a quadratic over a sphere, SIAM J. Optim., 12 (2001), 188–208], we presented the sequential subspace method (SSM) for minimizing a quadratic over a sphere. This method generates approximations to a minimizer by carrying out the minimization over a sequence of subspaces that are adjusted after each iterate is computed. We showed in this earlier paper that when the subspace contains a vector obtained by applying one step of Newton’s method to the first-order optimality system, SSM is locally, quadratically convergent, even when the original problem is degenerate with multiple solutions and with a singular Jacobian in the optimality system. In this paper, we prove (nonlocal) convergence of SSM to a global minimizer whenever each SSM subspace contains the following three vectors: (i) the current iterate, (ii) the gradient of the cost function evaluated at the current iterate, and (iii) an eigenvector associated with the smallest eigenvalue of the cost function Hessian. For nondegenerate problems, the convergence rate is at least linear when vectors (i)–(iii) are included in the SSM subspace.
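The three-vector subspace condition quoted above can be illustrated with a toy step. This is a hedged sketch, not the authors' implementation; in particular it uses a dense eigendecomposition that a large-scale SSM code would avoid, and it solves the reduced problem by multiplier bisection:

```python
import numpy as np

def ssm_style_step(A, b, x, r):
    """One subspace step in the spirit of SSM (illustrative only).

    Minimizes q(y) = 0.5*y'Ay + b'y over ||y|| = r restricted to
    span{x, grad q(x), eigvec of smallest eigenvalue of A}, the three
    vectors named in the abstract's convergence conditions.
    """
    g = A @ x + b                             # gradient of q at x
    vmin = np.linalg.eigh(A)[1][:, 0]         # smallest-eigenvalue eigenvector
    V, _ = np.linalg.qr(np.column_stack([x, g, vmin]))  # orthonormal basis
    As, bs = V.T @ A @ V, V.T @ b             # reduced quadratic
    # Reduced 3-D sphere problem, solved by bisection on the multiplier.
    w0 = np.linalg.eigvalsh(As)[0]
    lo, hi = -w0 + 1e-12, -w0 + 1.0
    nrm = lambda lam: np.linalg.norm(np.linalg.solve(As + lam * np.eye(3), -bs))
    while nrm(hi) > r:
        hi *= 2.0
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if nrm(mid) > r else (lo, mid)
    y = np.linalg.solve(As + 0.5 * (lo + hi) * np.eye(3), -bs)
    return V @ y                              # map back to the full space

A = np.diag([-2.0, 1.0, 3.0])
b = np.array([1.0, 1.0, 1.0])
x0 = np.array([0.0, 0.0, 1.0])                # a point on the unit sphere
x1 = ssm_style_step(A, b, x0, 1.0)
```

In this 3-D demo the subspace already spans the whole space, so a single step lands on the global minimizer; in higher dimensions the step is repeated with the subspace rebuilt at each iterate.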
Constructing Test Functions for Global Optimization Using Continuous Formulations of Graph Problems
 Optimization Methods and Software
, 2005
Cited by 5 (2 self)
A method for constructing test functions for global optimization which utilizes continuous formulations of combinatorial optimization problems is suggested. In particular, global optimization formulations for the maximum independent set, maximum clique and MAX CUT problems on arbitrary graphs are considered, and proofs for some of them are given. A number of sample test functions based on these formulations are proposed.
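One classical continuous formulation of this kind is the Motzkin–Straus program for maximum clique: the maximum of x'Ax over the probability simplex equals 1 − 1/ω(G), where ω(G) is the clique number. Whether the paper uses exactly this formulation is an assumption here, but it shows how a graph yields a test function with a known global optimum:

```python
import numpy as np
from itertools import combinations

def max_clique_size(adj):
    """Brute-force clique number of a small graph (adjacency matrix)."""
    n = len(adj)
    best = 1
    for k in range(2, n + 1):
        for S in combinations(range(n), k):
            if all(adj[i][j] for i, j in combinations(S, 2)):
                best = k                  # k-clique found; keep the largest k
    return best

def clique_test_function(adj, x):
    """Motzkin-Straus objective x'Ax; its simplex max is 1 - 1/omega(G)."""
    A = np.asarray(adj, dtype=float)
    return x @ A @ x

# A 6-vertex graph: a 4-clique {0,1,2,3} plus a pendant path 3-4-5.
adj = np.zeros((6, 6))
for i, j in [(0,1), (0,2), (0,3), (1,2), (1,3), (2,3), (3,4), (4,5)]:
    adj[i, j] = adj[j, i] = 1
omega = max_clique_size(adj)              # 4
x = np.zeros(6); x[:4] = 0.25             # uniform weights on the 4-clique
peak = clique_test_function(adj, x)       # 1 - 1/4 = 0.75
```

The uniform point on a maximum clique attains the known global value, which is what makes such formulations convenient test functions: the optimum is certified combinatorially.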
Solving the quadratic trust-region subproblem in a low-memory BFGS framework
 OPTIMIZATION METHODS AND SOFTWARE
, 2008
Cited by 3 (3 self)
We present a new matrix-free method for the large-scale trust-region subproblem, assuming that the approximate Hessian is updated by the L-BFGS formula with m = 1 or 2. We determine via simple formulas the eigenvalues of these matrices and, at each iteration, we construct a positive definite matrix whose inverse can be expressed analytically, without using factorization. Consequently, a direction of negative curvature can be computed immediately by applying the inverse power method. The trial step is computed by performing a sequence of inner products and vector summations. Furthermore, it immediately follows that the strong convergence properties of trust-region methods are preserved. Numerical results are also presented.
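The negative-curvature computation mentioned here can be sketched generically: if B + sI is positive definite, plain inverse power iteration on it converges to an eigenvector for B's smallest eigenvalue. The analytic L-BFGS inverse from the paper is replaced below by a dense solve, an assumption made purely for illustration:

```python
import numpy as np

def smallest_eig_direction(B, shift, iters=50):
    """Inverse power iteration on M = B + shift*I (assumed positive definite).

    All eigenvalues of M are positive, so its smallest-magnitude eigenvalue
    is lambda_min(B) + shift, and the iteration converges to the corresponding
    eigenvector of B. If lambda_min(B) < 0, the result is a direction of
    negative curvature.
    """
    n = B.shape[0]
    M = B + shift * np.eye(n)
    v = np.ones(n) / np.sqrt(n)
    for _ in range(iters):
        v = np.linalg.solve(M, v)      # one inverse-power step
        v /= np.linalg.norm(v)
    return v

# Indefinite model Hessian: negative curvature along the first coordinate.
B = np.diag([-0.5, 1.0, 2.0, 4.0])
v = smallest_eig_direction(B, shift=1.0)   # shift makes B + I pos. definite
curvature = v @ B @ v                      # close to -0.5
```

In the paper's setting the solve against M is a handful of inner products because the L-BFGS inverse is known in closed form; the dense `np.linalg.solve` here only stands in for that step.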
Multiset Graph Partitioning
 Math. Methods Oper. Res.
, 2001
Cited by 2 (2 self)
Optimality conditions are given for a quadratic programming formulation of the multiset graph partitioning problem. These conditions are related to the structure of the graph and properties of the weights.
Key words: graph partitioning, min-cut, max-cut, quadratic programming, optimality conditions. AMS(MOS) subject classifications: 90C35, 90C27, 90C20. This work was supported by the National Science Foundation.
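A standard identity behind such quadratic programming formulations (not necessarily the paper's exact one) expresses a cut weight through the graph Laplacian: for x in {−1, +1}^n, the cut induced by the signs of x has weight (1/4) x'Lx:

```python
import numpy as np

def cut_value_via_laplacian(W, x):
    """Weight of the cut induced by signs x in {-1,+1}^n, as (1/4) x'Lx.

    W: symmetric weight matrix; L = diag(W 1) - W is the graph Laplacian.
    Each cut edge (i, j) contributes W[i,j]*(x_i - x_j)^2 / 4 = W[i,j].
    """
    L = np.diag(W.sum(axis=1)) - W
    return 0.25 * x @ L @ x

# 4-cycle with unit weights; splitting into opposite pairs cuts all 4 edges.
W = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
x = np.array([1.0, -1.0, 1.0, -1.0])   # partition {0,2} versus {1,3}
```

Relaxing x from {−1,+1}^n to a continuous constraint set is exactly what turns min-cut/max-cut into a quadratic program, which is the setting the optimality conditions above address.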
A Continuous Quadratic Programming Formulation of the Vertex Separator Problem
Cited by 1 (1 self)
The Vertex Separator Problem (VSP) for a graph is to find the smallest collection of vertices whose removal breaks the graph into two disconnected subsets of roughly equal size. In a recent paper (Optimality Conditions for Maximizing a Function Over a Polyhedron, Mathematical Programming, 2013, doi:10.1007/s10107-013-0644-1), the authors announced a new continuous bilinear quadratic programming formulation of the VSP, and they used this quadratic programming problem to illustrate the new optimality conditions. The current paper develops conditions for the equivalence between this continuous quadratic program and the vertex separator problem, and it examines the relationship between the continuous formulation of the VSP and continuous quadratic programming formulations for both the edge separator problem and the maximum clique problem.
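The combinatorial problem itself is easy to state operationally. A brute-force sketch on a tiny graph, purely illustrative and with a simple balance rule standing in for "roughly equal size":

```python
from itertools import combinations

def components(adj, verts):
    """Connected components of the subgraph induced by verts."""
    verts, out = set(verts), []
    while verts:
        start = verts.pop()
        comp, stack = {start}, [start]
        while stack:
            u = stack.pop()
            for w in list(verts):
                if adj[u][w]:
                    verts.remove(w); comp.add(w); stack.append(w)
        out.append(comp)
    return out

def smallest_balanced_separator(adj):
    """Smallest S whose removal leaves >= 2 components, the largest of which
    holds at most half the remaining vertices (rounded up)."""
    n = len(adj)
    for k in range(1, n):
        for S in combinations(range(n), k):
            rest = [v for v in range(n) if v not in S]
            comps = components(adj, rest)
            if len(comps) >= 2 and max(map(len, comps)) <= (len(rest) + 1) // 2:
                return set(S)
    return None

# Path 0-1-2-3-4: the unique balanced 1-vertex separator is the middle vertex.
adj = [[1 if abs(i - j) == 1 else 0 for j in range(5)] for i in range(5)]
sep = smallest_balanced_separator(adj)     # {2}
```

The exponential enumeration is only viable for toy graphs, which is precisely why continuous quadratic programming formulations of the VSP, like the one this paper studies, are of interest.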