Results 11 - 20 of 14,654
A Separator Theorem for Planar Graphs
1977
"... Let G be any n-vertex planar graph. We prove that the vertices of G can be partitioned into three sets A, B, C such that no edge joins a vertex in A with a vertex in B, neither A nor B contains more than 2n/3 vertices, and C contains no more than 2& & vertices. We exhibit an algorithm which ..."
Cited by 461 (1 self)
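For reference, the partition property stated in this abstract can be written compactly as follows (the separator bound is given here in its standard 2√2·√n form):

```latex
% Separator property for an n-vertex planar graph G = (V, E):
% V splits into A, B, C with no edge between A and B, and
\[
  |A| \le \tfrac{2n}{3}, \qquad
  |B| \le \tfrac{2n}{3}, \qquad
  |C| \le 2\sqrt{2}\,\sqrt{n}.
\]
```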
Robust convex optimization
Mathematics of Operations Research, 1998
"... We study convex optimization problems for which the data is not specified exactly and it is only known to belong to a given uncertainty set U, yet the constraints must hold for all possible values of the data from U. The ensuing optimization problem is called robust optimization. In this paper we la ..."
Cited by 416 (21 self)
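The worst-case constraint structure this abstract refers to has, schematically, the following generic form (notation chosen here for illustration, not quoted from the paper):

```latex
% Robust counterpart of an uncertain convex program: every constraint must hold
% for all admissible realizations of the data u drawn from the uncertainty set U.
\[
  \min_{x}\; f(x)
  \quad \text{s.t.} \quad
  g_i(x, u) \le 0 \;\; \text{for all } u \in \mathcal{U},\; i = 1, \dots, m.
\]
```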
Interior-point Methods
2000
"... The modern era of interior-point methods dates to 1984, when Karmarkar proposed his algorithm for linear programming. In the years since then, algorithms and software for linear programming have become quite sophisticated, while extensions to more general classes of problems, such as convex quadrati ..."
Cited by 612 (15 self)
… monotone linear complementarity, and convex programming over sets that can be characterized by self-concordant barrier functions.
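As an illustration of the barrier idea behind these methods, the central path of a linear program under the usual logarithmic barrier can be sketched as follows (a generic textbook formulation, not specific to this survey):

```latex
% Barrier subproblem for the LP  min c^T x  s.t.  Ax = b, x >= 0;
% as the weight t grows, the minimizers x*(t) trace the central path toward an LP optimum.
\[
  x^{*}(t) \;=\; \arg\min_{Ax = b,\; x > 0}
  \Big\{\, t\, c^{\mathsf{T}} x \;-\; \sum_{i=1}^{n} \log x_i \,\Big\}.
\]
```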
Global Optimization with Polynomials and the Problem of Moments
SIAM Journal on Optimization, 2001
"... We consider the problem of finding the unconstrained global minimum of a real-valued polynomial p(x) : R R, as well as the global minimum of p(x), in a compact set K defined by polynomial inequalities. It is shown that this problem reduces to solving an (often finite) sequence of convex linear ma ..."
Cited by 577 (48 self)
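The convex relaxations alluded to here are, in their simplest unconstrained form, semidefinite programs over pseudo-moments; a schematic of that relaxation (standard moment-matrix notation, chosen here rather than taken from the paper):

```latex
% Degree-d moment relaxation of  p* = min_x p(x)  with  p(x) = sum_alpha p_alpha x^alpha:
% y = (y_alpha) are pseudo-moments and M_d(y), with entries [M_d(y)]_{a,b} = y_{a+b},
% is the moment matrix. Each rho_d is a lower bound on p*, improving as d grows.
\[
  \rho_d \;=\; \min_{y}\; \sum_{\alpha} p_{\alpha}\, y_{\alpha}
  \quad \text{s.t.} \quad
  M_d(y) \succeq 0, \qquad y_0 = 1.
\]
```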
The Extended Linear Complementarity Problem
1993
"... We consider an extension of the horizontal linear complementarity problem, which we call the extended linear complementarity problem (XLCP). With the aid of a natural bilinear program, we establish various properties of this extended complementarity problem; these include the convexity of the biline ..."
Cited by 788 (30 self)
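For orientation, the horizontal problem that the XLCP extends, together with the kind of bilinear program the abstract mentions, can be sketched as follows (a generic formulation under the usual conventions; the precise XLCP definition is given in the paper):

```latex
% Horizontal LCP: given matrices M, N and a vector q, find x, y with
\[
  Mx + Ny = q, \qquad x \ge 0,\; y \ge 0, \qquad x^{\mathsf{T}} y = 0.
\]
% Associated bilinear program: the complementarity problem is solvable exactly
% when the optimal value below equals zero.
\[
  \min_{x,\,y}\; x^{\mathsf{T}} y
  \quad \text{s.t.} \quad
  Mx + Ny = q, \quad x \ge 0,\; y \ge 0.
\]
```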
Convergence Properties of the Nelder-Mead Simplex Method in Low Dimensions
SIAM Journal on Optimization, 1998
"... Abstract. The Nelder–Mead simplex algorithm, first published in 1965, is an enormously popular direct search method for multidimensional unconstrained minimization. Despite its widespread use, essentially no theoretical results have been proved explicitly for the Nelder–Mead algorithm. This paper pr ..."
Cited by 598 (3 self)
… in two dimensions and a set of initial conditions for which the Nelder–Mead algorithm converges to a nonminimizer. It is not yet known whether the Nelder–Mead method can be proved to converge to a minimizer for a more specialized class of convex functions in two dimensions.
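As a usage illustration of the method analyzed here, SciPy ships a Nelder–Mead implementation; the snippet below minimizes a simple convex quadratic with it (function and starting point are arbitrary choices for the example):

```python
import numpy as np
from scipy.optimize import minimize

# A strictly convex function in two dimensions with minimizer (1, -2).
def f(x):
    return (x[0] - 1.0) ** 2 + 3.0 * (x[1] + 2.0) ** 2

x0 = np.array([5.0, 5.0])  # arbitrary starting point
res = minimize(f, x0, method="Nelder-Mead",
               options={"xatol": 1e-8, "fatol": 1e-8})

print(res.x)     # approximately [1.0, -2.0]
print(res.nfev)  # number of function evaluations used by the simplex search
```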
Lambertian Reflectance and Linear Subspaces
2000
"... We prove that the set of all reflectance functions (the mapping from surface normals to intensities) produced by Lambertian objects under distant, isotropic lighting lies close to a 9D linear subspace. This implies that, in general, the set of images of a convex Lambertian object obtained under a wi ..."
Cited by 526 (20 self)
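The dimension count behind the 9D statement comes from a spherical-harmonic expansion of the reflectance function; schematically (standard notation, not quoted from the paper):

```latex
% Reflectance as distant lighting L filtered by the Lambertian (clamped-cosine)
% kernel, expanded in spherical harmonics Y_{lm}; the kernel coefficients k_l decay
% rapidly, so truncating at l <= 2 keeps 1 + 3 + 5 = 9 basis functions.
\[
  r(\hat{n}) \;=\; \sum_{l \ge 0} \sum_{m=-l}^{l} k_l\, L_{lm}\, Y_{lm}(\hat{n})
  \;\approx\; \sum_{l=0}^{2} \sum_{m=-l}^{l} k_l\, L_{lm}\, Y_{lm}(\hat{n}).
\]
```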
A Singular Value Thresholding Algorithm for Matrix Completion
2008
"... This paper introduces a novel algorithm to approximate the matrix with minimum nuclear norm among all matrices obeying a set of convex constraints. This problem may be understood as the convex relaxation of a rank minimization problem, and arises in many important applications as in the task of reco ..."
Cited by 555 (22 self)
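The building block of a singular value thresholding iteration is soft-thresholding of the singular values; a minimal NumPy sketch (the surrounding loop, step size, and stopping rule here are simplified assumptions, not the paper's exact algorithm):

```python
import numpy as np

def svt_shrink(Y, tau):
    """Soft-threshold the singular values of Y by tau (nuclear-norm prox)."""
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def complete(M_obs, omega, tau=5.0, step=1.2, iters=200):
    """Toy matrix completion: omega is a boolean mask of observed entries."""
    Y = np.zeros_like(M_obs)
    for _ in range(iters):
        X = svt_shrink(Y, tau)              # low-rank estimate
        Y = Y + step * omega * (M_obs - X)  # correct only on observed entries
    return X
```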
Amortized Efficiency of List Update and Paging Rules
1985
"... In this article we study the amortized efficiency of the “move-to-front” and similar rules for dynamically maintaining a linear list. Under the assumption that accessing the ith element from the front of the list takes 0(i) time, we show that move-to-front is within a constant factor of optimum amo ..."
Cited by 824 (8 self)
… paging, a setting in which the access cost is not convex. The paging rule corresponding to move-to-front is the “least recently used” (LRU) replacement rule. We analyze the amortized complexity of LRU, showing that its efficiency differs from that of the offline paging rule (Belady’s MIN algorithm) by a …
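The move-to-front rule studied here is short to state in code; a minimal sketch of a self-organizing list using it (list-based for clarity, not optimized):

```python
class MoveToFrontList:
    """Self-organizing list: each accessed item is moved to the front."""

    def __init__(self, items):
        self.items = list(items)

    def access(self, key):
        # The linear scan models the cost of reaching position i from the front.
        i = self.items.index(key)               # raises ValueError if key is absent
        self.items.insert(0, self.items.pop(i))
        return i + 1                            # cost of this access (1-based position)

lst = MoveToFrontList(["a", "b", "c", "d"])
print(lst.access("c"), lst.items)  # 3 ['c', 'a', 'b', 'd']
print(lst.access("c"), lst.items)  # 1 ['c', 'a', 'b', 'd']
```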
Distance metric learning for large margin nearest neighbor classification
In NIPS, 2006
"... We show how to learn a Mahanalobis distance metric for k-nearest neighbor (kNN) classification by semidefinite programming. The metric is trained with the goal that the k-nearest neighbors always belong to the same class while examples from different classes are separated by a large margin. On seven ..."
Cited by 695 (14 self)
… On seven data sets of varying size and difficulty, we find that metrics trained in this way lead to significant improvements in kNN classification—for example, achieving a test error rate of 1.3% on the MNIST handwritten digits. As in support vector machines (SVMs), the learning problem reduces to a …
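The large-margin objective this abstract describes combines an attraction term over target neighbors with a hinge penalty on margin violations; schematically (LMNN-style notation chosen here, not quoted from the paper):

```latex
% M is the learned Mahalanobis matrix (positive semidefinite), j ~> i marks the
% target neighbors of x_i, y_{il} = 1 iff x_i and x_l share a label, [z]_+ = max(z, 0).
\[
  \varepsilon(M) \;=\; \sum_{j \rightsquigarrow i} d_M(x_i, x_j)
  \;+\; c \sum_{j \rightsquigarrow i} \sum_{l} (1 - y_{il})
  \big[\, 1 + d_M(x_i, x_j) - d_M(x_i, x_l) \,\big]_{+},
  \qquad d_M(u, v) = (u - v)^{\mathsf{T}} M (u - v).
\]
```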