Results 1–10 of 48
Guaranteed minimum-rank solutions of linear matrix equations via nuclear norm minimization
, 2007
Cited by 218 (15 self)
The affine rank minimization problem consists of finding a matrix of minimum rank that satisfies a given system of linear equality constraints. Such problems have appeared in the literature of a diverse set of fields including system identification and control, Euclidean embedding, and collaborative filtering. Although specific instances can often be solved with specialized algorithms, the general affine rank minimization problem is NP-hard, because it contains vector cardinality minimization as a special case. In this paper, we show that if a certain restricted isometry property holds for the linear transformation defining the constraints, the minimum rank solution can be recovered by solving a convex optimization problem, namely the minimization of the nuclear norm over the given affine space. We present several random ensembles of equations where the restricted isometry property holds with overwhelming probability, provided the codimension of the subspace is sufficiently large. The techniques used in our analysis have strong parallels in the compressed sensing framework. We discuss how affine rank minimization generalizes this preexisting concept and outline a dictionary relating concepts from cardinality minimization to those of rank minimization. We also discuss several algorithmic approaches to solving the norm minimization relaxations, and illustrate our results with numerical examples.
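The nuclear-norm heuristic described in this abstract can be illustrated in a few lines of numpy. The sketch below is not the authors' algorithm: it is a minimal singular value thresholding (SVT) loop for matrix completion, a standard iterative scheme for the nuclear-norm relaxation, run on a made-up rank-1 toy instance.

```python
import numpy as np

def nuclear_prox(A, tau):
    """Proximal operator of tau * ||.||_* : shrink the singular values."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def svt_complete(M_obs, mask, tau=5.0, step=1.0, iters=500):
    """Singular value thresholding for matrix completion: alternate a
    singular-value shrinkage step with a step that re-imposes the
    observed entries (where mask is True)."""
    Y = np.zeros_like(M_obs)
    X = Y
    for _ in range(iters):
        X = nuclear_prox(Y, tau)              # shrinkage (nuclear-norm prox)
        Y = Y + step * mask * (M_obs - X)     # push X toward observed data
    return X

# Toy data: rank-1 ground truth with roughly 60% of entries observed.
rng = np.random.default_rng(0)
M = np.outer(rng.standard_normal(8), rng.standard_normal(8))
mask = rng.random(M.shape) < 0.6
X = svt_complete(M * mask, mask)
res = np.linalg.norm(mask * (X - M))   # residual on observed entries
base = np.linalg.norm(mask * M)
```

The shrinkage step is exactly the proximal map of the nuclear norm, mirroring how soft-thresholding of vector entries serves the ℓ1 norm in the compressed sensing dictionary the abstract mentions.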
A Nonlinear Programming Algorithm for Solving Semidefinite Programs via Low-rank Factorization
 Mathematical Programming (Series B)
, 2001
Cited by 104 (9 self)
In this paper, we present a nonlinear programming algorithm for solving semidefinite programs (SDPs) in standard form. The algorithm's distinguishing feature is a change of variables that replaces the symmetric, positive semidefinite variable X of the SDP with a rectangular variable R according to the factorization X = RRᵀ. The rank of the factorization, i.e., the number of columns of R, is chosen minimally so as to enhance computational speed while maintaining equivalence with the SDP. Fundamental results concerning the convergence of the algorithm are derived, and encouraging computational results on some large-scale test problems are also presented. Keywords: semidefinite programming, low-rank factorization, nonlinear programming, augmented Lagrangian, limited memory BFGS.
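The change of variables X = RRᵀ can be sketched on the max-cut SDP (maximize ⟨L, X⟩/4 subject to diag(X) = 1, X ⪰ 0). The code below is a toy illustration only: it uses simple projected gradient ascent on the factor R rather than the authors' augmented Lagrangian method, on a hand-built 4-cycle graph whose optimal cut value is 4.

```python
import numpy as np

def maxcut_bm(W, r=3, iters=300, lr=0.05, seed=0):
    """Low-rank factorization heuristic for the max-cut SDP: replace the
    PSD variable X with R R^T and do projected gradient ascent on R,
    normalizing rows to enforce diag(X) = 1. Finishes with hyperplane
    rounding of R into a +/-1 cut."""
    n = W.shape[0]
    L = np.diag(W.sum(axis=1)) - W                    # graph Laplacian
    rng = np.random.default_rng(seed)
    R = rng.standard_normal((n, r))
    for _ in range(iters):
        R = R + lr * (L @ R) / 2.0                    # grad of tr(L R R^T)/4
        R = R / np.linalg.norm(R, axis=1, keepdims=True)
    best_s, best_cut = None, -1.0
    for _ in range(10):                               # best of 10 roundings
        s = np.where(R @ rng.standard_normal(r) >= 0, 1, -1)
        cut = 0.25 * np.sum(W * (1 - np.outer(s, s)))
        if cut > best_cut:
            best_s, best_cut = s, cut
    return best_s, best_cut

# 4-cycle: every edge can be cut, so the optimal cut value is 4.
W = np.zeros((4, 4))
for i, j in [(0, 1), (1, 2), (2, 3), (3, 0)]:
    W[i, j] = W[j, i] = 1.0
s, cut = maxcut_bm(W)
```

The point of the factorization is visible in the memory footprint: the iterate R is n × r instead of the n × n matrix X, while feasibility (diag(X) = 1) reduces to unit-norm rows of R.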
Semidefinite representation of convex sets
, 2007
Cited by 22 (5 self)
Abstract. Let S = {x ∈ Rⁿ : g1(x) ≥ 0, …, gm(x) ≥ 0} be a semialgebraic set defined by multivariate polynomials gi(x). Assume S is compact, convex and has nonempty interior. Let Si = {x ∈ Rⁿ : gi(x) ≥ 0}, with boundary ∂Si = {x ∈ Rⁿ : gi(x) = 0}. This paper, as does the subject of semidefinite programming (SDP), concerns Linear Matrix Inequalities (LMIs). The set S is said to have an LMI representation if it equals the set of solutions to some LMI, and it is known that some convex S may not be LMI representable [6]. A question arising from [13], see [6, 14], is: given S ⊆ Rⁿ, does there exist an LMI representable set Ŝ in some higher dimensional space Rⁿ⁺ᴺ whose projection down onto Rⁿ equals S? Such S is called semidefinite representable or SDP representable. This paper addresses the SDP representability problem. The following are the main contributions of this paper: (i) Assume the gi(x) are all concave on S. If the positive definite Lagrange Hessian (PDLH) condition holds, i.e., the Hessian of the Lagrange function for the optimization problem of minimizing any nonzero linear function ℓᵀx on S is positive definite at the minimizer, then S is SDP representable. (ii) If each gi(x) is either sos-concave (−∇²gi(x) = W(x)ᵀW(x) for some matrix polynomial W(x)) or strictly quasi-concave on S, then S is SDP representable. (iii) If each Si is either sos-convex or poscurv-convex (Si is compact, convex and has smooth boundary with positive curvature), then S is SDP representable. This also holds for Si for which ∂Si ∩ S extends smoothly to the boundary of a poscurv-convex set containing S. (iv) We give the complexity of Schmüdgen and Putinar's matrix Positivstellensatz, which are critical to the proofs of (i)–(iii).
Maximum stable set formulations and heuristics based on continuous optimization
 MATH. PROGRAM., SER. A 94: 137–166 (2002)
, 2002
Sum of squares methods for sensor network localization
, 2006
Cited by 21 (2 self)
We formulate the sensor network localization problem as finding the global minimizer of a quartic polynomial. Then sum of squares (SOS) relaxations can be applied to solve it. However, the general SOS relaxations are too expensive to implement for large problems. Exploiting the special features of this polynomial, we propose a new structured SOS relaxation, and discuss its various properties. When distances are given exactly, this SOS relaxation often returns true sensor locations. At each step of interior point methods solving this SOS relaxation, the complexity is O(n³), where n is the number of sensors. When the distances have small perturbations, we show that the sensor locations given by this SOS relaxation are accurate within a constant factor of the perturbation error under some technical assumptions. The performance of this SOS relaxation is tested on some randomly generated problems.
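The quartic objective behind this formulation is easy to write down. The sketch below is not the paper's SOS relaxation (which needs an SDP solver); it merely illustrates the underlying polynomial Σₖ (‖x − aₖ‖² − dₖ²)² on a hypothetical toy instance (three anchors, one unknown sensor, exact squared distances) and minimizes it by plain gradient descent, which works here but can stall in spurious local minima on larger instances, which is precisely what the convex SOS relaxation is designed to avoid.

```python
import numpy as np

# Hypothetical instance: three anchors at known positions, one unknown
# sensor at true_x, exact squared distances d2 to the anchors.
anchors = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
true_x = np.array([0.3, 0.4])
d2 = np.sum((anchors - true_x) ** 2, axis=1)

def quartic_grad(x):
    """Gradient of f(x) = sum_k (||x - a_k||^2 - d2_k)^2, the quartic
    polynomial whose global minimizer is the sensor location."""
    r = np.sum((anchors - x) ** 2, axis=1) - d2     # per-anchor residuals
    return np.sum(4.0 * r[:, None] * (x - anchors), axis=0)

x = np.array([0.5, 0.5])          # rough initial guess
for _ in range(2000):
    x = x - 0.05 * quartic_grad(x)
```

With exact distances the global minimum value is zero, so the recovered x should coincide with true_x when the descent converges.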
A Unified Framework for Obtaining Improved Approximation Algorithms for Maximum Graph Bisection Problems
, 2002
Cited by 16 (0 self)
We obtain improved semidefinite programming based approximation algorithms for all the natural maximum bisection problems of graphs. Among the problems considered are: MAX n/2-BISECTION (partition the vertices of the graph into two sets of equal size such that the total weight of edges connecting vertices from different sides is maximized); MAX n/2-VERTEX-COVER (find a set containing half of the vertices such that the total weight of edges touching this set is maximized); MAX n/2-DENSE-SUBGRAPH (find a set containing half of the vertices such that the total weight of edges connecting two vertices from this set is maximized); and MAX n/2-UNCUT (partition the vertices into two sets of equal size such that the total weight of edges that do not cross the cut is maximized). We also consider the directed versions of these problems, such as MAX n/2-DIRECTED-BISECTION and MAX n/2-DIRECTED-UNCUT. These results can be used to obtain improved approximation algorithms for the unbalanced versions of the partition problems mentioned above, where we want to partition the graph into two sets of size k and n − k, where k is not necessarily n/2. Our results improve, extend and unify results of Frieze and Jerrum, Feige and Langberg, Ye, and others. All these results may be viewed as extensions of the MAX CUT algorithm of Goemans and Williamson, and the MAX 2-SAT and MAX DICUT algorithms of Feige and Goemans.
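A common building block of such bisection algorithms is the rounding stage: hyperplane rounding (as in MAX CUT) generally violates the equal-size constraint, so it is followed by a repair step. The sketch below shows only that stage, under the assumption that unit vectors V for the vertices have already been obtained from an SDP solution (here supplied by hand for a 4-cycle); the papers' actual rounding and balancing procedures are more refined.

```python
import numpy as np

def round_and_balance(V, W, seed=0):
    """Hyperplane rounding of SDP vectors V (one row per vertex) into a
    +/-1 labeling, then a greedy repair that restores the equal-size
    bisection, each time flipping the vertex whose move loses the least
    cut weight. Assumes an even number of vertices."""
    rng = np.random.default_rng(seed)
    s = np.where(V @ rng.standard_normal(V.shape[1]) >= 0, 1, -1)
    while s.sum() != 0:                      # sides unequal
        big = 1 if s.sum() > 0 else -1
        cand = np.where(s == big)[0]
        # gain from flipping i: same-side edges become cut, cross-side uncut
        gains = [W[i] @ (s == big) - W[i] @ (s != big) for i in cand]
        s[cand[int(np.argmax(gains))]] = -big
    cut = 0.25 * np.sum(W * (1 - np.outer(s, s)))
    return s, cut

# 4-cycle with (known) optimal one-dimensional SDP vectors: alternating signs.
W = np.zeros((4, 4))
for i, j in [(0, 1), (1, 2), (2, 3), (3, 0)]:
    W[i, j] = W[j, i] = 1.0
V = np.array([[1.0], [-1.0], [1.0], [-1.0]])
s, cut = round_and_balance(V, W)
```

On this instance the rounding already happens to be balanced, so the repair loop is a no-op and the returned bisection cuts all four edges.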
Sparse SOS relaxations for minimizing functions that are summations of small polynomials
 SIAM Journal On Optimization
, 2008
Cited by 16 (2 self)
This paper discusses how to find the global minimum of functions that are summations of small polynomials (“small” means involving a small number of variables). Some sparse sum of squares (SOS) techniques are proposed. We compare their computational complexity and lower bounds with prior SOS relaxations. Under certain conditions, we also discuss how to extract the global minimizers from these sparse relaxations. The proposed methods are especially useful in solving sparse polynomial systems and nonlinear least squares problems. Numerical experiments are presented, which show that the proposed methods significantly improve the computational performance of prior methods for solving these problems. Lastly, we present applications of this sparsity technique in solving polynomial systems derived from nonlinear differential equations and sensor network localization. Key words: polynomials, sum of squares (SOS), sparsity, nonlinear least squares, polynomial system, nonlinear differential equations, sensor network localization
An Improved Semidefinite Programming Relaxation for the Satisfiability Problem
, 2002
Cited by 14 (4 self)
The satisfiability (SAT) problem is a central problem in mathematical logic, computing theory, and artificial intelligence. An instance of SAT is specified by a set of Boolean variables and a propositional formula in conjunctive normal form. Given such an instance, the SAT problem asks whether there is a truth assignment to the variables such that the formula is satisfied. It is well known that SAT is in general NP-complete, although several important special cases can be solved in polynomial time. Semidefinite programming (SDP) refers to the class of optimization problems where a linear function of a matrix variable X is maximized (or minimized) subject to linear constraints on the elements of X and the additional constraint that X be positive semidefinite. We are interested in the application of SDP to satisfiability problems, and in particular in how SDP can be used to detect unsatisfiability. In this paper we introduce a new SDP relaxation for the satisfiability problem. This SDP relaxation arises from the recently introduced paradigm of “higher liftings” for constructing semidefinite programming relaxations of discrete optimization problems.
Global minimization of rational functions and the nearest GCDs
 J. of Global Optimization
Cited by 9 (0 self)
This paper discusses the global minimization of rational functions with or without constraints. The sum of squares (SOS) relaxations are proposed to find the global minimum and minimizers. Some special features of the SOS relaxations are studied. As an application, we show how to find the nearest common divisors of polynomials via global minimization of rational functions.
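The lower-bound idea behind such relaxations is that γ ≤ min p(x)/q(x) (with q > 0 everywhere) exactly when the polynomial p − γq is nonnegative. The paper certifies this nonnegativity by SOS/SDP in several variables; the sketch below is a univariate stand-in that certifies it by checking the critical points instead, and finds the optimal γ by bisection on the toy function (x⁴ + 1)/(x² + 1).

```python
import numpy as np

def poly_min(c):
    """Global minimum over R of a univariate polynomial with coefficient
    vector c (numpy convention, highest degree first). Assumes even
    degree and a positive leading coefficient, so the minimum is
    attained at a real critical point."""
    crit = np.roots(np.polyder(c))
    xs = crit[np.abs(crit.imag) < 1e-9].real     # keep real critical points
    return min(np.polyval(c, x) for x in xs)

def rational_min(p, q, lo=-10.0, hi=10.0, iters=60):
    """Bisection on gamma: gamma is a valid lower bound on p/q (with
    q > 0 on all of R) iff p - gamma*q is nonnegative on R."""
    p, q = np.array(p, float), np.array(q, float)
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if poly_min(np.polysub(p, mid * q)) >= 0:
            lo = mid          # still a certified lower bound
        else:
            hi = mid
    return lo

# min of (x^4 + 1)/(x^2 + 1) over R; the exact value is 2*sqrt(2) - 2.
val = rational_min([1, 0, 0, 0, 1], [1, 0, 1])
```

Replacing the critical-point check with an SOS certificate for p − γq is what lets the same bound computation scale to multivariate rational functions.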