Results 1-10 of 79
SDPT3: a MATLAB software package for semidefinite programming
 OPTIMIZATION METHODS AND SOFTWARE
, 1999
Abstract

Cited by 362 (17 self)
This software package is a Matlab implementation of infeasible path-following algorithms for solving standard semidefinite programming (SDP) problems. Mehrotra-type predictor-corrector variants are included. Analogous algorithms for the homogeneous formulation of the standard SDP problem are also implemented. Four types of search directions are available, namely, the AHO, HKM, NT, and GT directions. A few classes of SDP problems are included as well. Numerical results for these classes show that our algorithms are fairly efficient and robust on problems with dimensions on the order of a few hundred.
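The standard-form SDP that packages like SDPT3 target, and the weak-duality identity underlying path-following methods, can be checked numerically. Below is a minimal Python/NumPy sketch (not SDPT3 itself, which is a MATLAB package; all data are synthetic and constructed to be feasible):

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 4, 3

def random_psd(n):
    # Random symmetric positive semidefinite matrix M = G G^T.
    G = rng.standard_normal((n, n))
    return G @ G.T

def random_sym(n):
    G = rng.standard_normal((n, n))
    return (G + G.T) / 2

# Construct a primal-dual feasible pair for the standard SDP
#   (P) min tr(C X)  s.t. tr(A_i X) = b_i,  X PSD
#   (D) max b^T y    s.t. sum_i y_i A_i + S = C,  S PSD
A = [random_sym(n) for _ in range(m)]
X = random_psd(n)                                  # feasible primal point
b = np.array([np.trace(Ai @ X) for Ai in A])
y = rng.standard_normal(m)
S = random_psd(n)                                  # feasible dual slack
C = sum(yi * Ai for yi, Ai in zip(y, A)) + S

# Weak duality: the gap tr(C X) - b^T y equals the complementarity
# measure tr(X S), which is nonnegative for PSD X and S. Path-following
# methods drive exactly this quantity to zero.
gap = np.trace(C @ X) - b @ y
print(np.isclose(gap, np.trace(X @ S)), gap >= 0)
```

This only verifies the duality identity on a random instance; an actual solver iterates on (X, y, S) to close the gap.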
Solving semidefinite-quadratic-linear programs using SDPT3
 MATHEMATICAL PROGRAMMING
, 2003
Abstract

Cited by 233 (22 self)
This paper discusses computational experiments with linear optimization problems involving semidefinite, quadratic, and linear cone constraints (SQLPs). Many test problems of this type are solved using a new release of SDPT3, a Matlab implementation of infeasible primal-dual path-following algorithms. The software developed by the authors uses Mehrotra-type predictor-corrector variants of interior-point methods and two types of search directions: the HKM and NT directions. A discussion of implementation details is provided and computational results on problems from the SDPLIB and DIMACS Challenge collections are reported.
Solving Large-Scale Sparse Semidefinite Programs for Combinatorial Optimization
 SIAM JOURNAL ON OPTIMIZATION
, 1998
Abstract

Cited by 119 (11 self)
We present a dual-scaling interior-point algorithm and show how it exploits the structure and sparsity of some large-scale problems. We solve the positive semidefinite relaxation of combinatorial and quadratic optimization problems subject to Boolean constraints. We report the first computational results of interior-point algorithms for approximating the maximum-cut semidefinite programs with dimension up to 3000.
SDPA (SemiDefinite Programming Algorithm) User's Manual, Version 7.0.5
, 2008
Abstract

Cited by 110 (31 self)
The SDPA (SemiDefinite Programming Algorithm) [5] is a software package for solving semidefinite programs (SDPs). It is based on a Mehrotra-type predictor-corrector infeasible primal-dual interior-point method. The SDPA handles the standard form SDP and its dual. It is implemented in C++ using LAPACK [1] for matrix computations. The SDPA version 7.0.5 enjoys the following features:
• An efficient method for computing the search directions when the SDP to be solved is large-scale and sparse [4].
• Block-diagonal and sparse matrix structures are supported for data matrices.
• Sparse or dense Cholesky factorization for the Schur matrix is automatically selected.
• An initial point can be specified.
• Some information on infeasibility of the SDP is provided.
This manual and the SDPA can be downloaded from the WWW site
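The Schur matrix mentioned in the features above is the bottleneck of each interior-point iteration: the search direction reduces to a linear system B dy = r whose coefficient matrix, for the HKM direction, has entries B_ij = tr(A_i X A_j Z^{-1}). A dense Python/NumPy sketch with synthetic data (the sparse formulas of [4] exist precisely to avoid this dense computation):

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 5, 3

def random_pd(n, jitter=1e-1):
    # Random symmetric strictly positive definite matrix.
    G = rng.standard_normal((n, n))
    return G @ G.T + jitter * np.eye(n)

A = []
for _ in range(m):
    G = rng.standard_normal((n, n))
    A.append((G + G.T) / 2)        # symmetric data matrices A_1..A_m

X = random_pd(n)                   # current primal iterate
Z = random_pd(n)                   # current dual slack iterate
Zinv = np.linalg.inv(Z)

# Schur complement matrix for the HKM search direction:
#   B[i, j] = tr(A_i X A_j Z^{-1}).
# Forming B costs O(m n^3 + m^2 n^2) densely, which dominates an
# iteration when the A_i are sparse but treated as dense.
B = np.empty((m, m))
for i in range(m):
    AiX = A[i] @ X
    for j in range(m):
        B[i, j] = np.trace(AiX @ A[j] @ Zinv)

# For strictly feasible X, Z and linearly independent A_i, B is
# symmetric positive definite, so B dy = r is solvable by Cholesky.
print(np.allclose(B, B.T), np.all(np.linalg.eigvalsh(B) > 0))
```

Whether the sparse or dense Cholesky factorization of B pays off depends on how many of these entries vanish, which is what the automatic selection in the feature list decides.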
Exploiting Sparsity in Semidefinite Programming via Matrix Completion I: General Framework
 SIAM JOURNAL ON OPTIMIZATION
, 1999
Abstract

Cited by 104 (30 self)
A critical disadvantage of primal-dual interior-point methods against dual interior-point methods for large-scale SDPs (semidefinite programs) has been that the primal positive semidefinite variable matrix becomes fully dense in general even when all data matrices are sparse. Based on some fundamental results about positive semidefinite matrix completion, this article proposes a general method of exploiting the aggregate sparsity pattern over all data matrices to overcome this disadvantage. Our method is used in two ways. One is a conversion of a sparse SDP having a large-scale positive semidefinite variable matrix into an SDP having multiple but smaller-size positive semidefinite variable matrices to which we can effectively apply any interior-point method for SDPs employing a standard block-diagonal matrix data structure. The other way is an incorporation of our method into primal-dual interior-point methods which we can apply directly to a given SDP. In Part II of this article, we wi...
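The matrix-completion result this framework rests on can be illustrated on the smallest nontrivial pattern: a 3×3 tridiagonal partial matrix, whose specification graph (a path) is chordal with maximal cliques {1,2} and {2,3}. A Python/NumPy sketch with made-up numeric entries:

```python
import numpy as np

# Partial symmetric matrix with aggregate sparsity pattern
#   [[*, *, ?],
#    [*, *, *],
#    [?, *, *]]
# For a chordal specification graph, a PSD completion exists iff
# every fully specified clique submatrix is PSD.
x11, x22, x33 = 2.0, 3.0, 2.0
x12, x23 = 1.0, -1.5

K12 = np.array([[x11, x12], [x12, x22]])   # clique {1,2} block
K23 = np.array([[x22, x23], [x23, x33]])   # clique {2,3} block
assert np.all(np.linalg.eigvalsh(K12) > 0)
assert np.all(np.linalg.eigvalsh(K23) > 0)

# The maximum-determinant PSD completion of this pattern has the
# closed form x13 = x12 * x23 / x22; its inverse has a zero in the
# unspecified position (no fill-in), which is what primal-dual
# methods can exploit.
x13 = x12 * x23 / x22
X = np.array([[x11, x12, x13],
              [x12, x22, x23],
              [x13, x23, x33]])
print(np.min(np.linalg.eigvalsh(X)))       # positive: X is PD
```

For larger chordal patterns the same completion is computed clique by clique along a clique tree, which is the structure the conversion method above splits the variable matrix along.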
Interior-point method for nuclear norm approximation with application to system identification
Exploiting sparsity in semidefinite programming via matrix completion II: implementation and numerical results
Abstract

Cited by 50 (17 self)
In Part I of this series of articles, we introduced a general framework of exploiting the aggregate sparsity pattern over all data matrices of large-scale and sparse semidefinite programs (SDPs) when solving them by primal-dual interior-point methods. This framework is based on some results about positive semidefinite matrix completion, and it can be embodied in two different ways. One is by a conversion of a given sparse SDP having a large-scale positive semidefinite matrix variable into an SDP having multiple but smaller positive semidefinite matrix variables. The other is by incorporating a positive definite matrix completion itself in a primal-dual interior-point method. The current article presents the details of their implementations. We introduce new techniques to deal with the sparsity through a clique tree in the former method and through new computational formulae in the latter one. Numerical results over different classes of SDPs show that these methods can be very efficient for some problems. Keywords: Semidefinite programming; Primal-dual interior-point method; Matrix completion problem; Clique tree; Numerical results. ∗ Department of Applied Physics, The University of Tokyo, 7-3-1 Hongo, Bunkyo-ku, Tokyo 113-8565 Japan (nakata@zzz.t.u-tokyo.ac.jp). † Department of Architecture and Architectural Systems, Kyoto University, Kyoto 606-8501 Japan (fujisawa@is-mj.archi.kyoto-u.ac.jp). ‡ Department of Mathematical and Computing Sciences, Tokyo Institute of Technology, 2-12-1 Oh-Okayama, Meguro-ku, Tokyo 152-8552 Japan (mituhiro@is.titech.ac.jp). The author was supported by The Ministry of Education, Culture, Sports, Science and Technology of Japan. Department of Mathematical and Computing Sciences, Tokyo Institute of Technology, 2-12-1 Oh-Okayama, Meguro-ku, Toky...
Using SeDuMi 1.0x, a Matlab toolbox for optimization over symmetric cones
, 1999
Abstract

Cited by 46 (0 self)
SeDuMi is an add-on for MATLAB, which lets you solve optimization problems with linear, quadratic and semidefiniteness constraints. It is possible to have complex-valued data and variables in SeDuMi. Moreover, large-scale optimization problems are solved efficiently by exploiting sparsity. This paper describes how to work with this toolbox.
Rank-Two Relaxation Heuristics for Max-Cut and Other Binary Quadratic Programs
 SIAM Journal on Optimization
, 2000
Abstract

Cited by 43 (3 self)
The Goemans-Williamson randomized algorithm guarantees a high-quality approximation to the Max-Cut problem, but the cost associated with such an approximation can be excessively high for large-scale problems due to the need for solving an expensive semidefinite relaxation. In order to achieve better practical performance, we propose an alternative, rank-two relaxation and develop a specialized version of the Goemans-Williamson technique. The proposed approach leads to continuous optimization heuristics applicable to Max-Cut as well as other binary quadratic programs, for example the Max-Bisection problem. A computer code based on the rank-two relaxation heuristics is compared with two state-of-the-art semidefinite programming codes that implement the Goemans-Williamson randomized algorithm, as well as with a purely heuristic code for effectively solving a particular Max-Cut problem arising in physics. Computational results show that the proposed approach is fast and scalable and, more importantly, attains a higher approximation quality in practice than that of the Goemans-Williamson randomized algorithm. An extension to Max-Bisection is also discussed, as well as an important difference between the proposed approach and the Goemans-Williamson algorithm, namely that the new approach does not guarantee an upper bound on the Max-Cut optimal value. Key words. Binary quadratic programs, Max-Cut and Max-Bisection, semidefinite relaxation, rank-two relaxation, continuous optimization heuristics. AMS subject classifications. 90C06, 90C27, 90C30
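The rank-two idea can be sketched in a few lines: place each vertex at an angle on the unit circle instead of on a high-dimensional sphere, locally optimize, then round with an angular cut. This Python/NumPy sketch is illustrative only, not the authors' code; the plain gradient loop, step size, and number of trial cuts are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(2)

def rank_two_maxcut(W, iters=500, lr=0.1, n_cuts=64):
    """Rank-two relaxation heuristic for Max-Cut (a sketch): each
    vertex i gets an angle theta_i on the circle; locally minimize
    sum_ij W_ij cos(theta_i - theta_j) by gradient descent, then
    round with the best of n_cuts angular hyperplane cuts."""
    n = W.shape[0]
    theta = rng.uniform(0, 2 * np.pi, n)
    for _ in range(iters):
        diff = theta[:, None] - theta[None, :]
        # Gradient (up to a factor 2) of sum_ij W_ij cos(theta_i - theta_j).
        grad = -(W * np.sin(diff)).sum(axis=1)
        theta -= lr * grad
    # A cut at angle t splits the circle into two half-circles; t and
    # t + pi give the same partition, so sampling [0, pi) suffices.
    best_x, best_val = None, -np.inf
    for t in np.linspace(0, np.pi, n_cuts, endpoint=False):
        x = np.where(np.mod(theta - t, 2 * np.pi) < np.pi, 1.0, -1.0)
        val = 0.25 * np.sum(W * (1 - np.outer(x, x)))  # cut weight
        if val > best_val:
            best_x, best_val = x, val
    return best_x, best_val

# 5-cycle with unit weights; the optimal cut value is 4.
n = 5
W = np.zeros((n, n))
for i in range(n):
    W[i, (i + 1) % n] = W[(i + 1) % n, i] = 1.0
x, cut = rank_two_maxcut(W)
print(x, cut)
```

Note the caveat from the abstract: unlike the semidefinite relaxation, this rank-two relaxation provides no upper bound on the optimal cut value, only good feasible cuts.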
On Copositive Programming and Standard Quadratic Optimization Problems
 Journal of Global Optimization
, 2000
Abstract

Cited by 41 (9 self)
A standard quadratic problem consists of finding global maximizers of a quadratic form over the standard simplex. In this paper, the usual semidefinite programming relaxation is strengthened by replacing the cone of positive semidefinite matrices by the cone of completely positive matrices (the positive semidefinite matrices which allow a factorization FF^T where F is some nonnegative matrix). The dual of this cone is the cone of copositive matrices (i.e., those matrices which yield a nonnegative quadratic form on the positive orthant). This conic formulation allows us to employ primal-dual affine-scaling directions. Furthermore, these approaches are combined with an evolutionary dynamics algorithm which generates primal-feasible paths along which the objective is monotonically improved until a local solution is reached. In particular, the primal-dual affine-scaling directions are used to escape from local maxima encountered during the evolutionary dynamics phase.
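The evolutionary-dynamics phase can be sketched with discrete replicator dynamics, the classic primal-feasible iteration for the standard quadratic problem. This Python/NumPy sketch is illustrative, not the authors' code; the matrix Q and the starting point are made up:

```python
import numpy as np

def replicator_dynamics(Q, x0, iters=200):
    """Discrete replicator dynamics for max x^T Q x over the simplex.
    For symmetric Q with positive entries, the objective x^T Q x is
    nondecreasing along the iteration, and the iterate stays on the
    simplex by construction."""
    x = x0.copy()
    for _ in range(iters):
        Qx = Q @ x
        x = x * Qx / (x @ Qx)   # componentwise; sums to 1 again
    return x

# Shifting Q by a positive constant (Q + c * ones) changes x^T Q x by
# the constant c on the simplex, so it leaves the maximizers unchanged
# while making all entries positive, as replicator dynamics requires.
Q = np.array([[1.0, 0.0, 1.0],
              [0.0, 2.0, 0.0],
              [1.0, 0.0, 1.0]]) + 3.0
x0 = np.array([0.5, 0.3, 0.2])          # interior starting point
x = replicator_dynamics(Q, x0)
print(x, x @ Q @ x)
```

As the abstract says, this phase only reaches a local solution; the copositive/affine-scaling machinery above is what the paper uses to escape such local maxima.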