Results 1–10 of 129
Semidefinite Programming Relaxations for Semialgebraic Problems
, 2001
"... A hierarchy of convex relaxations for semialgebraic problems is introduced. For questions reducible to a finite number of polynomial equalities and inequalities, it is shown how to construct a complete family of polynomially sized semidefinite programming conditions that prove infeasibility. The mai ..."
Abstract

Cited by 297 (22 self)
 Add to MetaCart
A hierarchy of convex relaxations for semialgebraic problems is introduced. For questions reducible to a finite number of polynomial equalities and inequalities, it is shown how to construct a complete family of polynomially sized semidefinite programming conditions that prove infeasibility. The main tools employed are a semidefinite programming formulation of the sum of squares decomposition for multivariate polynomials, and some results from real algebraic geometry. The techniques provide a constructive approach for finding bounded degree solutions to the Positivstellensatz, and are illustrated with examples from diverse application fields.
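The Gram-matrix view behind the SOS-to-SDP connection described above can be illustrated in a few lines. This is a hand-built, hedged Python sketch of a single instance (the paper's method, of course, finds the Gram matrix by semidefinite programming rather than by inspection):

```python
# Illustration (not the paper's general algorithm): certifying that
# p(x) = x^4 + 4x^3 + 6x^2 + 4x + 1 is a sum of squares via a Gram matrix.
# With monomial basis z = [1, x, x^2], p is SOS iff p(x) = z^T Q z for some
# PSD matrix Q; in general Q is found by SDP, but here we exhibit the
# rank-one (hence PSD) matrix Q = v v^T by hand.

p = [1, 4, 6, 4, 1]                       # 1 + 4x + 6x^2 + 4x^3 + x^4
v = [1, 2, 1]                             # candidate "square root": 1 + 2x + x^2
Q = [[vi * vj for vj in v] for vi in v]   # Gram matrix Q = v v^T, PSD by construction

# z^T Q z expands to sum_{i,j} Q[i][j] x^{i+j}; collect coefficients.
expanded = [0] * 5
for i in range(3):
    for j in range(3):
        expanded[i + j] += Q[i][j]

assert expanded == p   # so p(x) = (1 + 2x + x^2)^2 is a sum of squares
print(Q)
```

In the general setting the equality constraints on the entries of Q are linear, so searching for a PSD Q is exactly a semidefinite feasibility problem.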
SOSTOOLS: Sum of squares optimization toolbox for MATLAB
, 2004
"... Version 2.00 ..."
(Show Context)
A comparison of the Sherali-Adams, Lovász-Schrijver and Lasserre relaxations for 0-1 programming
 Mathematics of Operations Research
, 2001
"... ..."
(Show Context)
Introducing SOSTOOLS: A General Purpose Sum of Squares Programming Solver
 Proceedings of the IEEE Conference on Decision and Control (CDC), Las Vegas, NV
, 2002
"... SOSTOOLS is a MATLAB toolbox for constructing and solving sum of squares programs. It can be used in combination with semidefinite programming software, such as SeDuMi, to solve many continuous and combinatorial optimization problems, as well as various controlrelated problems. This paper provides ..."
Abstract

Cited by 64 (15 self)
 Add to MetaCart
(Show Context)
SOSTOOLS is a MATLAB toolbox for constructing and solving sum of squares programs. It can be used in combination with semidefinite programming software, such as SeDuMi, to solve many continuous and combinatorial optimization problems, as well as various control-related problems. This paper provides an overview of sum of squares programming, describes the primary features of SOSTOOLS, and shows how SOSTOOLS is used to solve sum of squares programs. Some applications from different areas are presented to show the wide applicability of sum of squares programming in general and SOSTOOLS in particular.
Sums of squares of regular functions on real algebraic varieties
 Trans. Amer. Math. Soc.
, 1999
"... Abstract. Let V be an affine algebraic variety over R (or any other real closed field R). We ask when it is true that every positive semidefinite (psd) polynomial function on V is a sum of squares (sos). We show that for dim V ≥ 3 the answer is always negative if V has a real point. Also, if V is a ..."
Abstract

Cited by 50 (10 self)
 Add to MetaCart
(Show Context)
Let V be an affine algebraic variety over R (or any other real closed field R). We ask when it is true that every positive semidefinite (psd) polynomial function on V is a sum of squares (sos). We show that for dim V ≥ 3 the answer is always negative if V has a real point. Also, if V is a smooth nonrational curve all of whose points at infinity are real, the answer is again negative. The same holds if V is a smooth surface with only real divisors at infinity. The “compact” case is harder. We completely settle the case of smooth curves of genus ≤ 1: If such a curve has a complex point at infinity, then every psd function is sos, provided the field R is archimedean. If R is not archimedean, there are counterexamples of genus 1.
Minimizing polynomials via sum of squares over the gradient ideal
 Math. Program.
"... A method is proposed for finding the global minimum of a multivariate polynomial via sum of squares (SOS) relaxation over its gradient variety. That variety consists of all points where the gradient is zero and it need not be finite. A polynomial which is nonnegative on its gradient variety is shown ..."
Abstract

Cited by 43 (14 self)
 Add to MetaCart
A method is proposed for finding the global minimum of a multivariate polynomial via sum of squares (SOS) relaxation over its gradient variety. That variety consists of all points where the gradient is zero and it need not be finite. A polynomial which is nonnegative on its gradient variety is shown to be SOS modulo its gradient ideal, provided the gradient ideal is radical or the polynomial is strictly positive on the gradient variety. This opens up the possibility of solving previously intractable polynomial optimization problems. The related problem of constrained minimization is also considered, and numerical examples are discussed. Experiments show that our method using the gradient variety outperforms prior SOS methods.
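The gradient-variety idea above can be made concrete in one variable, where the variety is small enough to enumerate directly. This is a hedged Python sketch of the underlying observation only; the paper's actual method uses SOS relaxations over the gradient ideal, not root enumeration:

```python
# For the coercive polynomial f(x) = x^4 - 2x^2 we have
# f'(x) = 4x^3 - 4x = 4x(x - 1)(x + 1), so the gradient variety is {-1, 0, 1}
# and the global minimum of f is attained on it.

def f(x):
    return x**4 - 2 * x**2

critical_points = [-1.0, 0.0, 1.0]   # roots of f'(x) = 4x(x - 1)(x + 1)
global_min = min(f(x) for x in critical_points)
print(global_min)   # f(-1) = f(1) = -1 is the global minimum
```

The point of the paper is precisely that, in many variables, this enumeration is replaced by an SOS certificate modulo the gradient ideal, which remains valid even when the gradient variety is infinite.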
A framework for worst-case and stochastic safety verification using barrier certificates
 IEEE Transactions on Automatic Control
, 2007
"... This paper presents a methodology for safety verification of continuous and hybrid systems in the worstcase and stochastic settings. In the worstcase setting, a function of state termed barrier certificate is used to certify that all trajectories of the system starting from a given initial set do ..."
Abstract

Cited by 41 (1 self)
 Add to MetaCart
This paper presents a methodology for safety verification of continuous and hybrid systems in the worst-case and stochastic settings. In the worst-case setting, a function of state termed barrier certificate is used to certify that all trajectories of the system starting from a given initial set do not enter an unsafe region. No explicit computation of reachable sets is required in the construction of barrier certificates, which makes it possible to handle nonlinearity, uncertainty, and constraints directly within this framework. In the stochastic setting, our method computes an upper bound on the probability that a trajectory of the system reaches the unsafe set, a bound whose validity is proven by the existence of a barrier certificate. For polynomial systems, barrier certificates can be constructed using convex optimization, and hence the method is computationally tractable. Some examples are provided to illustrate the use of the method.
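The worst-case conditions described above can be checked for a toy scalar system by hand. This hedged Python sketch only verifies a given candidate certificate numerically on a grid; in the paper the certificate is constructed by SOS/convex optimization, and the system, sets, and candidate below are illustrative choices, not from the paper:

```python
# Scalar system x' = -x, initial set X0 = [-1, 1], unsafe set Xu = [2, 3].
# Candidate barrier certificate B(x) = x - 1.5 should satisfy:
#   B(x) <= 0 on X0,   B(x) > 0 on Xu,   and  B'(x) * f(x) <= 0 where B(x) = 0.

def f(x):          # vector field
    return -x

def B(x):          # candidate barrier certificate
    return x - 1.5

X0 = [-1 + 2 * k / 100 for k in range(101)]   # grid over the initial set
Xu = [2 + k / 100 for k in range(101)]        # grid over the unsafe set

ok_initial = all(B(x) <= 0 for x in X0)
ok_unsafe = all(B(x) > 0 for x in Xu)
ok_flow = f(1.5) <= 0    # B' = 1, so Bdot = f(x) on the level set {B = 0} = {1.5}
print(ok_initial and ok_unsafe and ok_flow)
```

Since trajectories cannot cross the zero level set of B from below, no trajectory starting in X0 can reach Xu, which is the safety conclusion the certificate encodes.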
Semidefinite representation of convex sets
, 2007
"... Let S = {x ∈ R n: g1(x) ≥ 0, · · · , gm(x) ≥ 0} be a semialgebraic set defined by multivariate polynomials gi(x). Assume S is compact, convex and has nonempty interior. Let Si = {x ∈ R n: gi(x) ≥ 0} and ∂Si = {x ∈ R n: gi(x) = 0} be its boundary. This paper, as does the subject of semidefin ..."
Abstract

Cited by 39 (9 self)
 Add to MetaCart
Let S = {x ∈ R^n : g_1(x) ≥ 0, ..., g_m(x) ≥ 0} be a semialgebraic set defined by multivariate polynomials g_i(x). Assume S is compact, convex and has nonempty interior. Let S_i = {x ∈ R^n : g_i(x) ≥ 0} and let ∂S_i = {x ∈ R^n : g_i(x) = 0} be its boundary. This paper, like the subject of semidefinite programming (SDP), concerns Linear Matrix Inequalities (LMIs). The set S is said to have an LMI representation if it equals the set of solutions to some LMI; it is known that some convex S may not be LMI representable [6]. A question arising from [13], see [6, 14], is: given S ⊆ R^n, does there exist an LMI representable set Ŝ in some higher-dimensional space R^{n+N} whose projection down onto R^n equals S? Such an S is called semidefinite representable or SDP representable. This paper addresses the SDP representability problem. The following are the main contributions of this paper: (i) Assume the g_i(x) are all concave on S. If the positive definite Lagrange Hessian (PDLH) condition holds, i.e., the Hessian of the Lagrange function for the problem of minimizing any nonzero linear function ℓ^T x on S is positive definite at the minimizer, then S is SDP representable. (ii) If each g_i(x) is either sos-concave (−∇²g_i(x) = W(x)^T W(x) for some matrix polynomial W(x)) or strictly quasi-concave on S, then S is SDP representable. (iii) If each S_i is either sos-convex or poscurv-convex (S_i is compact, convex and has smooth boundary with positive curvature), then S is SDP representable. This also holds for S_i for which ∂S_i ∩ S extends smoothly to the boundary of a poscurv-convex set containing S. (iv) We give the complexity of Schmüdgen's and Putinar's matrix Positivstellensatz, which are critical to the proofs of (i)–(iii).
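A classical example of the LMI representations discussed above (not one of this paper's constructions) is the unit disk. The hedged Python sketch below checks, in exact rational arithmetic, that the disk coincides with the solution set of a 2x2 LMI on a grid:

```python
# The unit disk S = {x : 1 - x1^2 - x2^2 >= 0} equals the solution set of the
# LMI  [[1 + x1, x2], [x2, 1 - x1]] >= 0: a symmetric 2x2 matrix is PSD iff
# its diagonal entries and determinant are nonnegative, and here the
# determinant is (1 + x1)(1 - x1) - x2^2 = 1 - x1^2 - x2^2.
from fractions import Fraction  # exact arithmetic, so boundary points are safe

def in_disk(x1, x2):
    return 1 - x1**2 - x2**2 >= 0

def lmi_psd(x1, x2):
    a, b, d = 1 + x1, x2, 1 - x1
    return a >= 0 and d >= 0 and a * d - b * b >= 0

grid = [Fraction(k, 10) for k in range(-15, 16)]
agree = all(in_disk(u, v) == lmi_psd(u, v) for u in grid for v in grid)
print(agree)
```

The paper's question is subtler: when no direct LMI representation exists, whether a *projection* of a higher-dimensional LMI set (a lifted representation) still captures S.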
There are significantly more nonnegative polynomials than sums of squares, arXiv preprint math.AG/0309130
, 2003
"... We investigate the quantitative relationship between nonnegative polynomials and sums of squares of polynomials. We show that if the degree is fixed and the number of variables grows then there are significantly more nonnegative polynomials than sums of squares. More specifically, we take compact ba ..."
Abstract

Cited by 36 (6 self)
 Add to MetaCart
(Show Context)
We investigate the quantitative relationship between nonnegative polynomials and sums of squares of polynomials. We show that if the degree is fixed and the number of variables grows then there are significantly more nonnegative polynomials than sums of squares. More specifically, we take compact bases of the cone of nonnegative polynomials and the cone of sums of squares and derive bounds for the volumes of the bases. If the degree is greater than 2 then we show that the ratio of the volumes of the bases, raised to the power reciprocal to the ambient dimension, tends to 0 as the number of variables tends to infinity.
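The best-known witness to the gap between the two cones is the Motzkin polynomial, which is nonnegative but not a sum of squares. The hedged Python sketch below only probes nonnegativity on a grid; it cannot, of course, certify the non-SOS property, which requires an algebraic argument:

```python
# The Motzkin polynomial  m(x, y) = x^4 y^2 + x^2 y^4 - 3 x^2 y^2 + 1
# is nonnegative by AM-GM applied to the terms x^4 y^2, x^2 y^4, and 1,
# yet admits no sum-of-squares decomposition (a classical fact).

def motzkin(x, y):
    return x**4 * y**2 + x**2 * y**4 - 3 * x**2 * y**2 + 1

grid = [k / 8 for k in range(-16, 17)]        # dyadic sample points in [-2, 2]
min_val = min(motzkin(x, y) for x in grid for y in grid)
print(min_val)   # 0, attained at (x, y) = (+-1, +-1)
```

That a single fixed-degree example already separates the cones is consistent with the paper's stronger, quantitative claim: as the number of variables grows, almost all nonnegative polynomials fail to be sums of squares.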
Sparsity in Sums of Squares of Polynomials
, 2004
"... Representation of a given nonnegative multivariate polynomial in terms of a sum of squares of polynomials has become an essential subject in recent developments of sums of squares optimization and SDP (semidefinite programming) relaxation of polynomial optimization problems. We discuss effective met ..."
Abstract

Cited by 35 (19 self)
 Add to MetaCart
Representation of a given nonnegative multivariate polynomial in terms of a sum of squares of polynomials has become an essential subject in recent developments of sums of squares optimization and SDP (semidefinite programming) relaxation of polynomial optimization problems. We discuss effective methods to obtain a simpler representation of a “sparse” polynomial as a sum of squares of sparse polynomials by eliminating redundancy.
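One standard sparsity reduction in this literature is the Newton polytope bound (due to Reznick): any SOS decomposition of p only needs monomials whose exponent vectors lie in half of the Newton polytope of p. The hedged Python sketch below applies that bound to one illustrative sparse polynomial; the paper itself discusses further refinements beyond this:

```python
# For p = x^4 y^2 + x^2 y^4 + 1 the Newton polytope is conv{(4,2), (2,4), (0,0)},
# so squares in an SOS decomposition may only use monomials with exponents in
# (1/2) * Newton(p) = conv{(0,0), (2,1), (1,2)}.

def in_half_newton(a, b):
    # (a, b) lies in conv{(0,0), (2,1), (1,2)} iff these integer inequalities hold
    # (barycentric coordinates s = (2a - b)/3, t = (2b - a)/3, s + t = (a + b)/3)
    return 2 * a - b >= 0 and 2 * b - a >= 0 and a + b <= 3

dense = [(a, b) for a in range(4) for b in range(4) if a + b <= 3]  # degree <= 3
candidates = [(a, b) for (a, b) in dense if in_half_newton(a, b)]
print(len(dense), len(candidates))   # 10 monomials shrink to 4
```

Shrinking the monomial basis from 10 to 4 shrinks the Gram matrix in the resulting SDP from 10x10 to 4x4, which is the kind of redundancy elimination the abstract refers to.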