Results 1-10 of 148
Semidefinite Programming Relaxations for Semialgebraic Problems
, 2001
Abstract

Cited by 359 (22 self)
A hierarchy of convex relaxations for semialgebraic problems is introduced. For questions reducible to a finite number of polynomial equalities and inequalities, it is shown how to construct a complete family of polynomially sized semidefinite programming conditions that prove infeasibility. The main tools employed are a semidefinite programming formulation of the sum of squares decomposition for multivariate polynomials, and some results from real algebraic geometry. The techniques provide a constructive approach for finding bounded degree solutions to the Positivstellensatz, and are illustrated with examples from diverse application fields.
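The sum of squares formulation mentioned in the abstract can be illustrated without an SDP solver: a polynomial p is SOS exactly when p = zᵀQz for some positive semidefinite Gram matrix Q over a vector of monomials z. A minimal numpy sketch with a hand-picked Q (the polynomial and the matrix are illustrative, not taken from the paper):

```python
import numpy as np

# p(x, y) = x^4 + 2 x^2 y^2 + y^4, written as z^T Q z
# with monomial vector z = (x^2, x*y, y^2).
Q = np.array([[1.0, 0.0, 1.0],
              [0.0, 0.0, 0.0],
              [1.0, 0.0, 1.0]])

# Q is positive semidefinite, which certifies p is SOS.
eigvals = np.linalg.eigvalsh(Q)
assert eigvals.min() >= -1e-9

def p(x, y):
    return x**4 + 2 * x**2 * y**2 + y**4

def gram_form(x, y):
    z = np.array([x**2, x * y, y**2])
    return z @ Q @ z

# Sanity check that z^T Q z really reproduces p:
rng = np.random.default_rng(0)
for x, y in rng.standard_normal((100, 2)):
    assert abs(p(x, y) - gram_form(x, y)) < 1e-8

# Factoring Q = sum_i w_i v_i v_i^T turns the certificate into
# explicit squares; this Q has rank 1 and p = (x^2 + y^2)^2.
```

An SDP solver is only needed when Q must be searched for subject to the linear constraints matching the coefficients of p; here a known certificate is simply verified.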
A comparison of the Sherali-Adams, Lovász-Schrijver and Lasserre relaxations for 0-1 programming
 Mathematics of Operations Research
, 2001
SOSTOOLS: Sum of squares optimization toolbox for MATLAB
, 2004
Version 2.00
Introducing SOSTOOLS: A General Purpose Sum of Squares Programming Solver
 Proceedings of the IEEE Conference on Decision and Control (CDC), Las Vegas, NV
, 2002
Abstract

Cited by 72 (15 self)
SOSTOOLS is a MATLAB toolbox for constructing and solving sum of squares programs. It can be used in combination with semidefinite programming software, such as SeDuMi, to solve many continuous and combinatorial optimization problems, as well as various control-related problems. This paper provides an overview of sum of squares programming, describes the primary features of SOSTOOLS, and shows how SOSTOOLS is used to solve sum of squares programs. Some applications from different areas are presented to show the wide applicability of sum of squares programming in general and SOSTOOLS in particular.
Sums of squares of regular functions on real algebraic varieties
 Trans. Amer. Math. Soc.
, 1999
Abstract

Cited by 67 (16 self)
Abstract. Let V be an affine algebraic variety over R (or any other real closed field R). We ask when it is true that every positive semidefinite (psd) polynomial function on V is a sum of squares (sos). We show that for dim V ≥ 3 the answer is always negative if V has a real point. Also, if V is a smooth nonrational curve all of whose points at infinity are real, the answer is again negative. The same holds if V is a smooth surface with only real divisors at infinity. The “compact” case is harder. We completely settle the case of smooth curves of genus ≤ 1: If such a curve has a complex point at infinity, then every psd function is sos, provided the field R is archimedean. If R is not archimedean, there are counterexamples of genus 1.
Minimizing polynomials via sum of squares over the gradient ideal
 Math. Program.
Abstract

Cited by 54 (18 self)
A method is proposed for finding the global minimum of a multivariate polynomial via sum of squares (SOS) relaxation over its gradient variety. That variety consists of all points where the gradient is zero and it need not be finite. A polynomial which is nonnegative on its gradient variety is shown to be SOS modulo its gradient ideal, provided the gradient ideal is radical or the polynomial is strictly positive on the gradient variety. This opens up the possibility of solving previously intractable polynomial optimization problems. The related problem of constrained minimization is also considered, and numerical examples are discussed. Experiments show that our method using the gradient variety outperforms prior SOS methods.
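When the gradient variety is finite, the underlying idea reduces to inspecting critical points. A hypothetical univariate example with numpy (not from the paper): minimize p(x) = x⁴ − 3x² + x by evaluating p on the real roots of p′, which is exactly the gradient variety here.

```python
import numpy as np

# p(x) = x^4 - 3x^2 + x; coefficients listed highest degree first.
p = np.poly1d([1.0, 0.0, -3.0, 1.0, 0.0])
dp = p.deriv()                        # p'(x) = 4x^3 - 6x + 1

# The gradient variety of a univariate polynomial is the root
# set of p'. It is finite here, and since p is coercive
# (positive leading coefficient, even degree), the global
# minimum is attained at one of the real critical points.
crit = dp.roots
real_crit = crit[np.abs(crit.imag) < 1e-9].real

global_min = min(p(x) for x in real_crit)
minimizer = min(real_crit, key=p)
```

The paper's SOS approach handles the harder multivariate case, where the gradient variety need not be finite and critical points cannot simply be enumerated.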
A framework for worst-case and stochastic safety verification using barrier certificates
 IEEE Transactions on Automatic Control
, 2007
Abstract

Cited by 52 (1 self)
This paper presents a methodology for safety verification of continuous and hybrid systems in the worst-case and stochastic settings. In the worst-case setting, a function of state, termed a barrier certificate, is used to certify that all trajectories of the system starting from a given initial set do not enter an unsafe region. No explicit computation of reachable sets is required in the construction of barrier certificates, which makes it possible to handle nonlinearity, uncertainty, and constraints directly within this framework. In the stochastic setting, our method computes an upper bound on the probability that a trajectory of the system reaches the unsafe set, a bound whose validity is proven by the existence of a barrier certificate. For polynomial systems, barrier certificates can be constructed using convex optimization, and hence the method is computationally tractable. Some examples are provided to illustrate the use of the method.
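The three conditions a barrier certificate must satisfy can be sanity-checked by sampling. A toy sketch (the system, sets, and certificate below are hypothetical choices for illustration, not the paper's examples): for ẋ = −x, take B(x) = x² − 1, initial set {|x| ≤ 0.5}, unsafe set {|x| ≥ 2}.

```python
import numpy as np

# Toy system x' = f(x) = -x with hypothetical barrier
# certificate B(x) = x^2 - 1 (all choices illustrative).
f = lambda x: -x
B = lambda x: x**2 - 1
dB = lambda x: 2 * x              # derivative of B

init = np.linspace(-0.5, 0.5, 101)
unsafe = np.concatenate([np.linspace(-4.0, -2.0, 51),
                         np.linspace(2.0, 4.0, 51)])
state = np.linspace(-4.0, 4.0, 401)

# (1) B <= 0 on the initial set;
# (2) B > 0 on the unsafe set;
# (3) B is nonincreasing along trajectories: dB/dt = B'(x) f(x) <= 0.
assert B(init).max() <= 0
assert B(unsafe).min() > 0
assert (dB(state) * f(state)).max() <= 0

# Hence no trajectory starting in the initial set reaches the
# unsafe set: B starts nonpositive and never increases.
```

Sampling only falsifies; the paper's point is that for polynomial systems the same conditions become SOS constraints, so a valid B can be found and certified by convex optimization.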
Semidefinite representation of convex sets
, 2007
Abstract

Cited by 49 (11 self)
Let S = {x ∈ R^n : g_1(x) ≥ 0, · · · , g_m(x) ≥ 0} be a semialgebraic set defined by multivariate polynomials g_i(x). Assume S is compact, convex and has nonempty interior. Let S_i = {x ∈ R^n : g_i(x) ≥ 0} and let ∂S_i = {x ∈ R^n : g_i(x) = 0} be its boundary. This paper, like the subject of semidefinite programming (SDP), concerns Linear Matrix Inequalities (LMIs). The set S is said to have an LMI representation if it equals the set of solutions to some LMI, and it is known that some convex S may not be LMI representable [6]. A question arising from [13], see [6, 14], is: given S ⊆ R^n, does there exist an LMI representable set Ŝ in some higher dimensional space R^(n+N) whose projection down onto R^n equals S? Such an S is called semidefinite representable or SDP representable. This paper addresses the SDP representability problem. The following are the main contributions of this paper: (i) Assume the g_i(x) are all concave on S. If the positive definite Lagrange Hessian (PDLH) condition holds, i.e., the Hessian of the Lagrange function for the optimization problem of minimizing any nonzero linear function ℓ^T x on S is positive definite at the minimizer, then S is SDP representable. (ii) If each g_i(x) is either sos-concave (−∇²g_i(x) = W(x)^T W(x) for some matrix polynomial W(x)) or strictly quasi-concave on S, then S is SDP representable. (iii) If each S_i is either sos-convex or poscurv-convex (S_i is compact, convex and has smooth boundary with positive curvature), then S is SDP representable. This also holds for S_i for which ∂S_i ∩ S extends smoothly to the boundary of a poscurv-convex set containing S. (iv) We give the complexity of Schmüdgen's and Putinar's matrix Positivstellensätze, which are critical to the proofs of (i)-(iii).
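A concrete instance of LMI representability (a standard textbook example, not taken from the paper): the unit disk {x : 1 − x₁² − x₂² ≥ 0} equals the solution set of the 2×2 LMI [[1+x₁, x₂], [x₂, 1−x₁]] ⪰ 0, since a symmetric 2×2 matrix is PSD exactly when its trace and determinant are nonnegative, and here the trace is 2 and the determinant is 1 − x₁² − x₂².

```python
import numpy as np

def in_disk(x1, x2):
    # Semialgebraic description: one polynomial inequality.
    return 1 - x1**2 - x2**2 >= 0

def lmi_psd(x1, x2):
    # LMI description: M(x) must be positive semidefinite.
    M = np.array([[1 + x1, x2],
                  [x2, 1 - x1]])
    return np.linalg.eigvalsh(M).min() >= -1e-9

# The two descriptions agree on random test points.
rng = np.random.default_rng(1)
for x1, x2 in rng.uniform(-1.5, 1.5, size=(200, 2)):
    assert in_disk(x1, x2) == lmi_psd(x1, x2)
```

The paper's question is when a convex semialgebraic S admits such a representation, possibly only after projecting an LMI set from a higher-dimensional space.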
"Positive" noncommutative polynomials are sums of squares
 Ann. of Math.
, 2002
Abstract

Cited by 46 (8 self)
Hilbert's 17th problem concerns expressing polynomials on R^n as a sum of squares. It is well known that many positive polynomials are not sums of squares; see [R00], [deA preprint] for excellent surveys. In this paper we consider symmetric noncommutative polynomials and call one “matrix positive” if, whenever matrices of any size are substituted for the variables in the polynomial, the matrix value which the polynomial takes is positive semidefinite. The result in this paper is: a polynomial is matrix positive if and only if it is a sum of squares.
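The classical witness for that gap in the commutative case is the Motzkin polynomial m(x, y) = x⁴y² + x²y⁴ − 3x²y² + 1, which is nonnegative on R² by the AM-GM inequality yet is not a sum of squares of polynomials. A quick numerical check of the nonnegativity (the SOS obstruction itself is an algebraic argument, not reproduced here):

```python
import numpy as np

def motzkin(x, y):
    # AM-GM on (x^4 y^2, x^2 y^4, 1) gives
    #   (x^4 y^2 + x^2 y^4 + 1) / 3 >= x^2 y^2,
    # so m >= 0 everywhere, yet m is not a sum of squares.
    return x**4 * y**2 + x**2 * y**4 - 3 * x**2 * y**2 + 1

rng = np.random.default_rng(2)
pts = rng.uniform(-5.0, 5.0, size=(10_000, 2))
vals = motzkin(pts[:, 0], pts[:, 1])

# Nonnegative on the whole sample (small tolerance for rounding
# near the zeros of m).
assert vals.min() >= -1e-12

# The minimum value 0 is attained where |x| = |y| = 1.
assert motzkin(1.0, 1.0) == 0.0
```

The noncommutative result quoted above is striking precisely because it rules out such examples: in the matrix-substitution setting, positivity and SOS coincide.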