Results 1–10 of 27
Convergent SDP Relaxations in Polynomial Optimization with Sparsity
 SIAM Journal on Optimization
Cited by 58 (16 self)
Abstract. We consider a polynomial programming problem P on a compact semialgebraic set K ⊂ R^n, described by m polynomial inequalities gj(X) ≥ 0, and with criterion f ∈ R[X]. We propose a hierarchy of semidefinite relaxations in the spirit of those of Waki et al. [9]. In particular, the SDP relaxation of order r has the following two features: (a) the number of variables is O(κ^(2r)), where κ = max[κ1, κ2], with κ1 (resp. κ2) being the maximum number of variables appearing in the monomials of f (resp. appearing in a single constraint gj(X) ≥ 0); (b) the largest size of the LMIs (Linear Matrix Inequalities) is O(κ^r). This compares with the respective number of variables O(n^(2r)) and LMI size O(n^r) in the original SDP relaxations defined in [11]. Therefore, great computational savings are expected in the case of sparsity in the data {gj, f}, i.e. when κ is small, a frequent case in practical applications of interest. The novelty with respect to [9] is that we prove convergence to the global optimum of P when the sparsity pattern satisfies a condition often encountered in large problems of practical interest, known as the running intersection property in graph theory. In such cases, and as a byproduct, we also obtain a new representation result for polynomials positive on a basic closed semialgebraic set: a sparse version of Putinar's Positivstellensatz [16].
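The stated savings can be made concrete with a back-of-envelope count (an illustrative sketch, not from the paper): the order-r moment relaxation over k variables has one variable per monomial of degree at most 2r, i.e. C(k + 2r, 2r) of them, and its largest moment matrix is indexed by monomials of degree at most r.

```python
from math import comb

def num_moments(n_vars, order):
    """Number of monomials of degree <= 2*order in n_vars variables,
    i.e. the variable count of the order-r moment relaxation."""
    return comb(n_vars + 2 * order, 2 * order)

def lmi_size(n_vars, order):
    """Side length of the largest moment matrix (monomials of degree <= order)."""
    return comb(n_vars + order, order)

# Illustrative dense problem: n = 20 variables, relaxation order r = 2,
# versus sparse blocks touching only kappa = 3 variables each.
n, kappa, r = 20, 3, 2
print(num_moments(n, r), lmi_size(n, r))          # 10626 variables, 231 x 231 LMI
print(num_moments(kappa, r), lmi_size(kappa, r))  # 35 variables, 10 x 10 LMI
```

Even at this modest size, the κ-based blocks are orders of magnitude smaller than the dense relaxation, which is the point of the O(κ^(2r)) versus O(n^(2r)) comparison.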
LMI techniques for optimization over polynomials in control: a survey
 IEEE Transactions on Automatic Control
Cited by 34 (17 self)
Abstract—Numerous tasks in control systems involve optimization problems over polynomials, and unfortunately these problems are in general nonconvex. In order to cope with this difficulty, linear matrix inequality (LMI) techniques have been introduced, because they allow one to obtain bounds on the sought solution by solving convex optimization problems, and because the conservatism of these bounds can in general be decreased by suitably increasing the size of the problems. This survey aims to provide the reader with a significant overview of the LMI techniques used in control systems for tackling optimization problems over polynomials, describing approaches such as decomposition into sums of squares, the Positivstellensatz, the theory of moments, Pólya's theorem, and matrix dilation. Moreover, it aims to provide a collection of the essential problems in control systems where these LMI techniques are used, such as stability and performance investigations in nonlinear systems, uncertain systems, time-delay systems, and genetic regulatory networks. It is expected that this survey may be a concise and useful reference for all readers.
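The link between polynomials and LMIs that underlies these techniques is the Gram-matrix form: f(x) = z(x)^T Q z(x) for a vector z(x) of monomials, and f is a sum of squares exactly when some such Q is positive semidefinite. A minimal hand-worked sketch (no SDP solver; the polynomial and its Gram matrix are chosen for illustration):

```python
def eval_gram(Q, x):
    """Evaluate z(x)^T Q z(x) for the monomial vector z(x) = (1, x, x^2)."""
    z = [1.0, x, x * x]
    return sum(Q[i][j] * z[i] * z[j] for i in range(3) for j in range(3))

# f(x) = x^4 + 2x^2 + 1.  One Gram matrix is Q = v v^T with v = (1, 0, 1):
# rank one, hence PSD, exhibiting the SOS certificate
# f(x) = (v . z(x))^2 = (1 + x^2)^2.
v = [1.0, 0.0, 1.0]
Q = [[vi * vj for vj in v] for vi in v]

for x in [-2.0, -0.5, 0.0, 1.0, 3.0]:
    f = x**4 + 2 * x**2 + 1
    assert abs(eval_gram(Q, x) - f) < 1e-9
```

Searching over PSD matrices Q satisfying the linear coefficient-matching constraints is exactly the LMI feasibility problem the survey's methods construct.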
Approximations of nonnegative polynomials via simple high degree perturbations
 Math. Z
Cited by 27 (14 self)
Abstract. We show that every real polynomial f nonnegative on [−1, 1]^n can be approximated, in the l1-norm of coefficients, by a sequence of polynomials {f_εr} that are sums of squares. This complements the existence of s.o.s. approximations in the denseness result of Berg, Christensen and Ressel, as we provide a very simple and explicit approximation sequence. We then show that if the moment problem holds for a basic closed semialgebraic set K_S ⊂ R^n with nonempty interior, then every polynomial nonnegative on K_S can be approximated in a similar fashion by elements of the corresponding preordering. Finally, we show that the degree of the perturbation in the approximating sequence depends on ε as well as on the degree and the size of the coefficients of the nonnegative polynomial f, but not on the specific values of its coefficients.
Global optimization of polynomials using gradient tentacles and sums of squares
 SIAM Journal on Optimization
Cited by 26 (0 self)
We consider the problem of computing the global infimum of a real polynomial f on R^n. Every global minimizer of f lies on its gradient variety, i.e., the algebraic subset of R^n where the gradient of f vanishes. If f attains a minimum on R^n, it is therefore equivalent to look for the greatest lower bound of f on its gradient variety. Nie, Demmel and Sturmfels recently proved a theorem about the existence of sums of squares certificates for such lower bounds. Based on these certificates, they find arbitrarily tight relaxations of the original problem that can be formulated as semidefinite programs and thus solved efficiently. We deal here with the more general case when f is bounded from below but does not necessarily attain a minimum. In this case, the method of Nie, Demmel and Sturmfels might yield completely wrong results. In order to overcome this problem, we replace the gradient variety by larger semialgebraic subsets of R^n which we call gradient tentacles. It now becomes substantially harder to prove the existence of the necessary sums of squares certificates.
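The gradient-variety idea is easy to see in one variable (an illustrative toy example, not from the paper): for a coercive polynomial that attains its minimum, the global infimum is the least value of f over the finitely many critical points.

```python
def f(x):
    """A coercive univariate polynomial that attains its minimum on R."""
    return x**4 - 2 * x**2 + 3

def grad_f(x):
    return 4 * x**3 - 4 * x

# The gradient variety {x : f'(x) = 0} is {0, 1, -1},
# the roots of 4x(x - 1)(x + 1).
critical_points = [0.0, 1.0, -1.0]
assert all(abs(grad_f(x)) < 1e-12 for x in critical_points)

# Since f attains its minimum, the global infimum equals the least
# critical value, here f(+-1) = 2.
print(min(f(x) for x in critical_points))  # 2.0
```

The paper's concern is precisely the case this toy example avoids: when f is bounded below but the infimum is not attained, the gradient variety can miss it, motivating the larger "gradient tentacle" sets.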
Sparse SOS relaxations for minimizing functions that are summations of small polynomials
 SIAM Journal on Optimization
, 2008
Cited by 23 (4 self)
This paper discusses how to find the global minimum of functions that are summations of small polynomials ("small" means involving a small number of variables). Some sparse sum of squares (SOS) techniques are proposed. We compare their computational complexity and lower bounds with prior SOS relaxations. Under certain conditions, we also discuss how to extract the global minimizers from these sparse relaxations. The proposed methods are especially useful in solving sparse polynomial systems and nonlinear least squares problems. Numerical experiments are presented, which show that the proposed methods significantly improve the computational performance of prior methods for solving these problems. Lastly, we present applications of this sparsity technique in solving polynomial systems derived from nonlinear differential equations and sensor network localization. Key words: polynomials, sum of squares (SOS), sparsity, nonlinear least squares, polynomial systems, nonlinear differential equations, sensor network localization
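For a summation of small polynomials the payoff is that one large Gram matrix splits into one small Gram matrix per term. An illustrative count (my own numbers, not the paper's): a chain f(x) = Σ f_i(x_i, x_{i+1}) over 50 variables, with SOS multipliers of degree at most 2.

```python
from math import comb

n, d = 50, 2        # 50 variables; SOS factors of degree <= d (certificates of degree 2d)
block_vars = 2      # each small polynomial f_i involves only (x_i, x_{i+1})
num_blocks = n - 1  # terms in the chain

dense_basis = comb(n + d, d)            # monomial basis for one big Gram matrix
sparse_basis = comb(block_vars + d, d)  # basis for each per-term Gram matrix

print(dense_basis)               # 1326: a single 1326 x 1326 Gram matrix
print(num_blocks, sparse_basis)  # 49 blocks, each only 6 x 6
```

Forty-nine 6 x 6 semidefinite blocks are far cheaper than one 1326 x 1326 block, which is the complexity comparison the paper makes against dense SOS relaxations.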
Maximum block improvement and polynomial optimization
 SIAM Journal on Optimization
Cited by 15 (5 self)
Abstract. In this paper we propose an efficient method for solving the spherically constrained homogeneous polynomial optimization problem. The new approach has the following three main ingredients. First, we establish a block coordinate descent type search method for nonlinear optimization, with the novelty being that we only accept a block update that achieves the maximum improvement, hence the name of our new search method: Maximum Block Improvement (MBI). Convergence of the sequence produced by the MBI method to a stationary point is proven. Second, we establish that maximizing a homogeneous polynomial over a sphere is equivalent to its tensor relaxation problem, thus we can maximize a homogeneous polynomial function over a sphere by its tensor relaxation via the MBI approach. Third, we propose a scheme to reach a KKT point of the polynomial optimization, provided that a stationary solution for the relaxed tensor problem is available. Numerical experiments have shown that our new method works very efficiently: for a majority of the test instances that we have experimented with, the method finds the global optimal solution at a low computational cost.
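A simplified relative of the block-update idea can be sketched for the bilinear form x^T A y over unit spheres, where each block update is available in closed form: with y fixed, the best x is Ay/||Ay||, and symmetrically for y. (This is plain alternating maximization, assumed here for illustration; MBI additionally updates only the block giving the largest improvement.) The optimum is the largest singular value of A.

```python
import math

def normalize(v):
    n = math.sqrt(sum(t * t for t in v))
    return [t / n for t in v]

def matvec(A, v):
    return [sum(a * t for a, t in zip(row, v)) for row in A]

def transpose(A):
    return [list(col) for col in zip(*A)]

def alternating_max(A, iters=50):
    """Maximize x^T A y over unit vectors x, y by exact block updates.
    Converges to a stationary value; for this A, the top singular value."""
    x = normalize([1.0] * len(A))
    y = normalize([1.0] * len(A[0]))
    for _ in range(iters):
        x = normalize(matvec(A, y))           # best x for fixed y
        y = normalize(matvec(transpose(A), x))  # best y for fixed x
    return sum(xi * t for xi, t in zip(x, matvec(A, y)))

A = [[3.0, 0.0], [0.0, 1.0]]
print(alternating_max(A))  # ~3.0, the top singular value of A
```

As in the abstract, each block subproblem is solved exactly and the objective value is monotonically nondecreasing, which is what drives convergence to a stationary point.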
SUFFICIENT CONDITIONS FOR A REAL POLYNOMIAL TO BE A SUM OF SQUARES
, 2007
Cited by 8 (3 self)
Abstract. We provide explicit sufficient conditions for a polynomial f to be a sum of squares (s.o.s.), linear in the coefficients of f. All conditions are simple and provide an explicit description of a convex polyhedral subcone of the cone of s.o.s. polynomials of degree at most 2d. We also provide a simple condition to ensure that f is s.o.s., possibly after adding a constant.
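To see how a sufficient condition can be linear in matrix (and hence polynomial) coefficients, here is a simpler classical condition in the same spirit, not the paper's own: a symmetric matrix with nonnegative diagonal that dominates its off-diagonal row sums is PSD, so any polynomial admitting such a Gram matrix is s.o.s., and diagonal dominance is a set of linear inequalities.

```python
def is_diag_dominant_psd_sufficient(Q):
    """Sufficient (not necessary) PSD test, linear in the entries of Q:
    nonnegative diagonal dominating the off-diagonal absolute row sums."""
    n = len(Q)
    return all(
        Q[i][i] >= sum(abs(Q[i][j]) for j in range(n) if j != i)
        for i in range(n)
    )

# Illustrative example: f(x) = 1 + 2x^2 + 2x^3 + 2x^4 with monomial
# vector z = (1, x, x^2) satisfies f(x) = z^T Q z for this Q.
Q = [[1.0, 0.0, 0.0],
     [0.0, 2.0, 1.0],
     [0.0, 1.0, 2.0]]
assert is_diag_dominant_psd_sufficient(Q)  # hence Q is PSD and f is s.o.s.

# sanity check that Q really represents f:
for x in [-1.5, 0.0, 0.7, 2.0]:
    z = [1.0, x, x * x]
    val = sum(Q[i][j] * z[i] * z[j] for i in range(3) for j in range(3))
    assert abs(val - (1 + 2 * x**2 + 2 * x**3 + 2 * x**4)) < 1e-9
```

Conditions of this polyhedral kind can be checked by linear programming, avoiding semidefinite programming entirely, which is the practical appeal of the paper's results.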
Projection methods in conic optimization
Cited by 5 (0 self)
Projection onto positive semidefinite matrices. Consider the space S^n of symmetric n-by-n matrices, equipped with the norm associated with the usual inner product ⟨X, Y⟩ = trace(XY) …
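The Frobenius-norm projection onto the PSD cone has a well-known closed form: diagonalize and clip the negative eigenvalues to zero. A self-contained sketch for the 2 x 2 case, using the explicit eigendecomposition of a symmetric 2 x 2 matrix (an illustration of the standard formula, not code from the paper):

```python
import math

def project_psd_2x2(a, b, c):
    """Frobenius-norm projection of the symmetric matrix [[a, b], [b, c]]
    onto the PSD cone: diagonalize and clip negative eigenvalues to zero."""
    if b == 0.0:
        return [[max(a, 0.0), 0.0], [0.0, max(c, 0.0)]]
    mean = (a + c) / 2.0
    radius = math.hypot((a - c) / 2.0, b)
    lam1, lam2 = mean + radius, mean - radius  # eigenvalues, lam1 >= lam2
    # unit eigenvector for lam1 (nonzero since b != 0 here)
    norm = math.hypot(b, lam1 - a)
    u = [b / norm, (lam1 - a) / norm]
    w = [-u[1], u[0]]  # orthogonal unit eigenvector for lam2
    p1, p2 = max(lam1, 0.0), max(lam2, 0.0)  # clipped eigenvalues
    return [
        [p1 * u[0] * u[0] + p2 * w[0] * w[0], p1 * u[0] * u[1] + p2 * w[0] * w[1]],
        [p1 * u[1] * u[0] + p2 * w[1] * w[0], p1 * u[1] * u[1] + p2 * w[1] * w[1]],
    ]

# [[1, 2], [2, 1]] has eigenvalues 3 and -1; clipping -1 to 0 leaves
# 3 * uu^T with u = (1, 1)/sqrt(2), i.e. [[1.5, 1.5], [1.5, 1.5]].
print(project_psd_2x2(1.0, 2.0, 1.0))
```

In general dimension the same clipping is applied to a full eigendecomposition; this projection is the basic building block of the projection methods the paper surveys.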
Regularization methods for sum of squares relaxations in large scale polynomial optimization
 Department of Mathematics, University of California
, 2009
Cited by 4 (0 self)
We study how to solve sum of squares (SOS) and Lasserre's relaxations for large-scale polynomial optimization. When interior-point type methods are used, typically only small or moderately large problems can be solved. This paper proposes regularization-type methods that can solve significantly larger problems. We first describe these methods for general conic semidefinite optimization, and then apply them to solve large-scale polynomial optimization. Their efficiency is demonstrated by extensive numerical computations. In particular, a general dense quartic polynomial optimization with 100 variables can be solved on a regular computer, which is almost impossible with prior existing SOS solvers. Key words: polynomial optimization, regularization methods, semidefinite programming, sum of squares, Lasserre's relaxation. AMS subject classification: 65K05, 90C22
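A quick count (my own back-of-envelope, using the standard monomial-basis sizes) shows why the dense quartic case with 100 variables defeats interior-point solvers:

```python
from math import comb

n = 100  # variables in a dense quartic polynomial optimization
d = 2    # SOS certificate of degree 2d = 4

gram_side = comb(n + d, d)         # side length of the Gram/moment matrix
num_vars = comb(n + 2 * d, 2 * d)  # monomial coefficients to match

print(gram_side)  # 5151: a 5151 x 5151 semidefinite matrix variable
print(num_vars)   # 4598126: ~4.6 million linear equality constraints
```

Interior-point methods form and factor dense linear systems of roughly this constraint dimension at every iteration, whereas the first-order regularization methods of the paper only need matrix operations of the Gram-matrix size, which is what makes the 100-variable instance tractable.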