Results 1 - 5 of 5
Semidefinite Representations for Finite Varieties
MATHEMATICAL PROGRAMMING, 2002
Abstract

Cited by 37 (7 self)
We consider the problem of minimizing a polynomial over a semialgebraic set defined by polynomial equalities and inequalities. When the polynomial equalities have a finite number of complex solutions and define a radical ideal, we can reformulate this problem as a semidefinite programming problem. This semidefinite program involves combinatorial moment matrices, which are matrices indexed by a basis of the quotient vector space R[x_1, ..., x_n]/I. Our arguments are elementary and extend known facts for the grid case, including 0/1 and polynomial programming. They also relate to known algebraic tools for solving polynomial systems of equations with finitely many complex solutions. Semidefinite approximations can be constructed by considering truncated combinatorial moment matrices; rank conditions are given (in the grid case) that ensure that the approximation solves the original problem at optimality.
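To make the grid case concrete, here is a minimal self-contained sketch (our own toy example; the polynomial, the grid {0, 1}, and the brute-force scan standing in for an SDP solver are all assumptions, not the paper's): minimizing p(x) = (x - 0.3)^2 over {0, 1} via the combinatorial moment matrix indexed by the quotient basis {1, x} of R[x]/I with I = <x^2 - x>.

```python
# Toy illustration (not from the paper): minimize p(x) = (x - 0.3)^2 over the
# grid {0, 1}.  The ideal I = <x^2 - x> has quotient basis {1, x}, so the
# moment variables are y0 = 1 and y1, and the combinatorial moment matrix is
# M(y) = [[1, y1], [y1, y1]]  (the product x * x reduces to x modulo I).

def reduced_objective(y1):
    # p(x) = x^2 - 0.6 x + 0.09 reduces mod (x^2 - x) to 0.4 x + 0.09,
    # i.e. the linear objective 0.4 * y1 + 0.09 in the moment variable y1.
    return 0.4 * y1 + 0.09

def is_psd_2x2(a, b, d, tol=1e-12):
    # [[a, b], [b, d]] is positive semidefinite iff a >= 0, d >= 0 and
    # a * d - b^2 >= 0.
    return a >= -tol and d >= -tol and a * d - b * b >= -tol

def solve_by_grid_scan(steps=1000):
    # Brute-force scan over y1 in place of an SDP solver: keep the points
    # where M(y) is PSD (which forces 0 <= y1 <= 1), minimize the objective.
    return min(reduced_objective(i / steps)
               for i in range(steps + 1)
               if is_psd_2x2(1.0, i / steps, i / steps))

print(solve_by_grid_scan())  # 0.09, attained at y1 = 0, i.e. at the point x = 0
```

PSD feasibility of the 2x2 moment matrix exactly recovers 0 <= y1 <= 1, so the optimal value 0.09 = p(0) matches the true minimum over {0, 1}; in practice the PSD constraint would be handed to an SDP solver rather than scanned.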
Revisiting Two Theorems of Curto and Fialkow on Moment Matrices, 2004
Abstract

Cited by 19 (4 self)
We revisit two results of Curto and Fialkow on moment matrices. The first result asserts that every sequence...
Approximation algorithms for homogeneous polynomial optimization with quadratic constraints, 2009
Abstract

Cited by 6 (3 self)
In this paper, we consider approximation algorithms for optimizing a generic multivariate homogeneous polynomial function subject to homogeneous quadratic constraints. Such optimization models have wide applications, e.g., in signal processing, magnetic resonance imaging (MRI), data training, approximation theory, and portfolio selection. Since polynomial functions are nonconvex in general, the problems under consideration are all NP-hard. In this paper we focus on polynomial-time approximation algorithms. In particular, we first study optimization of a multilinear tensor function over the Cartesian product of spheres. We propose approximation algorithms for such problems and derive worst-case performance ratios, which are shown to depend only on the dimensions of the models. The methods are then extended to optimize a generic multivariate homogeneous polynomial function with spherical constraints. Likewise, approximation algorithms are proposed with provable relative approximation performance ratios. Furthermore, the constraint set is relaxed to be an intersection of co-centered ellipsoids. In particular, we consider maximization of a homogeneous polynomial over the intersection of ellipsoids centered at the origin, and propose polynomial-time approximation algorithms with provable worst-case performance ratios. Numerical results are reported, illustrating the effectiveness of the approximation algorithms studied.
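For contrast with the worst-case-ratio algorithms described above, the multilinear problem over a product of spheres admits a simple block-coordinate ascent heuristic (our own illustrative sketch under assumed data, not the paper's method): with two of the blocks fixed, the objective is linear in the third, so the maximizing unit vector is just the normalized coefficient vector.

```python
import itertools
import math

def trilinear(A, x, y, z):
    # F(x, y, z) = sum_{i,j,k} A[i][j][k] * x_i * y_j * z_k
    n = len(x)
    return sum(A[i][j][k] * x[i] * y[j] * z[k]
               for i, j, k in itertools.product(range(n), repeat=3))

def normalize(v):
    s = math.sqrt(sum(t * t for t in v))
    return [t / s for t in v] if s > 0 else v

def block_ascent(A, iters=50):
    # Alternate over the three blocks; each update solves the subproblem
    # max_{||x|| = 1} <c, x> exactly, whose solution is x = c / ||c||.
    n = len(A)
    x = y = z = normalize([1.0] * n)
    for _ in range(iters):
        x = normalize([sum(A[i][j][k] * y[j] * z[k]
                           for j in range(n) for k in range(n)) for i in range(n)])
        y = normalize([sum(A[i][j][k] * x[i] * z[k]
                           for i in range(n) for k in range(n)) for j in range(n)])
        z = normalize([sum(A[i][j][k] * x[i] * y[j]
                           for i in range(n) for j in range(n)) for k in range(n)])
    return trilinear(A, x, y, z)

# For the rank-one tensor A_ijk = u_i * u_j * u_k with unit u, the optimum is 1.
u = [0.6, 0.8]
A = [[[u[i] * u[j] * u[k] for k in range(2)] for j in range(2)] for i in range(2)]
print(round(block_ascent(A), 6))  # 1.0
```

Unlike the paper's algorithms, this heuristic carries no approximation guarantee: on unstructured tensors it can stall at a stationary point, which is exactly what the provable worst-case ratios are designed to rule out.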
Tensor Principal Component Analysis via Convex Optimization, 2012
Abstract
This paper is concerned with the computation of the principal components for a general tensor, known as the tensor principal component analysis (PCA) problem. We show that the general tensor PCA problem is reducible to its special case where the tensor in question is supersymmetric with an even degree. In that case, the tensor can be embedded into a symmetric matrix. We prove that if the tensor is rank-one, then the embedded matrix must be rank-one too, and vice versa. The tensor PCA problem can thus be solved by means of matrix optimization under a rank-one constraint, for which we propose two solution methods: (1) imposing a nuclear norm penalty in the objective to enforce a low-rank solution; (2) relaxing the rank-one constraint by semidefinite programming. Interestingly, our experiments show that both methods yield a rank-one solution with high probability, thereby solving the original tensor PCA problem to optimality with high probability. To further cope with the size of the resulting convex optimization models, we propose to use the alternating direction method of multipliers, which significantly reduces the computational effort. Various extensions of the model are considered as well.
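One direction of the rank-one equivalence can be checked by hand for a supersymmetric fourth-order tensor: flattening T = u (x) u (x) u (x) u into an n^2 x n^2 matrix gives (u u^T) (x) (u u^T), which has rank one. A minimal sketch (the function names and the 2x2-minor rank test are our own, not the paper's):

```python
import itertools

def rank_one_embedding(u):
    # Flatten the supersymmetric 4th-order tensor T_ijkl = u_i u_j u_k u_l
    # into the n^2 x n^2 matrix M[(i, j), (k, l)] = T_ijkl.
    n = len(u)
    return [[u[i] * u[j] * u[k] * u[l] for k in range(n) for l in range(n)]
            for i in range(n) for j in range(n)]

def has_rank_at_most_one(M, tol=1e-12):
    # A matrix has rank <= 1 iff every 2x2 minor vanishes.
    rows, cols = len(M), len(M[0])
    return all(abs(M[r1][c1] * M[r2][c2] - M[r1][c2] * M[r2][c1]) < tol
               for r1, r2 in itertools.combinations(range(rows), 2)
               for c1, c2 in itertools.combinations(range(cols), 2))

print(has_rank_at_most_one(rank_one_embedding([0.6, 0.8])))  # True
```

A tensor that is the sum of two independent rank-one terms embeds into a rank-two matrix, so the minor test fails there, consistent with the "vice versa" direction of the equivalence.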
... algorithms for trilinear optimization with nonconvex constraints and its extensions, 2011