Results 1 – 7 of 7
Robust Solutions To Uncertain Semidefinite Programs
 SIAM J. OPTIMIZATION
, 1998
"... In this paper we consider semidefinite programs (SDPs) whose data depend on some unknown but bounded perturbation parameters. We seek "robust" solutions to such programs, that is, solutions which minimize the (worstcase) objective while satisfying the constraints for every possible value of paramet ..."
Abstract

Cited by 82 (8 self)
In this paper we consider semidefinite programs (SDPs) whose data depend on some unknown but bounded perturbation parameters. We seek "robust" solutions to such programs, that is, solutions which minimize the (worst-case) objective while satisfying the constraints for every possible value of parameters within the given bounds. Assuming the data matrices are rational functions of the perturbation parameters, we show how to formulate sufficient conditions for a robust solution to exist as SDPs. When the perturbation is "full," our conditions are necessary and sufficient. In this case, we provide sufficient conditions which guarantee that the robust solution is unique and continuous (Hölder-stable) with respect to the unperturbed problem's data. The approach can thus be used to regularize ill-conditioned SDPs. We illustrate our results with examples taken from linear programming, maximum norm minimization, polynomial interpolation, and integer programming.
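The robust-counterpart idea in this abstract can be sketched in the simpler linear programming setting: with entrywise box uncertainty in the constraint matrix and nonnegative variables, the worst case over the box is attained at the upper bounds, so the robust problem is again an LP. The data below are illustrative, not from the paper.

```python
import numpy as np
from scipy.optimize import linprog

# Nominal LP: minimize c^T x  s.t.  A x <= b, x >= 0.
# Each entry of A is only known to lie in [A - D, A + D] (box uncertainty).
# For x >= 0, the robust constraint "a^T x <= b for all a in the box"
# tightens to (A + D) x <= b, since the worst case takes every
# coefficient at its upper bound.
c = np.array([-1.0, -1.0])          # maximize x1 + x2
A = np.array([[1.0, 2.0],
              [3.0, 1.0]])
D = 0.1 * np.abs(A)                 # +/- 10% entrywise uncertainty
b = np.array([4.0, 6.0])

nominal = linprog(c, A_ub=A, b_ub=b, bounds=[(0, None)] * 2)
robust = linprog(c, A_ub=A + D, b_ub=b, bounds=[(0, None)] * 2)

print(nominal.fun, robust.fun)      # the robust optimum is never better
```

The robust solution sacrifices some nominal objective value in exchange for feasibility under every admissible perturbation; the paper develops the analogous (and harder) construction for SDP constraints.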
WORST-CASE VALUE-AT-RISK AND ROBUST PORTFOLIO OPTIMIZATION: A CONIC PROGRAMMING APPROACH
, 2001
"... Classical formulations of the portfolio optimization problem, such as meanvariance or ValueatRisk (VaR) approaches, can result in a portfolio extremely sensitive to errors in the data, such as mean and covariance matrix of the returns. In this paper we propose a way to alleviate this problem in a ..."
Abstract

Cited by 26 (1 self)
Classical formulations of the portfolio optimization problem, such as mean-variance or Value-at-Risk (VaR) approaches, can result in a portfolio extremely sensitive to errors in the data, such as the mean and covariance matrix of the returns. In this paper we propose a way to alleviate this problem in a tractable manner. We assume that the distribution of returns is partially known, in the sense that only bounds on the mean and covariance matrix are available. We define the worst-case Value-at-Risk as the largest VaR attainable, given the partial information on the returns' distribution. We consider the problem of computing and optimizing the worst-case VaR, and we show that these problems can be cast as semidefinite programs. We extend our approach to various other partial information on the distribution, including uncertainty in factor models, support constraints, and relative entropy information.
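For a fixed portfolio, the worst-case VaR over all distributions with a given mean and covariance has a closed form via a Chebyshev-type bound, with multiplier sqrt((1-eps)/eps); that evaluation step can be sketched directly (the numbers below are illustrative, and optimizing over portfolios is where the paper's semidefinite programming machinery comes in).

```python
import numpy as np

def worst_case_var(w, mean, cov, eps=0.05):
    """Worst-case Value-at-Risk of portfolio w at level eps, over all
    return distributions with the given mean and covariance.  Uses the
    Chebyshev-type multiplier kappa = sqrt((1 - eps) / eps); the loss
    is measured as -w^T r for return vector r."""
    kappa = np.sqrt((1.0 - eps) / eps)
    return kappa * np.sqrt(w @ cov @ w) - w @ mean

# Illustrative two-asset data (not from the paper).
mean = np.array([0.05, 0.02])
cov = np.array([[0.040, 0.006],
                [0.006, 0.010]])
w = np.array([0.5, 0.5])
print(worst_case_var(w, mean, cov))
```

Note how loose the bound is at small eps: kappa grows like 1/sqrt(eps), which is the price of assuming nothing about the distribution beyond its first two moments.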
SOCP: Software for Second-Order Cone Programming
 Laboratory, Stanford University
, 1997
"... granted, provided that this entire notice is included in all copies of any software which is or includes a copy or modi cation of this software and in all copies of the supporting documentation for such software. This software is being provided \as is", without any express or implied warranty. ..."
Abstract

Cited by 9 (4 self)
granted, provided that this entire notice is included in all copies of any software which is or includes a copy or modification of this software and in all copies of the supporting documentation for such software. This software is being provided "as is", without any express or implied warranty. In particular, the authors do not make any representation or warranty of any kind concerning the merchantability of this software or its fitness for any particular purpose.
Robust Solutions to l_1, l_2, and ... Uncertain Linear Approximation Problems using Convex Optimization
"... We present minimax and stochastic formulations of some linear approximation problems with uncertain data in R equipped with the Euclidean (l 2 ), Absolutesum (l 1 ) or Chebyshev (l 1) norms. We then show that these problems can be solved using convex optimization. Our results parallel and extend ..."
Abstract

Cited by 7 (0 self)
We present minimax and stochastic formulations of some linear approximation problems with uncertain data in R^n equipped with the Euclidean (l_2), absolute-sum (l_1), or Chebyshev (l_∞) norms. We then show that these problems can be solved using convex optimization. Our results parallel and extend the work of El Ghaoui and Lebret on robust least squares [3], and the work of Ben-Tal and Nemirovski on robust conic convex optimization problems [1]. The theory presented here is useful for desensitizing solutions to ill-conditioned problems, or for computing solutions that guarantee a certain performance in the presence of uncertainty in the data.
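In the l_2 case, the robust least squares problem of El Ghaoui and Lebret [3] has a well-known closed-form worst-case residual: for perturbations [ΔA, Δb] of Frobenius norm at most rho, it equals ||Ax - b|| + rho*sqrt(||x||^2 + 1), so the robust problem is an unconstrained convex minimization. A minimal numerical sketch (random illustrative data):

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
A = rng.standard_normal((8, 3))
b = rng.standard_normal(8)
rho = 0.5                           # perturbation budget (illustrative)

def worst_case_residual(x):
    # max over ||[dA, db]||_F <= rho of ||(A + dA) x - (b + db)||_2
    return np.linalg.norm(A @ x - b) + rho * np.sqrt(x @ x + 1.0)

x_ls = np.linalg.lstsq(A, b, rcond=None)[0]        # nominal solution
x_rob = minimize(worst_case_residual, np.zeros(3)).x

# The robust solution trades nominal fit for worst-case guarantees.
print(worst_case_residual(x_rob), worst_case_residual(x_ls))
```

The rho*sqrt(||x||^2 + 1) term acts as a regularizer, which is exactly the desensitizing effect the abstract refers to.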
Control applications of nonlinear convex programming
 the 1997 IFAC Conference on Advanced Process Control
, 1998
"... Since 1984 there has been a concentrated e ort to develop e cient interiorpoint methods for linear programming (LP). In the last few years researchers have begun to appreciate a very important property of these interiorpoint methods (beyond their e ciency for LP): they extend gracefully to nonline ..."
Abstract

Cited by 6 (3 self)
Since 1984 there has been a concentrated effort to develop efficient interior-point methods for linear programming (LP). In the last few years researchers have begun to appreciate a very important property of these interior-point methods (beyond their efficiency for LP): they extend gracefully to nonlinear convex optimization problems. New interior-point algorithms for problem classes such as semidefinite programming (SDP) or second-order cone programming (SOCP) are now approaching the extreme efficiency of modern linear programming codes. In this paper we discuss three examples of areas of control where our ability to efficiently solve nonlinear convex optimization problems opens up new applications. In the first example we show how SOCP can be used to solve robust open-loop optimal control problems. In the second example, we show how SOCP can be used to simultaneously design the setpoint and feedback gains for a controller, and compare this method with the more standard approach. Our final application concerns analysis and synthesis via linear matrix inequalities and SDP. Submitted to a special issue of the Journal of Process Control, edited by Y. Arkun & S. Shah, for papers presented at the 1997 IFAC Conference on Advanced Process Control, June 1997, Banff. This and related papers available via anonymous FTP at
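An SOCP minimizes a linear objective subject to second-order cone constraints of the form ||A_i x + b_i||_2 <= c_i^T x + d_i. Production SOCP codes use interior-point methods; the toy instance below (a linear objective over a Euclidean ball, with illustrative data) is small enough that a general-purpose SLSQP solve suffices and can be checked against the closed-form optimum.

```python
import numpy as np
from scipy.optimize import minimize

# Toy SOCP: minimize c^T x  subject to  ||x - p||_2 <= r.
c = np.array([1.0, 2.0])
p = np.array([3.0, 1.0])
r = 1.0

res = minimize(lambda x: c @ x,
               p - 0.1 * c,          # feasible, smooth starting point
               constraints=[{"type": "ineq",
                             "fun": lambda x: r - np.linalg.norm(x - p)}],
               method="SLSQP")

# Closed-form optimum: x* = p - r * c / ||c||, value c^T p - r * ||c||.
print(res.x, res.fun)
```

LP is the special case where every cone constraint collapses to a halfspace, which is the "graceful extension" the abstract describes.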
Design of Fractional Delay Filters Using Convex Optimization
 IEEE Workshop on Appl. of Signal Processing to Audio and Acoustics
, 1997
"... Fractional sample delay (FD) filters are useful and necessary in many applications, such as the accurate steering of acoustic arrays [1], [2], delay lines for physical models of musical instruments [3] [4], and time delay estimation[5]. This paper addresses the design of finite impulse response (FIR ..."
Abstract

Cited by 6 (0 self)
Fractional sample delay (FD) filters are useful and necessary in many applications, such as the accurate steering of acoustic arrays [1], [2], delay lines for physical models of musical instruments [3], [4], and time delay estimation [5]. This paper addresses the design of finite impulse response (FIR) FD filters. The problem will be posed as a convex optimization problem in which the maximum modulus of the complex error will be minimized. Several design examples will be presented, along with an empirical formula for the filter order required to meet a given worst-case group delay error specification.
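The paper minimizes the maximum modulus of the complex frequency-response error, a convex minimax problem. As a self-contained sketch, the simpler least-squares variant on a frequency grid already shows the setup; the filter length, delay, grid density, and passband edge below are illustrative choices, not the paper's.

```python
import numpy as np

N = 21                                        # filter length (taps)
D = 10.3                                      # desired fractional delay
w = np.linspace(0, 0.8 * np.pi, 400)          # passband frequency grid

E = np.exp(-1j * np.outer(w, np.arange(N)))   # row k: [e^{-j w_k n}]_n
d = np.exp(-1j * w * D)                       # ideal delay response

# Stack real and imaginary parts so a real least-squares solve fits
# the complex response H(w) = sum_n h[n] e^{-j w n} to d(w).
A = np.vstack([E.real, E.imag])
y = np.concatenate([d.real, d.imag])
h = np.linalg.lstsq(A, y, rcond=None)[0]

err = np.abs(E @ h - d)
print(err.max())                              # worst complex error on grid
```

Replacing the sum-of-squares objective with `err.max()` gives the minimax design the paper actually solves, which requires a convex (second-order cone) solver rather than a linear least-squares solve.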