Results 1-10 of 44
Robust convex optimization
 Mathematics of Operations Research
, 1998
Abstract

Cited by 265 (22 self)
We study convex optimization problems for which the data is not specified exactly and it is only known to belong to a given uncertainty set U, yet the constraints must hold for all possible values of the data from U. The ensuing optimization problem is called robust optimization. In this paper we lay the foundation of robust convex optimization. In the main part of the paper we show that if U is an ellipsoidal uncertainty set, then for some of the most important generic convex optimization problems (linear programming, quadratically constrained programming, semidefinite programming and others) the corresponding robust convex program is either exactly, or approximately, a tractable problem which lends itself to efficient algorithms such as polynomial time interior point methods.
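The tractability claim for ellipsoidal uncertainty can be seen on a single linear constraint: the worst case of a^T x over the ellipsoid {a_bar + P u : ||u||_2 <= 1} has the closed form a_bar^T x + ||P^T x||_2, which turns the robust constraint into a second-order cone constraint. The sketch below is a toy illustration of that identity, not the paper's construction; the data a_bar, P, x are made up, and Monte-Carlo sampling of the ellipsoid boundary is used only as a check.

```python
import math, random

def worst_case_lhs(a_bar, P, x):
    """Closed-form worst case of a^T x over the ellipsoid
    {a_bar + P u : ||u||_2 <= 1}: a_bar^T x + ||P^T x||_2."""
    n = len(a_bar)
    nominal = sum(a_bar[i] * x[i] for i in range(n))
    Ptx = [sum(P[i][j] * x[i] for i in range(n)) for j in range(len(P[0]))]
    return nominal + math.sqrt(sum(v * v for v in Ptx))

def sampled_lhs(a_bar, P, x, trials=2000, seed=0):
    """Monte-Carlo lower estimate of the same worst case."""
    rng = random.Random(seed)
    n, k = len(a_bar), len(P[0])
    best = -float("inf")
    for _ in range(trials):
        u = [rng.gauss(0, 1) for _ in range(k)]
        norm = math.sqrt(sum(v * v for v in u)) or 1.0
        u = [v / norm for v in u]  # point on the unit sphere
        a = [a_bar[i] + sum(P[i][j] * u[j] for j in range(k)) for i in range(n)]
        best = max(best, sum(a[i] * x[i] for i in range(n)))
    return best

a_bar = [1.0, 2.0]
P = [[0.3, 0.0], [0.1, 0.2]]  # ellipsoid shape matrix (made-up data)
x = [2.0, -1.0]
wc = worst_case_lhs(a_bar, P, x)
mc = sampled_lhs(a_bar, P, x)
# The closed form dominates every sampled realization and is nearly attained.
assert mc <= wc + 1e-9 and wc - mc < 0.05
```

Imposing `a_bar^T x + ||P^T x||_2 <= b` for each constraint is exactly how a robust LP becomes a second-order cone program.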
Robust solutions to uncertain linear programs
 OR Letters
, 1999
Abstract

Cited by 231 (14 self)
We consider linear programs with uncertain parameters, lying in some prescribed uncertainty set, where part of the variables must be determined before the realization of the uncertain parameters ("non-adjustable variables"), while the other part are variables that can be chosen after the realization ("adjustable variables"). We extend the Robust Optimization methodology ([1, 4, 5, 6, 7, 9, 13, 14]) to this situation by introducing the Adjustable Robust Counterpart (ARC) associated with an LP of the above structure. Often the ARC is significantly less conservative than the usual Robust Counterpart (RC); however, in most cases the ARC is computationally intractable (NP-hard). This difficulty is addressed by restricting the adjustable variables to be affine functions of the uncertain data. The ensuing Affinely Adjustable Robust Counterpart (AARC) problem is then shown, in certain important cases, to be equivalent to a tractable optimization problem (typically an LP or a semidefinite program), and in other cases to have a tight tractable approximation. The AARC approach is illustrated by applying it to a multistage inventory management problem.
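One reason affine decision rules restore tractability: once the recourse variable is affine in the uncertain parameter, every constraint is affine in that parameter, so it holds over a box of realizations iff it holds at the box's endpoints. The toy below (a hypothetical single-item ordering example, not the paper's inventory model) checks an affine rule y(d) = y0 + y1*d against an uncertain demand d in [1, 3] this way.

```python
def robust_feasible(q, y0, y1, d_lo=1.0, d_hi=3.0):
    """Check q + y(d) >= d and y(d) >= 0 for every d in [d_lo, d_hi],
    where y(d) = y0 + y1*d is an affine decision rule (the second-stage
    "adjustable" order placed after demand d is observed).
    Both constraints are affine in d, so checking the interval's
    endpoints, i.e. the vertices of the uncertainty box, is exact."""
    for d in (d_lo, d_hi):
        y = y0 + y1 * d
        if y < -1e-9 or q + y < d - 1e-9:
            return False
    return True

# Adjustable rule: after seeing demand d, top up by y(d) = d - 1.
assert robust_feasible(q=1.0, y0=-1.0, y1=1.0)
# Non-adjustable rule (y fixed at 0): q = 1 cannot cover d = 3.
assert not robust_feasible(q=1.0, y0=0.0, y1=0.0)
```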
Robust Solutions to Least-Squares Problems with Uncertain Data
, 1997
Abstract

Cited by 145 (12 self)
We consider least-squares problems where the coefficient matrices A, b are unknown but bounded. We minimize the worst-case residual error using (convex) second-order cone programming, yielding an algorithm with complexity similar to one singular value decomposition of A. The method can be interpreted as a Tikhonov regularization procedure, with the advantage that it provides an exact bound on the robustness of the solution, and a rigorous way to compute the regularization parameter. When the perturbation has a known (e.g., Toeplitz) structure, the same problem can be solved in polynomial time using semidefinite programming (SDP). We also consider the case when A, b are rational functions of an unknown-but-bounded perturbation vector. We show how to minimize (via SDP) upper bounds on the optimal worst-case residual. We provide numerical examples, including one from robust identification and one from robust interpolation. Key Words. Least-squares, uncertainty, robustness, second-order cone...
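For a fixed x and a joint perturbation [dA, db] bounded in Frobenius norm by rho, the worst-case residual has the known closed form ||Ax - b||_2 + rho*sqrt(||x||^2 + 1), attained by a rank-one perturbation; this is the quantity the second-order cone program then minimizes over x. The sketch below uses made-up data and plain Python rather than a cone solver: it constructs the attaining perturbation and confirms it matches the formula.

```python
import math

def residual(A, b, x):
    """||Ax - b||_2 for list-of-lists A and lists b, x."""
    m, n = len(A), len(x)
    r = [sum(A[i][j] * x[j] for j in range(n)) - b[i] for i in range(m)]
    return math.sqrt(sum(v * v for v in r))

def worst_case_residual(A, b, x, rho):
    """Closed form: ||Ax - b||_2 + rho*sqrt(||x||^2 + 1), the maximum of
    ||(A+dA)x - (b+db)||_2 over ||[dA, db]||_F <= rho."""
    return residual(A, b, x) + rho * math.sqrt(sum(v * v for v in x) + 1.0)

def attaining_perturbation(A, b, x, rho):
    """Rank-one [dA, db] = rho * r * u^T with r the normalized residual and
    u the normalized vector (x, -1); this perturbation attains the bound."""
    m, n = len(A), len(x)
    res = [sum(A[i][j] * x[j] for j in range(n)) - b[i] for i in range(m)]
    nr = math.sqrt(sum(v * v for v in res))
    r = [v / nr for v in res]
    u = list(x) + [-1.0]
    nu = math.sqrt(sum(v * v for v in u))
    u = [v / nu for v in u]
    dA = [[rho * r[i] * u[j] for j in range(n)] for i in range(m)]
    db = [rho * r[i] * u[n] for i in range(m)]
    return dA, db

A = [[1.0, 2.0], [0.0, 1.0], [1.0, -1.0]]  # illustrative data
b = [1.0, 0.5, 0.0]
x = [0.4, 0.2]
rho = 0.1
dA, db = attaining_perturbation(A, b, x, rho)
Ap = [[A[i][j] + dA[i][j] for j in range(2)] for i in range(3)]
bp = [b[i] + db[i] for i in range(3)]
assert abs(residual(Ap, bp, x) - worst_case_residual(A, b, x, rho)) < 1e-9
```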
Robust discrete optimization and network flows
 Mathematical Programming Series B
, 2003
Abstract

Cited by 125 (23 self)
We propose an approach to address data uncertainty for discrete optimization and network flow problems that allows controlling the degree of conservatism of the solution, and is computationally tractable both practically and theoretically. In particular, when both the cost coefficients and the data in the constraints of an integer programming problem are subject to uncertainty, we propose a robust integer programming problem of moderately larger size that allows controlling the degree of conservatism of the solution in terms of probabilistic bounds on constraint violation. When only the cost coefficients are subject to uncertainty and the problem is a 0-1 discrete optimization problem on n variables, then we solve the robust counterpart by solving at most n+1 instances of the original problem. Thus, the robust counterpart of a polynomially solvable 0-1 discrete optimization problem remains polynomially solvable. In particular, robust matching, spanning tree, shortest path, matroid intersection, etc. are polynomially solvable. We also show that the robust counterpart of an NP-hard α-approximable 0-1 discrete optimization problem remains α-approximable. Finally, we propose an algorithm for robust network flows that solves the robust counterpart by solving a polynomial number of nominal minimum cost flow problems in a modified network.
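The "degree of conservatism" here is typically controlled by a budget Γ: at most Γ cost coefficients c_j deviate from their nominal value c_bar[j] up to c_bar[j] + d[j]. For a fixed 0-1 vector x, the worst case is then the nominal cost plus the Γ largest deviations among the chosen items. The sketch below (made-up data, a check of the protection term only, not the n+1-solve algorithm) verifies that shortcut against brute-force subset enumeration.

```python
from itertools import combinations

def robust_cost(c_bar, d, x, Gamma):
    """Nominal cost plus the Gamma largest deviations among chosen items:
    worst case of sum_j c_j x_j when at most Gamma coefficients c_j
    move from c_bar[j] up to c_bar[j] + d[j]."""
    n = len(x)
    nominal = sum(c_bar[j] * x[j] for j in range(n))
    devs = sorted((d[j] * x[j] for j in range(n)), reverse=True)
    return nominal + sum(devs[:Gamma])

def robust_cost_brute(c_bar, d, x, Gamma):
    """Same quantity by enumerating every deviation subset of size <= Gamma."""
    n = len(x)
    best = 0.0
    for k in range(Gamma + 1):
        for S in combinations(range(n), k):
            best = max(best, sum(d[j] * x[j] for j in S))
    return sum(c_bar[j] * x[j] for j in range(n)) + best

c_bar = [3.0, 1.0, 2.0, 4.0]   # nominal costs (made up)
d = [1.0, 2.5, 0.5, 1.5]       # maximal deviations (made up)
x = [1, 1, 0, 1]               # a 0-1 selection
for Gamma in range(5):
    assert abs(robust_cost(c_bar, d, x, Gamma)
               - robust_cost_brute(c_bar, d, x, Gamma)) < 1e-12
```

Γ = 0 recovers the nominal problem and Γ = n the fully conservative worst case, which is the dial on conservatism the abstract describes.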
Robust solutions of Linear Programming problems contaminated with uncertain data
 Mathematical Programming
, 2000
Abstract

Cited by 101 (6 self)
Optimal solutions of Linear Programming problems may become severely infeasible if the nominal data is slightly perturbed. We demonstrate this phenomenon by studying 90 LPs from the well-known NETLIB collection. We then apply the Robust Optimization methodology (Ben-Tal and Nemirovski [13]; El Ghaoui et al. [5,6]) to produce "robust" solutions of the above LPs which are in a sense immunized against uncertainty. Surprisingly, for the NETLIB problems these robust solutions lose almost nothing in optimality.
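The phenomenon is easy to reproduce on a toy constraint (this is an illustration of the effect, not one of the NETLIB instances): when an optimal solution sits exactly on a constraint boundary and the variables are large, a 0.1% coefficient perturbation can produce a violation that is enormous relative to the constraint.

```python
def violation(a, x, rhs):
    """Amount by which a^T x <= rhs is violated (0 if satisfied)."""
    return max(0.0, sum(ai * xi for ai, xi in zip(a, x)) - rhs)

# Nominal constraint x1 - x2 <= 0, tight at the (made-up) optimum (1000, 1000).
x_opt = [1000.0, 1000.0]
assert violation([1.0, -1.0], x_opt, 0.0) == 0.0

# Perturb one coefficient by 0.1%: the near-cancellation is destroyed and
# the old optimum violates the constraint by about 1.0, a huge amount
# relative to the nominal right-hand side of 0.
assert abs(violation([1.001, -1.0], x_opt, 0.0) - 1.0) < 1e-9
```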
Robust Semidefinite Programming
 Handbook on Semidefinite Programming, Kluwer Academic Publishers
Abstract

Cited by 37 (17 self)
In this paper, we consider semidefinite programs where the data is only known to belong to some uncertainty set U. Following recent work by the authors, we develop the notion of robust solutions to such problems, which are required to satisfy the (uncertain) constraints whatever the value of the data in U. Even when the decision variable is fixed, checking robust feasibility is in general NP-hard. For a number of uncertainty sets U, we show how to compute robust solutions, based on a sufficient condition for robust feasibility, via SDP. We detail some cases when the sufficient condition is also necessary, such as linear programming or convex quadratic programming with ellipsoidal uncertainty. Finally, we provide examples, taken from interval computations and truss topology design.
Tractable approximations of robust conic optimization problems
, 2006
Abstract

Cited by 34 (11 self)
In earlier proposals, the robust counterpart of conic optimization problems exhibits a lateral increase in complexity, i.e., robust linear programming problems (LPs) become second-order cone problems (SOCPs), robust SOCPs become semidefinite programming problems (SDPs), and robust SDPs become NP-hard. We propose a relaxed robust counterpart for general conic optimization problems that (a) preserves the computational tractability of the nominal problem; specifically, the robust conic optimization problem retains its original structure, i.e., robust LPs remain LPs, robust SOCPs remain SOCPs and robust SDPs remain SDPs; and (b) allows us to provide a guarantee on the probability that the robust solution is feasible when the uncertain coefficients obey independent and identically distributed normal distributions.
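The simplest example of a robust counterpart that keeps the nominal problem's structure is box (interval) uncertainty on a linear constraint: the worst case of a^T x with each a_j in [a_bar_j - delta_j, a_bar_j + delta_j] is a_bar^T x + sum_j delta_j*|x_j|, and the absolute values linearize with auxiliary variables, so a robust LP stays an LP. This is a simpler analogue of the structure-preservation point, not the paper's norm-based construction; data below are made up.

```python
from itertools import product

def robust_lhs_box(a_bar, delta, x):
    """Worst case of a^T x over the box a_j in [a_bar_j +/- delta_j]:
    a_bar^T x + sum_j delta_j * |x_j|. Splitting |x_j| into auxiliary
    variables t_j >= x_j, t_j >= -x_j keeps the constraint linear."""
    return sum(a_bar[j] * x[j] + delta[j] * abs(x[j]) for j in range(len(x)))

def robust_lhs_vertices(a_bar, delta, x):
    """Same worst case by enumerating the 2^n vertices of the box."""
    best = -float("inf")
    for signs in product((-1.0, 1.0), repeat=len(x)):
        a = [a_bar[j] + s * delta[j] for j, s in enumerate(signs)]
        best = max(best, sum(a[j] * x[j] for j in range(len(x))))
    return best

a_bar, delta, x = [1.0, -2.0, 0.5], [0.2, 0.1, 0.3], [3.0, 1.0, -2.0]
assert abs(robust_lhs_box(a_bar, delta, x)
           - robust_lhs_vertices(a_bar, delta, x)) < 1e-12
```

With an ellipsoidal (2-norm) set the same constraint instead becomes second-order conic, which is the "lateral increase" the abstract refers to.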
On tractable approximations of uncertain linear matrix inequalities affected by interval uncertainty
 SIAM Journal on Optimization
, 2002
Abstract

Cited by 34 (10 self)
Abstract. We present efficiently verifiable sufficient conditions for the validity of specific NP-hard semi-infinite systems of Linear Matrix Inequalities (LMIs) arising from LMIs with uncertain data, and demonstrate that these conditions are "tight" up to an absolute constant factor. In particular, we prove that given an n × n interval matrix family U_ρ = {A : |A_ij − A*_ij| ≤ ρ C_ij}, one can build a computable lower bound, accurate within the factor π/2, on the supremum of those ρ for which all instances of U_ρ share a common quadratic Lyapunov function. We then obtain a similar result for the problem of Quadratic Lyapunov Stability Synthesis. Finally, we apply our techniques to the problem of maximizing a homogeneous polynomial of degree 3 over the unit cube. Key words. Robust semidefinite optimization, data uncertainty, Lyapunov stability synthesis, relaxations of combinatorial problems. AMS subject classifications. 90C05, 90C25, 90C30
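A concrete feel for the object involved: for a FIXED candidate P, the Lyapunov inequality A^T P + P A < 0 is affine in A, so it holds on the whole interval box iff it holds at the box's vertex matrices. The catch is that there are 2^(n^2) vertices, which is one way to see why the general problem is hard. The sketch below (a 2x2 toy with made-up data, not the paper's bounding construction) runs this exact vertex check with P = I.

```python
from itertools import product

def negdef_2x2(S):
    """Symmetric 2x2 S is negative definite iff S[0][0] < 0 and det(S) > 0."""
    return S[0][0] < 0 and S[0][0] * S[1][1] - S[0][1] * S[1][0] > 0

def identity_is_common_lyapunov(A_star, C, rho):
    """Check whether P = I certifies every A with |A_ij - A*_ij| <= rho*C_ij,
    i.e. A + A^T < 0 on the whole box. Since A + A^T is affine in A,
    checking the 2^(n*n) vertex matrices is exact for this fixed P."""
    n = 2
    for signs in product((-1.0, 1.0), repeat=n * n):
        A = [[A_star[i][j] + signs[i * n + j] * rho * C[i][j]
              for j in range(n)] for i in range(n)]
        S = [[A[i][j] + A[j][i] for j in range(n)] for i in range(n)]
        if not negdef_2x2(S):
            return False
    return True

A_star = [[-3.0, 1.0], [0.0, -3.0]]   # nominal matrix (made up)
C = [[1.0, 1.0], [1.0, 1.0]]          # perturbation pattern
assert identity_is_common_lyapunov(A_star, C, 0.5)
assert not identity_is_common_lyapunov(A_star, C, 3.0)
```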
A Robust Optimization Perspective of Stochastic Programming
, 2005
Abstract

Cited by 27 (9 self)
In this paper, we introduce an approach for constructing uncertainty sets for robust optimization using new deviation measures for bounded random variables known as the forward and backward deviations. These deviation measures capture distributional asymmetry and lead to better approximations of chance constraints. We also propose a tractable robust optimization approach for obtaining robust solutions to a class of stochastic linear optimization problems where the risk of infeasibility can be tolerated as a tradeoff to improve upon the objective value. An attractive feature of the framework is the computational scalability to multiperiod models. We show an application of the framework for solving a project management problem with uncertain activity completion time.
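Assuming the usual definition associated with these measures, the forward deviation of a bounded random variable x~ is sup over theta > 0 of sqrt(2 ln E[exp(theta (x~ - E x~))] / theta^2), and the backward deviation is the same quantity for -x~; both dominate the standard deviation, and asymmetry makes them differ. The sketch below estimates them on a grid for a made-up right-skewed two-point distribution; it is an illustration of the definition, not the paper's procedure.

```python
import math

def forward_deviation(values, probs, thetas=None):
    """Grid estimate of sup_{theta>0} sqrt(2 ln E[exp(theta(x - mean))] / theta^2)
    for a discrete distribution; apply it to -x to get the backward deviation."""
    mean = sum(v * p for v, p in zip(values, probs))
    if thetas is None:
        thetas = [0.01 * k for k in range(1, 1001)]  # grid over (0, 10]
    best = 0.0
    for t in thetas:
        mgf = sum(p * math.exp(t * (v - mean)) for v, p in zip(values, probs))
        best = max(best, math.sqrt(2.0 * math.log(mgf) / (t * t)))
    return best

# Right-skewed two-point distribution with mean 0 and standard deviation 1.
values, probs = [2.0, -0.5], [0.2, 0.8]
fwd = forward_deviation(values, probs)
bwd = forward_deviation([-v for v in values], probs)
# The long right tail inflates the forward deviation above both the
# backward deviation and the standard deviation, capturing the asymmetry
# that symmetric uncertainty sets would miss.
assert fwd > bwd and fwd > 1.0 and abs(bwd - 1.0) < 0.02
```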