Results 1–10 of 13
Robust discrete optimization under ellipsoidal uncertainty sets
2004
Cited by 10 (0 self)
We address the complexity and practically efficient methods for robust discrete optimization under ellipsoidal uncertainty sets. Specifically, we show that the robust counterpart of a discrete optimization problem with correlated objective function data is NP-hard even though the nominal problem is polynomially solvable. For uncorrelated and identically distributed data, however, we show that the robust problem retains the complexity of the nominal problem. For uncorrelated, but not identically distributed data we propose an approximation method that solves the robust problem within arbitrary accuracy. We also propose a Frank-Wolfe type algorithm for this case, which we prove converges to a locally optimal solution, and in computational experiments is remarkably effective. Finally, we propose a generalization of the robust discrete optimization framework we proposed earlier that (a) allows the key parameter that controls the tradeoff between robustness and optimality to depend on the solution and (b) results in increased flexibility and decreased conservatism, while maintaining the complexity of the nominal problem.
Solving the Minimum Spanning Tree Problem in Stochastic Graphs Using Learning Automata
Proceedings of International Conference on Information Management and Engineering (ICIME 2009), Kuala Lumpur, 2009
Cited by 9 (2 self)
Abstract — In this paper, we propose some learning automata-based algorithms to solve the minimum spanning tree problem in stochastic graphs when the probability distribution function of the edge weights is unknown. In these algorithms, at each stage a set of learning automata determines which edges are to be sampled. This sampling method may reduce the number of unnecessary samples and hence the running time of the algorithms. The proposed algorithm reduces the number of samples that need to be taken from the edges of the stochastic graph by a sample average approximation method. It is shown that, by proper choice of the parameters of the proposed algorithms, the probability that the algorithms find the optimal solution can be made as close to unity as possible. Keywords: Learning automata; Minimum spanning tree; Stochastic graph
Using Sparsification for Parametric Minimum Spanning Tree Problems
Nordic J. Computing, 1996
Cited by 8 (2 self)
Two applications of sparsification to parametric computing are given. The first is a fast algorithm for enumerating all distinct minimum spanning trees in a graph whose edge weights vary linearly with a parameter. The second is an asymptotically optimal algorithm for the minimum ratio spanning tree problem, as well as other search problems, on dense graphs.

1 Introduction

In the parametric minimum spanning tree problem, one is given an n-node, m-edge undirected graph G where each edge e has a linear weight function w_e(λ) = a_e + λ b_e. Let Z(λ) denote the weight of the minimum spanning tree relative to the weights w_e(λ). It can be shown that Z(λ) is a piecewise linear concave function of λ [Gus80]; the points at which the slope of Z changes are called breakpoints. We shall present two results regarding parametric minimum spanning trees. First, we show that Z(λ) can be constructed in O(min{nm log n, T_MST(2n, n)... (affiliation footnote: Department of Computer Science, Iowa State University, Ames, IA...)
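The parametric setup above lends itself to a direct sketch: fix a few values of λ, recompute the MST under w_e(λ) = a_e + λ b_e, and check that the sampled Z(λ) values are consistent with concavity. The graph and coefficients below are illustrative, not taken from the paper.

```python
def mst_weight(n, edges, lam):
    """Weight of the MST under parametric weights a_e + lam * b_e.

    Kruskal's algorithm with union-find; edges are (u, v, a, b) tuples.
    """
    parent = list(range(n))

    def find(x):
        # Path-halving union-find lookup.
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    total = 0.0
    for u, v, a, b in sorted(edges, key=lambda e: e[2] + lam * e[3]):
        ru, rv = find(u), find(v)
        if ru != rv:          # edge joins two components: take it
            parent[ru] = rv
            total += a + lam * b
    return total

# Illustrative 4-node graph: (u, v, a_e, b_e).
edges = [(0, 1, 1.0, 2.0), (1, 2, 3.0, -1.0), (2, 3, 2.0, 0.5),
         (0, 2, 4.0, -2.0), (1, 3, 5.0, 1.0)]

# Z(lambda) sampled at three parameter values; piecewise linear and concave.
z = [mst_weight(4, edges, lam) for lam in (0.0, 1.0, 2.0)]
```

On this sample graph, Z(1.0) is at least the average of Z(0.0) and Z(2.0), as concavity requires.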
POLYMATROIDS AND MEAN-RISK MINIMIZATION IN DISCRETE OPTIMIZATION
2007
Cited by 6 (0 self)
Abstract. In financial markets high levels of risk are associated with large returns as well as large losses, whereas with lower levels of risk, the potential for either return or loss is small. Therefore, risk management is fundamentally concerned with finding an optimal trade-off between risk and return matching an investor's risk tolerance. Managing risk is studied mostly in a financial context; nevertheless, it is certainly relevant in any area with a significant source of uncertainty. The mean-risk trade-off is well-studied for problems with a convex feasible set. However, this is not the case in the discrete setting, even though, in practice, portfolios are often restricted to discrete choices. In this paper we study mean-risk minimization for problems with discrete decision variables. In particular, we consider discrete optimization problems with a submodular mean-risk minimization objective. We show the connection between extended polymatroids and the convex lower envelope of this mean-risk objective. For 0-1 problems a complete linear characterization of the convex lower envelope is given. For mixed 0-1 problems we derive an exponential class of conic quadratic inequalities that are separable with the greedy algorithm.
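For independent item weights with means μ_i and variances σ_i², a mean-risk objective of the kind studied here takes the form f(S) = Σ_{i∈S} μ_i + Ω √(Σ_{i∈S} σ_i²); since the square root is concave, f is submodular. A minimal numeric sketch of this diminishing-returns property (the numbers are illustrative, not from the paper):

```python
import math

def mean_risk(subset, mu, sigma2, omega):
    """f(S) = sum of means in S plus omega times the std. dev. of the sum."""
    return (sum(mu[i] for i in subset)
            + omega * math.sqrt(sum(sigma2[i] for i in subset)))

mu = [1.0, 2.0, 3.0]       # illustrative item means
sigma2 = [4.0, 1.0, 9.0]   # illustrative item variances
omega = 2.0                # risk-aversion weight

# Submodularity (diminishing returns): adding item 1 to the larger set {0}
# gains less than adding it to the empty set.
gain_small = mean_risk({1}, mu, sigma2, omega) - mean_risk(set(), mu, sigma2, omega)
gain_large = mean_risk({0, 1}, mu, sigma2, omega) - mean_risk({0}, mu, sigma2, omega)
```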
Maximizing expected utility for stochastic combinatorial optimization problems
In FOCS, 2011
Cited by 4 (2 self)
We study the stochastic versions of a broad class of combinatorial problems where the weights of the elements in the input dataset are uncertain. The class of problems that we study includes shortest paths, minimum weight spanning trees, and minimum weight matchings over probabilistic graphs, and other combinatorial problems like knapsack. We observe that the expected value is inadequate in capturing different types of risk-averse or risk-prone behaviors, and instead we consider a more general objective, which is to maximize the expected utility of the solution for some given utility function, rather than the expected weight (expected weight becomes a special case). We show that we can obtain a polynomial time approximation algorithm with additive error ε for any ε > 0, if there is a pseudopolynomial time algorithm for the exact version of the problem (this is true for the problems mentioned above) and the maximum value of the utility function is bounded by a constant. Our result generalizes several prior results on stochastic shortest path, stochastic spanning tree, and stochastic knapsack. Our algorithm for utility maximization makes use of the separability of exponential utility and a technique to decompose a general utility function into exponential utility functions, which may be useful in other stochastic optimization problems.
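The separability of exponential utility mentioned above can be illustrated directly: for u(x) = exp(-αx) and independent element weights, the expected utility of a sum factorizes into a product of per-element expectations. A small sketch with made-up two-point weight distributions (not taken from the paper):

```python
import math
from itertools import product

alpha = 0.5
# Each element's weight distribution: a list of (value, probability) pairs.
elements = [[(1.0, 0.5), (3.0, 0.5)],
            [(2.0, 0.25), (4.0, 0.75)]]

def expected_utility_direct(elements, alpha):
    """E[exp(-alpha * W)] by enumerating the joint distribution of W."""
    total = 0.0
    for combo in product(*elements):
        w = sum(v for v, _ in combo)           # realized total weight
        p = math.prod(pr for _, pr in combo)   # probability of this outcome
        total += p * math.exp(-alpha * w)
    return total

def expected_utility_factored(elements, alpha):
    """Same quantity via separability: product of per-element factors."""
    result = 1.0
    for dist in elements:
        result *= sum(pr * math.exp(-alpha * v) for v, pr in dist)
    return result
```

The factored form needs only one pass per element, which is what makes exponential utilities convenient building blocks for more general utility functions.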
Learning automata-based algorithms for solving stochastic minimum spanning tree problem
Applied Soft Computing, 2011
Cited by 4 (3 self)
Due to the hardness of solving the minimum spanning tree (MST) problem in stochastic environments, the stochastic MST (SMST) problem has not received the attention it merits, specifically when the probability distribution function (PDF) of the edge weight is not known a priori. In this paper, we first propose a learning automata-based sampling algorithm (Algorithm 1) to solve the MST problem in stochastic graphs where the PDF of the edge weight is assumed to be unknown. At each stage of the proposed algorithm, a set of learning automata is randomly activated and determines the graph edges that must be sampled in that stage. As the proposed algorithm proceeds, the sampling process focuses on the spanning tree with the minimum expected weight. Therefore, the proposed sampling method is capable of decreasing the rate of unnecessary samplings and shortening the time required for finding the SMST. The convergence of this algorithm is theoretically proved and it is shown that by a proper choice of the learning rate the spanning tree with the minimum expected weight can be found with a probability close enough to unity. Numerical results show that Algorithm 1 outperforms the standard sampling method. Selecting a proper learning rate is the most challenging issue in learning automata theory, by which a good trade-off can be achieved between the cost and efficiency of the algorithm. To improve the efficiency (i.e., the convergence speed and convergence rate) of Algorithm 1, we ...
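As a point of reference for the comparison above, the "standard sampling method" baseline can be sketched as: draw a fixed number of samples per edge, estimate each edge's mean weight, and run Kruskal on the estimates. Function names and distributions below are illustrative assumptions, not the paper's algorithm.

```python
import random

def sampled_mst(n, edge_dists, num_samples, seed=0):
    """Standard-sampling baseline: estimate each edge's mean weight from
    num_samples draws, then compute the MST of the estimates (Kruskal)."""
    rng = random.Random(seed)
    est = []
    for u, v, sampler in edge_dists:
        mean = sum(sampler(rng) for _ in range(num_samples)) / num_samples
        est.append((mean, u, v))

    parent = list(range(n))

    def find(x):
        # Path-halving union-find lookup.
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    tree = []
    for _, u, v in sorted(est):
        ru, rv = find(u), find(v)
        if ru != rv:
            parent[ru] = rv
            tree.append((u, v))
    return tree

# Illustrative 3-node stochastic graph: two cheap edges, one expensive edge.
dists = [(0, 1, lambda r: r.gauss(1.0, 0.1)),
         (1, 2, lambda r: r.gauss(1.0, 0.1)),
         (0, 2, lambda r: r.gauss(10.0, 0.1))]
tree = sampled_mst(3, dists, num_samples=50)
```

The automata-based algorithms in these entries aim to spend fewer samples than this fixed-budget baseline by concentrating sampling on promising edges.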
THE SUBMODULAR KNAPSACK POLYTOPE
FORTHCOMING IN DISCRETE OPTIMIZATION, 2009
Cited by 1 (1 self)
The submodular knapsack set is the discrete lower level set of a submodular function. The modular case reduces to the classical linear 0-1 knapsack set. One motivation for studying the submodular knapsack polytope is to address 0-1 programming problems with uncertain coefficients. Under various assumptions, a probabilistic constraint on 0-1 variables can be modeled as a submodular knapsack set. In this paper we describe cover inequalities for the submodular knapsack set and investigate their lifting problem. Each lifting problem is itself an optimization problem over a submodular knapsack set. We give sequence-independent upper and lower bounds on the valid lifting coefficients and show that whereas the upper bound can be computed in polynomial time, the lower bound problem is NP-hard. Furthermore, we present polynomial algorithms based on parametric linear programming and computational results for the conic quadratic 0-1 knapsack case.
A 0-1 Random Fuzzy Programming Problem Based on the Degree of Necessity and the Efficient Solution Method
Abstract — This paper considers a general 0-1 random fuzzy programming problem based on the degree of necessity, which includes some previous 0-1 stochastic and fuzzy programming problems. The proposed problem is not well-defined because it includes random fuzzy variables. Therefore, by introducing a chance constraint and a fuzzy goal for the objective function, and by maximizing the degree of necessity that the objective function value satisfies the fuzzy goal, the main problem is transformed into a deterministic equivalent problem. Furthermore, under the assumption that each random variable is distributed according to a normal distribution, the problem is equivalently transformed into a basic 0-1 programming problem, and an efficient strict solution method to find an optimal solution is constructed. Index Terms: 0-1 programming problem; Random fuzzy variables; Degree of necessity; Relaxation problem