Results 1–10 of 35
Convex Nondifferentiable Optimization: A Survey Focussed On The Analytic Center Cutting Plane Method.
, 1999
Cited by 51 (2 self)
We present a survey of nondifferentiable optimization problems and methods with special focus on the analytic center cutting plane method. We propose a self-contained convergence analysis that uses the formalism of the theory of self-concordant functions, but for the main results we give direct proofs based on the properties of the logarithmic function. We also provide an in-depth analysis of two extensions that are very relevant to practical problems: the case of multiple cuts and the case of deep cuts. We further examine extensions to problems including feasible sets partially described by an explicit barrier function, and to the case of nonlinear cuts. Finally, we review several implementation issues and discuss some applications.
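To make the cutting-plane idea behind these methods concrete, here is a minimal one-dimensional sketch (an illustration, not the survey's algorithm): in one dimension the localization set is an interval, and its analytic center under the barrier -log(x - l) - log(u - x) is simply the midpoint, so each subgradient cut discards half of the interval. The function name `accpm_1d` and the quadratic example are assumptions of this sketch.

```python
# Minimal 1-D sketch of an analytic-center cutting-plane method (illustrative
# only): the localization set is an interval whose analytic center, under the
# barrier -log(x - l) - log(u - x), is the midpoint.
def accpm_1d(f_prime, lo, hi, tol=1e-8):
    """Minimize a convex function on [lo, hi] given a (sub)gradient oracle."""
    while hi - lo > tol:
        x = 0.5 * (lo + hi)          # analytic center of the interval
        g = f_prime(x)               # oracle: subgradient at the query point
        if g > 0:                    # cut: the minimizer lies in [lo, x]
            hi = x
        elif g < 0:                  # cut: the minimizer lies in [x, hi]
            lo = x
        else:
            return x                 # exact stationary point found
    return 0.5 * (lo + hi)

# Example: minimize (x - 3)^2 on [0, 10]; the subgradient is 2(x - 3).
x_star = accpm_1d(lambda x: 2.0 * (x - 3.0), 0.0, 10.0)
```

In higher dimensions the analytic center is no longer the midpoint of anything and must be computed iteratively, which is where the self-concordance machinery in the survey enters.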
Multiple Cuts in the Analytic Center Cutting Plane Method
, 1998
Cited by 26 (1 self)
We analyze the multiple cut generation scheme in the analytic center cutting plane method. We propose an optimal primal and dual updating direction when the cuts are central. The direction is optimal in the sense that it maximizes the product of the new dual slacks and of the new primal variables within the trust regions defined by Dikin's primal and dual ellipsoids. The new primal and dual directions use the variance-covariance matrix of the normals to the new cuts in the metric given by Dikin's ellipsoid. We prove that the recovery of a new analytic center from the optimal restoration direction can be done in O(p log(p + 1)) damped Newton steps, where p is the number of new cuts added by the oracle, which may vary with the iteration. The results and the proofs are independent of the specific scaling matrix, primal, dual, or primal-dual, that is used in the computations. The computation of the optimal direction uses Newton's method applied to a self-concordant function of p variab...
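The damped Newton recentering step referred to above can be illustrated in one variable (the paper works in higher dimensions with Dikin-ellipsoid trust regions; this only shows the damped-step mechanics). For the log barrier F(x) = -Σ log(b_i - a_i x) over cuts a_i x ≤ b_i, the damping factor 1/(1 + λ), with λ the Newton decrement, keeps every iterate strictly feasible for a self-concordant F. The function name and the particular cuts are assumptions of this sketch.

```python
import math

# Hypothetical 1-D illustration of recovering an analytic center by damped
# Newton steps after a new cut is added. The barrier over cuts a_i*x <= b_i is
# F(x) = -sum_i log(b_i - a_i*x); the damped step length lam/(1+lam) < 1 in the
# local norm keeps iterates inside the domain for self-concordant F.
def analytic_center(cuts, x0, tol=1e-10, max_iter=100):
    x = x0
    for _ in range(max_iter):
        g = sum(a / (b - a * x) for a, b in cuts)           # F'(x)
        h = sum(a * a / (b - a * x) ** 2 for a, b in cuts)  # F''(x) > 0
        lam = abs(g) / math.sqrt(h)                         # Newton decrement
        if lam < tol:
            break
        x -= g / h / (1.0 + lam)                            # damped Newton step
    return x

# Old localization set [-1, 1] (cuts x <= 1 and -x <= 1); add the cut x <= 0.2
# and recover the center of the shrunken set, starting from a strictly
# feasible point.
cuts = [(1.0, 1.0), (-1.0, 1.0), (1.0, 0.2)]
center = analytic_center(cuts, x0=-0.4)
```

The returned point satisfies F'(center) ≈ 0, i.e., it is the analytic center of the updated localization set.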
The Analytic Center Cutting Plane Method with Semidefinite Cuts
 SIAM Journal on Optimization
, 2000
Cited by 16 (1 self)
We analyze an analytic center cutting plane algorithm for convex feasibility problems with semidefinite cuts. At each iteration the oracle returns a p-dimensional semidefinite cut at an approximate analytic center of the set of localization. The set of localization, which contains the solution set, is a compact set consisting of piecewise algebraic surfaces. We prove that the analytic center is recovered after adding a p-dimensional cut in O(p log(p + 1)) damped Newton iterations. We also prove that the algorithm stops when the dimension of the accumulated block diagonal matrix cut reaches the bound of O(p^2 m^3 / ε^2), where p is the maximum cut dimension and ε is the radius of the largest ball contained in the solution set.
Polynomial interior point cutting plane methods
 Optimization Methods and Software
, 2003
Cited by 15 (8 self)
Polynomial cutting plane methods based on the logarithmic barrier function and on the volumetric center are surveyed. These algorithms construct a linear programming relaxation of the feasible region, find an appropriate approximate center of the region, and call a separation oracle at this approximate center to determine whether additional constraints should be added to the relaxation. Typically, these cutting plane methods can be developed so as to exhibit polynomial convergence. The volumetric cutting plane algorithm achieves the theoretical minimum number of calls to a separation oracle. Long-step versions of the algorithms for solving convex optimization problems are presented.
INTERIOR POINT METHODS FOR COMBINATORIAL OPTIMIZATION
, 1995
Cited by 14 (9 self)
Research on using interior point algorithms to solve combinatorial optimization and integer programming problems is surveyed. This paper discusses branch-and-cut methods for integer programming problems, a potential reduction method based on transforming an integer programming problem into an equivalent nonconvex quadratic programming problem, interior point methods for solving network flow problems, and methods for solving multicommodity flow problems, including an interior point column generation algorithm.
A multiplecut analytic center cutting plane method for semidefinite feasibility problems
 SIAM Journal on Optimization
, 2002
Cited by 13 (3 self)
The general form of these problems can be described as finding a point in a nonempty bounded convex body Γ in the cone of symmetric positive semidefinite matrices. Assume that Γ is defined by an oracle, which for any given m × m symmetric positive semidefinite matrix Ŷ either confirms that Ŷ ∈ Γ or returns a cut, i.e., a symmetric matrix A such that Γ is in the halfspace {Y : A • Y ≤ A • Ŷ}. We study an analytic center cutting plane algorithm for this problem. At each iteration the algorithm computes an approximate analytic center of a working set defined by the cutting-plane system generated in the previous iterations. If this approximate analytic center is a solution, then the algorithm terminates; otherwise the new cutting plane returned by the oracle is added into the system. As the number of iterations increases, the working set shrinks and the algorithm eventually finds a solution of the problem. All iterates generated by the algorithm are positive definite matrices. The algorithm has a worst-case complexity of O*(m^3/ε^2) on the total number of cuts to be used, where ε is the maximum radius of a ball contained in Γ.
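The oracle contract described above (confirm membership, or return a separating matrix cut) can be sketched on a toy body; the set Γ, the function name, and the specific cuts below are assumptions of this sketch, not the paper's algorithm.

```python
import numpy as np

# Toy illustration of the matrix-cut oracle contract: for a query matrix
# Y_hat, either confirm membership in the (hypothetical) body
#   Gamma = {Y PSD : trace(Y) <= 1 and Y[0,0] <= 0.25},
# or return a symmetric matrix A such that Gamma lies in the halfspace
# {Y : A . Y <= A . Y_hat}, where A . Y = trace(A @ Y).
def oracle(Y_hat):
    if np.trace(Y_hat) > 1.0:
        return False, np.eye(Y_hat.shape[0])       # cut from trace(Y) <= 1
    if Y_hat[0, 0] > 0.25:
        A = np.zeros_like(Y_hat)
        A[0, 0] = 1.0                              # cut from Y[0,0] <= 0.25
        return False, A
    return True, None

Y = np.diag([0.6, 0.2])       # PSD, but violates Y[0,0] <= 0.25
ok, A = oracle(Y)             # ok is False; A separates Y from Gamma
```

Both cuts are valid: any Y in Γ has A • Y ≤ 0.25 (or ≤ 1 for the trace cut), strictly less than A • Ŷ at the rejected query point.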
On the Comparative Behavior of Kelley's Cutting Plane Method and the Analytic Center Cutting Plane Method
, 1996
Cited by 12 (8 self)
In this paper, we explore a weakness of a specific implementation of the analytic center cutting plane method applied to convex optimization problems, which may lead to weaker results than Kelley's cutting plane method. Improvements to the analytic center cutting plane method are suggested.

1 Introduction

In this paper, we explore a weakness of a specific implementation of the analytic center cutting plane method, and propose improvements. Cutting plane algorithms are designed to solve general convex optimization problems. They assume that the only information available around the current iterate takes the form of cutting planes: either supporting hyperplanes to the epigraph of the objective function, or separating hyperplanes from the feasible set. The two types of hyperplanes jointly define a polyhedral (linear programming) relaxation of the original convex optimization problem. The key issue in designing a specific cutting plane algorithm is the choice of a point in the current poly...
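For contrast with the analytic-center variant, Kelley's method queries the minimizer of the accumulated piecewise-linear model rather than a center of the localization set. A crude one-dimensional sketch follows; replacing the LP step with a brute-force grid search is a simplification of this sketch, not how Kelley's method is implemented in practice.

```python
# Crude 1-D sketch of Kelley's cutting plane method: the next query point
# minimizes the piecewise-linear model built from all cuts gathered so far.
# The LP step is replaced by a brute-force search over a fixed grid (an
# assumption of this sketch, chosen to keep it dependency-free).
def kelley_1d(f, f_prime, lo, hi, iters=60, grid=2001):
    cuts = []                                   # (x_i, f(x_i), f'(x_i))
    pts = [lo + (hi - lo) * k / (grid - 1) for k in range(grid)]
    x = 0.5 * (lo + hi)
    for _ in range(iters):
        cuts.append((x, f(x), f_prime(x)))      # add a supporting-line cut
        model = lambda y: max(fx + g * (y - xi) for xi, fx, g in cuts)
        x = min(pts, key=model)                 # minimizer of the model
    return x

# Example: minimize (x - 3)^2 on [0, 10].
x_star = kelley_1d(lambda x: (x - 3.0) ** 2, lambda x: 2.0 * (x - 3.0), 0.0, 10.0)
```

Because the model minimizer can jump to extreme points of the relaxation, Kelley's method can oscillate; querying a central point instead is exactly the design choice the analytic center cutting plane method makes.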
Optimizing call center staffing using simulation and analytic center cutting plane methods
 Management Science
, 2005
Cited by 12 (0 self)
We consider the problem of minimizing staffing costs in an inbound call center, while maintaining an acceptable level of service in multiple time periods. The problem is complicated by the fact that the staffing level in one time period can affect the service levels in subsequent periods. Moreover, staff schedules typically take the form of shifts covering several periods. Interactions between staffing levels in different time periods, as well as the impact of shift requirements on staffing levels and cost, should be considered in the planning. Traditional staffing methods based on stationary queueing formulas do not take this into account. We present a simulation-based analytic center cutting plane method to solve a sample average approximation of the problem. We establish convergence of the method when the service level functions are discrete pseudoconcave. An extensive numerical study of a moderately large call center shows that the method is robust and, in most of the test cases, outperforms traditional staffing heuristics that are based on analytical queueing methods.
Complexity Analysis of a Logarithmic Barrier Decomposition Method for Semi-Infinite Linear Programming
, 1997
Cited by 10 (2 self)
In this paper, we analyze a logarithmic barrier decomposition method for solving a semi-infinite linear programming problem. This method is in some respects similar to the column generation methods using analytic centers. Although the method was found to be very efficient in recent computational studies, its theoretical convergence or complexity is still unknown except in the (finite) case of linear programming. In this paper we present a complexity analysis of this method in the general semi-infinite case. Our complexity estimate is given in terms of the problem dimension, the radius of the largest Euclidean ball contained in the feasible set, and the desired accuracy of the approximate solution.

Key words: Semi-infinite linear programming, logarithmic barrier, decomposition, column generation. AMS subject classification: 90C25, 90C60.

1 Introduction

Consider the following semi-infinite linear programming problem (SILP): maximize f_0(y) := a_0^T y subject to f_t(y) := ...
Interior Point Methods with Decomposition for Linear Programs
 JOTA
, 1996
Cited by 10 (7 self)
This paper deals with an algorithm which incorporates the interior point method into the Dantzig-Wolfe decomposition technique for solving large-scale linear programming problems. At each iteration, the algorithm performs one step of Newton's method to solve a subproblem, obtaining an approximate solution, which is then used to compute an approximate Newton direction to find a new vector of the Lagrange multipliers. We show that the algorithm is globally linearly convergent and has polynomial-time complexity.

Key words: Large-scale linear programming, interior point methods, Dantzig-Wolfe decomposition, complexity. Abbreviated title: Interior point methods with decomposition. AMS(MOS) subject classifications: 90C05, 90C06, 90C60.

1. Introduction

This paper presents and analyzes an algorithm which incorporates the interior point method into the Dantzig-Wolfe decomposition method. Our concern in this paper is to show the polynomial-time complexity of the algorithm. In order to explo...
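The decomposition structure the paper builds on — relax the coupling constraint with a multiplier, solve separable subproblems, update the multiplier — can be sketched on a toy problem. A plain subgradient update stands in here for the paper's Newton-based multiplier step (an assumption of this sketch), and the problem data are invented for illustration.

```python
# Illustrative sketch of Lagrangian decomposition structure only: the coupling
# constraint x1 + x2 >= 1 is relaxed with a multiplier lam >= 0, the two
# subproblems separate, and lam is updated by a subgradient step (a stand-in
# for the paper's Newton-based multiplier update).
# Toy LP: minimize 1*x1 + 3*x2 over x1, x2 in [0, 1] with x1 + x2 >= 1;
# its optimal value is 1.0 (take x1 = 1, x2 = 0).
def dual_decomposition(c=(1.0, 3.0), iters=200):
    lam, best = 0.0, float("-inf")
    for k in range(1, iters + 1):
        # Subproblems: minimize (c_i - lam) * x_i independently over [0, 1];
        # ties (c_i == lam) are broken toward 1 without changing the dual value.
        x = [0.0 if ci - lam > 0 else 1.0 for ci in c]
        dual_val = sum((ci - lam) * xi for ci, xi in zip(c, x)) + lam
        best = max(best, dual_val)        # best dual bound seen so far
        # Subgradient of the dual at lam is (1 - x1 - x2); project onto lam >= 0.
        lam = max(0.0, lam + (1.0 / k) * (1.0 - sum(x)))
    return best

best_dual = dual_decomposition()
```

Since this toy problem is a linear program, there is no duality gap and the best dual bound matches the optimal primal value.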