Results 1–10 of 72
A Cutting Plane Method from Analytic Centers for Stochastic Programming
 Mathematical Programming
, 1994
Cited by 52 (17 self)
Abstract
The stochastic linear programming problem with recourse has a dual block angular structure. It can thus be handled by Benders decomposition or by Kelley's method of cutting planes; equivalently, the dual problem has a primal block angular structure and can be handled by Dantzig-Wolfe decomposition; the two approaches are in fact identical by duality. Here we shall investigate the use of the method of cutting planes from analytic centers applied to similar formulations. The only significant difference from the aforementioned methods is that new cutting planes (or columns, by duality) will be generated not from the optimum of the linear programming relaxation, but from the analytic center of the localization set.
1 Introduction
The study of optimization problems in the presence of uncertainty still taxes the limits of methodology and software. One of the most approachable settings is that of two-stage planning under uncertainty, in which a first-stage decision has to be taken bef...
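Kelley's cutting plane method referenced in this abstract can be sketched in a few lines. The sketch below is a generic one-dimensional illustration with a made-up objective and bounds (it cuts from the LP optimum, i.e., it is the classical method, not the paper's analytic-center variant):

```python
# Sketch of Kelley's cutting plane method for a 1-D convex function.
# The objective (x - 1)^2, the bounds, and the iteration count are
# illustrative choices, not taken from the paper.
import numpy as np
from scipy.optimize import linprog

def kelley(f, grad, lb, ub, n_iter=25):
    """Minimize a convex f on [lb, ub] by accumulating linear cuts
    t >= f(xk) + grad(xk)*(x - xk) and minimizing t over all cuts."""
    x = lb
    A_ub, b_ub = [], []
    for _ in range(n_iter):
        g = grad(x)
        # cut in LP form: g*x - t <= g*xk - f(xk)
        A_ub.append([g, -1.0])
        b_ub.append(g * x - f(x))
        # variables (x, t): minimize t subject to all cuts and x in [lb, ub]
        res = linprog(c=[0.0, 1.0], A_ub=A_ub, b_ub=b_ub,
                      bounds=[(lb, ub), (None, None)])
        x = res.x[0]
    return x

x_star = kelley(lambda x: (x - 1.0) ** 2, lambda x: 2.0 * (x - 1.0),
                lb=-4.0, ub=4.0)
```

The paper's modification would replace the `linprog` minimizer with the analytic center of the localization set, which tends to stabilize the notoriously oscillatory iterates of plain Kelley cuts.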
A Primal-Dual Interior-Point Method for Linear Optimization Based on a New Proximity Function
, 2002
Cited by 35 (9 self)
Abstract
In this paper we present a generic primal-dual interior-point algorithm for linear optimization in which the search direction depends on a univariate kernel function, which is also used as a proximity measure in the analysis of the algorithm. We present some powerful tools for the analysis of the algorithm under the assumption that the kernel function satisfies three easy-to-check and mild conditions (i.e., exponential convexity, superconvexity, and monotonicity of the second derivative). The approach is demonstrated by introducing a new kernel function and showing that the corresponding large-update algorithm improves the iteration complexity by a factor n^{1/4} when compared with the classical method, which is based on the use of the logarithmic barrier function.
High resolution pursuit for feature extraction
 Applied and Computational Harmonic Analysis
, 1998
Cited by 33 (3 self)
Abstract
Recently, adaptive approximation techniques have become popular for obtaining parsimonious representations of large classes of signals. These methods include the method of frames, matching pursuit, and, most recently, basis pursuit. In this work, high resolution pursuit (HRP) is developed as an alternative to existing function approximation techniques. Existing techniques do not always efficiently yield representations which are sparse and physically interpretable. HRP is an enhanced version of the matching pursuit algorithm and overcomes the shortcomings of the traditional matching pursuit algorithm by emphasizing local fit over global fit at each stage. Further, the HRP algorithm has the same order of complexity as matching pursuit. In this paper, the HRP algorithm is developed and demonstrated on 1D functions. Convergence properties of HRP are also examined. HRP is also suitable for extracting features which may then be used in recognition.
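The baseline that HRP improves on, traditional matching pursuit, can be sketched as follows; the random dictionary and sizes are illustrative, not from the paper:

```python
# Minimal matching pursuit sketch (the global-fit baseline that HRP
# modifies); dictionary and signal below are synthetic illustrations.
import numpy as np

def matching_pursuit(signal, D, n_iter=30):
    """Greedily approximate `signal` over unit-norm atoms (columns of D):
    each stage subtracts the projection onto the atom with the largest
    *global* inner product with the residual. HRP replaces this global
    selection criterion with one emphasizing local fit."""
    residual = signal.astype(float).copy()
    coeffs = np.zeros(D.shape[1])
    for _ in range(n_iter):
        corr = D.T @ residual                 # correlation with every atom
        k = int(np.argmax(np.abs(corr)))      # best-matching atom
        coeffs[k] += corr[k]
        residual -= corr[k] * D[:, k]
    return coeffs, residual

rng = np.random.default_rng(0)
D = rng.standard_normal((64, 128))
D /= np.linalg.norm(D, axis=0)               # unit-norm atoms
signal = 3.0 * D[:, 5] - 2.0 * D[:, 40]      # 2-sparse synthetic signal
coeffs, residual = matching_pursuit(signal, D)
```

Each iteration strictly decreases the residual energy by the squared correlation of the chosen atom, which is the convergence property the abstract alludes to.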
Solving Real-World Linear Ordering Problems . . .
, 1995
Cited by 30 (8 self)
Abstract
Cutting plane methods require the solution of a sequence of linear programs, where the solution to one provides a warm start to the next. A cutting plane algorithm for solving the linear ordering problem is described. This algorithm uses the primal-dual interior point method to solve the linear programming relaxations. A point which is a good warm start for a simplex-based cutting plane algorithm is generally not a good starting point for an interior point method. Techniques used to improve the warm start include attempting to identify cutting planes early and storing an old feasible point, which is used to help recenter when cutting planes are added. Computational results are described for some real-world problems; the algorithm appears to be competitive with a simplex-based cutting plane algorithm.
Primal-Dual Target-Following Algorithms for Linear Programming
 ANNALS OF OPERATIONS RESEARCH
, 1993
Cited by 26 (1 self)
Abstract
In this paper we propose a method for linear programming with the property that, starting from an initial non-central point, it generates iterates that simultaneously get closer to optimality and closer to centrality. The iterates follow paths that in the limit are tangential to the central path. Along with the convergence analysis we provide a general framework which enables us to analyze various primal-dual algorithms in the literature in a short and uniform way.
Optimal experimental design and some related control problems
, 2008
Cited by 24 (0 self)
Abstract
This paper traces the strong relations between experimental design and control, such as the use of optimal inputs to obtain precise parameter estimation in dynamical systems and the introduction of suitably designed perturbations in adaptive control. The mathematical background of optimal experimental design is briefly presented, and the role of experimental design in the asymptotic properties of estimators is emphasized. Although most of the paper concerns parametric models, some results are also presented for statistical learning and prediction with nonparametric models.
Why a pure primal Newton barrier step may be infeasible
 SIAM Journal on Optimization
, 1995
A polynomial primal-dual Dikin-type algorithm for linear programming
 FACULTY OF TECHNICAL MATHEMATICS AND COMPUTER SCIENCE, DELFT UNIVERSITY OF TECHNOLOGY
, 1993
Cited by 20 (9 self)
Abstract
In this paper we present a new primal-dual affine scaling method for linear programming. The method yields a strictly complementary optimal solution pair, and also allows a polynomial-time convergence proof. The search direction is obtained by using the original idea of Dikin, namely by minimizing the objective function (which is the duality gap in the primal-dual case) over some suitable ellipsoid. This gives rise to completely new primal-dual affine scaling directions, having no obvious relation with the search directions proposed in the literature so far. The new directions guarantee a significant decrease in the duality gap in each iteration, and at the same time they drive the iterates to the central path. In the analysis of our algorithm we use a barrier function which is the natural primal-dual generalization of Karmarkar's potential function. The iteration bound is O(nL), which is a factor O(L) better than the iteration bound of an earlier primal-dual affine scaling meth...
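Dikin's original idea of minimizing the objective over an ellipsoid around the current iterate, which the paper generalizes to the primal-dual setting, can be sketched in its classical primal form; the tiny LP below is an illustrative example, and this is not the paper's new primal-dual variant:

```python
# Sketch of Dikin's original *primal* affine scaling iteration (shown for
# intuition only; the paper derives different, primal-dual directions).
# The two-variable LP is a made-up illustration.
import numpy as np

def affine_scaling(A, c, x, n_iter=50, gamma=0.9):
    """min c@x  s.t.  A@x = b, x >= 0, starting from a strictly positive
    feasible x (feasibility is preserved because each step lies in the
    null space of A). Each step minimizes c@x over the Dikin ellipsoid."""
    for _ in range(n_iter):
        X2 = np.diag(x * x)                           # squared scaling
        w = np.linalg.solve(A @ X2 @ A.T, A @ X2 @ c)  # dual estimate
        r = c - A.T @ w                                # reduced costs
        d = -X2 @ r                                    # scaled descent step
        ratio = np.max(-d / x)                         # blocking coordinate
        if ratio <= 1e-12:
            break                                      # (near-)optimal
        x = x + (gamma / ratio) * d                    # stay interior
    return x

A = np.array([[1.0, 1.0]])       # feasible set: x1 + x2 = 1, x >= 0
c = np.array([-1.0, 0.0])        # i.e., maximize x1 over the segment
x = affine_scaling(A, c, np.array([0.5, 0.5]))
```

The primal-dual variant in the paper instead minimizes the duality gap over an ellipsoid in the primal-dual space, yielding directions with the stated O(nL) iteration bound.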