Results 1 - 10 of 11
On a Homogeneous Algorithm for the Monotone Complementarity Problem
Mathematical Programming, 1995
Abstract

Cited by 26 (3 self)
We present a generalization of a homogeneous self-dual linear programming (LP) algorithm to solving the monotone complementarity problem (MCP). The algorithm does not need to use any "big-M" parameter or two-phase method, and it generates either a solution converging towards feasibility and complementarity simultaneously or a certificate proving infeasibility. Moreover, if the MCP is polynomially solvable with an interior feasible starting point, then it can be polynomially solved without using or knowing such information at all. To our knowledge, this is the first interior-point and infeasible-starting algorithm for solving the MCP that possesses these desired features. Preliminary computational results are presented. Key words: Monotone complementarity problem, homogeneous and self-dual, infeasible-starting algorithm.
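For the LP special case, the homogeneous and self-dual embedding behind algorithms of this kind can be sketched as the following standard system (background only; the MCP version replaces the linear maps by the monotone mapping and is not reproduced here):

```latex
\begin{aligned}
Ax - b\tau &= 0, \\
A^{\mathsf{T}} y + s - c\tau &= 0, \\
-c^{\mathsf{T}} x + b^{\mathsf{T}} y - \kappa &= 0, \\
x \ge 0, \quad s \ge 0, \quad \tau \ge 0, \quad \kappa &\ge 0.
\end{aligned}
```

At a strictly complementary solution, tau > 0 recovers an optimal primal-dual pair (x/tau, y/tau), while kappa > 0 certifies primal or dual infeasibility, which matches the either/or behavior described in the abstract.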
A Global Convergence Analysis Of An Algorithm For Large Scale Nonlinear Optimization Problems
1996
Abstract

Cited by 15 (5 self)
In this paper we give a global convergence analysis of a basic version of an SQP algorithm described in [2] for the solution of large-scale nonlinear inequality-constrained optimization problems. Several procedures and options have been added to the basic algorithm to improve the practical performance; some of these are also analyzed. The important features of the algorithm include the use of a constrained merit function to assess the progress of the iterates and a sequence of approximate merit functions that are less expensive to evaluate. It also employs an interior-point quadratic programming solver that can be terminated early to produce a truncated step. Key words: Sequential Quadratic Programming, Global Convergence, Merit Function, Large Scale Problems. AMS subject classifications: 49M37, 65K05, 90C30. 1. Introduction. In this report we consider an algorithm to solve the inequality-constrained minimization problem min_x f(x) subject to g(x) ≤ 0, (1.1) where x ∈ ℝ^n, and...
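The merit-function idea mentioned in the abstract can be illustrated with a minimal sketch (an l1 exact-penalty merit function on a toy problem; all names here are hypothetical, and this is not the paper's exact merit function):

```python
def l1_merit(f, g, x, mu):
    """l1 (exact penalty) merit function for min f(x) s.t. g(x) <= 0.

    f: objective, g: returns the list of constraint values g_i(x),
    mu: penalty weight on constraint violation.
    """
    violation = sum(max(0.0, gi) for gi in g(x))
    return f(x) + mu * violation

# Toy problem: min x^2 subject to 1 - x <= 0 (i.e. x >= 1).
f = lambda x: x * x
g = lambda x: [1.0 - x]
print(l1_merit(f, g, 2.0, 10.0))  # feasible point: merit = f = 4.0
print(l1_merit(f, g, 0.0, 10.0))  # infeasible: 0 + 10 * 1 = 10.0
```

A line search that decreases such a merit function makes progress on the objective and on feasibility simultaneously, which is what "assessing the progress of the iterates" refers to.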
A Computational Study of the Homogeneous Algorithm for Large-Scale Convex Optimization
1997
Abstract

Cited by 14 (1 self)
Recently the authors have proposed a homogeneous and self-dual algorithm for solving the monotone complementarity problem (MCP) [5]. The algorithm is a single-phase interior-point method; nevertheless, it yields either an approximate optimal solution or detects a possible infeasibility of the problem. In this paper we specialize the algorithm to the solution of general smooth convex optimization problems that also possess nonlinear inequality constraints and free variables. We discuss an implementation of the algorithm for large-scale sparse convex optimization. Moreover, we present computational results for solving quadratically constrained quadratic programming and geometric programming problems, where some of the problems contain more than 100,000 constraints and variables. The results indicate that the proposed algorithm is also practically efficient.
Superlinear and Quadratic Convergence of Affine-Scaling Interior-Point Newton Methods for Problems with Simple Bounds without Strict Complementarity Assumption
1998
Abstract

Cited by 13 (3 self)
A class of affine-scaling interior-point methods for bound-constrained optimization problems is introduced which are locally q-superlinearly or q-quadratically convergent. It is assumed that the strong...
Computational Experience of an Interior-Point SQP Algorithm in a Parallel Branch-and-Bound Framework
Abstract

Cited by 10 (3 self)
An interior-point algorithm within a parallel branch-and-bound framework for solving nonlinear mixed integer programs is described. The nonlinear programming relaxations at each node are solved using an interior-point SQP method. In contrast to solving the relaxation to optimality at each tree node, the relaxation is only solved to near-optimality. Analogous to employing advanced bases in simplex-based linear MIP solvers, a “dynamic” collection of warm-start vectors is kept to provide “advanced warm-starts” at each branch-and-bound node. The code has the capability to run in both shared-memory and distributed-memory parallel environments. Preliminary computational results on various classes of linear mixed integer programs and quadratic portfolio problems are presented.
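As a rough illustration of the "dynamic" warm-start collection described above, one might keep iterates keyed by tree node and fall back to the nearest solved ancestor when a new node is opened. This sketch is hypothetical and is not the paper's actual data structure:

```python
class WarmStartPool:
    """Hypothetical warm-start pool for a branch-and-bound tree:
    store interior-point iterates from solved nodes and hand a new
    node the iterate of its closest solved ancestor."""

    def __init__(self):
        self._store = {}                      # node id -> stored iterate

    def save(self, node, iterate):
        self._store[node] = iterate

    def lookup(self, node, parent_of):
        # Walk up the tree until a solved ancestor is found.
        while node is not None:
            if node in self._store:
                return self._store[node]
            node = parent_of.get(node)
        return None

pool = WarmStartPool()
pool.save("root", [0.5, 0.5])
parents = {"n1": "root", "n2": "n1"}
print(pool.lookup("n2", parents))   # falls back to the root iterate
```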
On Effectively Computing the Analytic Center of the Solution Set By Primal-Dual Interior-Point Methods
1997
Abstract

Cited by 7 (4 self)
The computation of the analytic center of the solution set can be important in linear programming applications where it is desirable to obtain a solution that is not near the relative boundary of the solution set. In this work we discuss the effective computation of the analytic center solution by the use of primal-dual interior-point methods. A primal-dual interior-point algorithm designed for effectively computing the analytic-center solution is proposed and numerical results are presented.
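The analytic center itself maximizes the sum of the logarithms of the constraint slacks. A minimal one-dimensional Newton sketch of that definition (a toy illustration only, not the primal-dual method of the paper):

```python
def analytic_center_1d(a, b, x0, iters=50):
    """Newton's method for the 1-D analytic center:
    maximize sum_i log(b_i - a_i * x) over strictly feasible x,
    i.e. minimize the barrier phi(x) = -sum_i log(b_i - a_i * x)."""
    x = x0
    for _ in range(iters):
        slacks = [bi - ai * x for ai, bi in zip(a, b)]
        grad = sum(ai / si for ai, si in zip(a, slacks))        # phi'(x)
        hess = sum(ai * ai / si ** 2 for ai, si in zip(a, slacks))  # phi''(x)
        x -= grad / hess                                        # Newton step
    return x

# Feasible set {x : -x <= 0, x <= 1} = [0, 1]; its analytic center is 0.5.
print(round(analytic_center_1d([-1.0, 1.0], [0.0, 1.0], 0.3), 6))  # -> 0.5
```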
Primal-dual optimization methods in neural networks and support vector machines training
1999
Abstract

Cited by 4 (1 self)
Recently a lot of attention has been given to applications of mathematical programming in machine learning and neural networks. In this tutorial we investigate the use of Interior Point Methods (IPMs) in Support Vector Machine (SVM) and Artificial Neural Network (ANN) training. The training of ANNs is a highly nonconvex optimization problem, in contrast to the SVM training problem, which is a convex optimization problem. Specifically, training an SVM is equivalent to solving a linearly constrained quadratic programming (QP) problem in a number of variables equal to the number of data points. This problem becomes quite challenging when the size of the data becomes of the order of some thousands. IPMs have been shown to be quite promising for solving large-scale linear and quadratic programming problems. We focus on primal-dual IPMs applied to SVMs and neural networks and investigate the problem of reducing their computational complexity. We also develop a new class of incremental nonlinear primal-dual techniques for artificial neural network training and provide preliminary experimental results for financial forecasting problems.
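The QP structure mentioned above, with one variable per data point, can be made concrete for the linear kernel: the dual SVM Hessian is Q_ij = y_i y_j <x_i, x_j>. A minimal sketch of building Q (function name hypothetical):

```python
import numpy as np

def svm_dual_matrix(X, y):
    """Build the Hessian Q of the linear-kernel SVM dual QP,
    Q[i, j] = y_i * y_j * <x_i, x_j>.  There is one row and one
    column per data point, which is why the QP grows with the
    size of the training set."""
    G = X @ X.T                              # Gram matrix of the data
    return (y[:, None] * y[None, :]) * G     # scale by the label outer product

X = np.array([[0.0, 1.0], [1.0, 0.0], [1.0, 1.0]])
y = np.array([1.0, 1.0, -1.0])
Q = svm_dual_matrix(X, y)
print(Q.shape)   # (3, 3): one dual variable per data point
```

The resulting Q is symmetric positive semidefinite, so the dual problem is the convex QP the abstract refers to.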
The Use of Optimization Techniques in the Solution of Partial Differential Equations from ... (Rice University thesis)
1996
Abstract
Acknowledgments. This thesis is a very important milestone in a journey I began more than ten years ago. People too numerous to mention have helped me along the way; a few are singled out here. When I was an undergraduate at the University of Maryland, Baltimore County, the Mathematics faculty, in particular Professors James Greenberg, Søren Jensen, and Marc Teboulle, taught me to love applied mathematics; their patience with me was endless and I will always be grateful to them.
A computational study of the homogeneous algorithm for large-scale convex optimization
1996
Abstract
Key words: Monotone complementarity problem, homogeneous and self-dual model, interior-point algorithms, large-scale convex optimization. 1 Introduction. In 1984 Karmarkar [31] presented an interior-point method for linear programming (LP), and since then interior-point algorithms have enjoyed great publicity for two reasons. First, these algorithms solve LP problems in polynomial time, as proved by Karmarkar and many others. Secondly, interior-point algorithms have demonstrated excellent practical performance when solving large-scale LP problems; see Lustig et al. [37]. It was soon realized (see Gill et al. [25]) that Karmarkar's method was closely related to the logarithmic barrier algorithm for general nonlinear programming studied by Fiacco and McCormick [23] and others in the sixties. Hence, it is natural to investigate the efficiency of interior-point methods for solving more general classes of problems. In general, good complexity results can only be expected for solving convex optimization problems.
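The logarithmic-barrier connection mentioned above can be illustrated on a toy problem: minimize x^2 subject to x >= 1 by repeatedly minimizing x^2 - mu*log(x - 1) for decreasing mu (a Fiacco-McCormick-style sketch, not any specific implementation from the literature):

```python
import math

def barrier_minimize(mu0=1.0, shrink=0.1, rounds=8):
    """Logarithmic-barrier method on the toy problem
        min x^2  s.t.  x >= 1.
    For each barrier weight mu, minimize x^2 - mu*log(x - 1); the
    stationarity condition 2*x*(x - 1) = mu has the closed form
    x = (1 + sqrt(1 + 2*mu)) / 2, and x -> 1 as mu -> 0."""
    mu, x = mu0, None
    for _ in range(rounds):
        x = (1.0 + math.sqrt(1.0 + 2.0 * mu)) / 2.0  # inner minimizer
        mu *= shrink                                  # tighten the barrier
    return x

print(barrier_minimize())   # just above 1.0, the constrained optimum
```

Interior-point methods refine this idea by following the barrier minimizers (the central path) with Newton steps instead of solving each barrier subproblem from scratch.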