Results 1-10 of 38
Interior methods for nonlinear optimization
 SIAM Review
, 2002
"... Abstract. Interior methods are an omnipresent, conspicuous feature of the constrained optimization landscape today, but it was not always so. Primarily in the form of barrier methods, interiorpoint techniques were popular during the 1960s for solving nonlinearly constrained problems. However, their ..."
Abstract

Cited by 108 (5 self)
Abstract. Interior methods are an omnipresent, conspicuous feature of the constrained optimization landscape today, but it was not always so. Primarily in the form of barrier methods, interior-point techniques were popular during the 1960s for solving nonlinearly constrained problems. However, their use for linear programming was not even contemplated because of the total dominance of the simplex method. Vague but continuing anxiety about barrier methods eventually led to their abandonment in favor of newly emerging, apparently more efficient alternatives such as augmented Lagrangian and sequential quadratic programming methods. By the early 1980s, barrier methods were almost without exception regarded as a closed chapter in the history of optimization. This picture changed dramatically with Karmarkar's widely publicized announcement in 1984 of a fast polynomial-time interior method for linear programming; in 1985, a formal connection was established between his method and classical barrier methods. Since then, interior methods have advanced so far, so fast, that their influence has transformed both the theory and practice of constrained optimization. This article provides a condensed, selective look at classical material and recent research about interior methods for nonlinearly constrained optimization.
The Quadratic Assignment Problem: A Survey and Recent Developments
 In Proceedings of the DIMACS Workshop on Quadratic Assignment Problems, volume 16 of DIMACS Series in Discrete Mathematics and Theoretical Computer Science
, 1994
"... . Quadratic Assignment Problems model many applications in diverse areas such as operations research, parallel and distributed computing, and combinatorial data analysis. In this paper we survey some of the most important techniques, applications, and methods regarding the quadratic assignment probl ..."
Abstract

Cited by 106 (16 self)
Quadratic Assignment Problems model many applications in diverse areas such as operations research, parallel and distributed computing, and combinatorial data analysis. In this paper we survey some of the most important techniques, applications, and methods regarding the quadratic assignment problem. We focus our attention on recent developments.

1. Introduction. Given a set $N = \{1, 2, \ldots, n\}$ and $n \times n$ matrices $F = (f_{ij})$ and $D = (d_{kl})$, the quadratic assignment problem (QAP) can be stated as follows:
$$\min_{p \in \Pi_N} \; \sum_{i=1}^{n} \sum_{j=1}^{n} f_{ij}\, d_{p(i)p(j)} + \sum_{i=1}^{n} c_{i p(i)},$$
where $\Pi_N$ is the set of all permutations of $N$. One of the major applications of the QAP is in location theory, where the matrix $F = (f_{ij})$ is the flow matrix, i.e. $f_{ij}$ is the flow of materials from facility $i$ to facility $j$, and $D = (d_{kl})$ is the distance matrix, i.e. $d_{kl}$ represents the distance from location $k$ to location $l$ [62, 67, 137]. The cost of simultaneously assigning facility $i$ to locat...
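The QAP objective above is cheap to evaluate for a given permutation, and for very small instances the exact minimum can be found by enumerating all permutations. A minimal Python sketch of both (the 3x3 flow/distance matrices are illustrative, not from the paper):

```python
from itertools import permutations

def qap_objective(F, D, C, p):
    """sum_{i,j} f_ij * d_{p(i)p(j)} + sum_i c_{i,p(i)} for a permutation p."""
    n = len(F)
    quad = sum(F[i][j] * D[p[i]][p[j]] for i in range(n) for j in range(n))
    lin = sum(C[i][p[i]] for i in range(n))
    return quad + lin

def qap_brute_force(F, D, C):
    """Exact minimum by enumerating all n! permutations (tiny n only)."""
    return min((qap_objective(F, D, C, p), p)
               for p in permutations(range(len(F))))

# Illustrative instance: flows F between facilities, distances D between
# locations, zero linear assignment costs C.
F = [[0, 5, 2], [5, 0, 3], [2, 3, 0]]
D = [[0, 1, 4], [1, 0, 2], [4, 2, 0]]
C = [[0, 0, 0] for _ in range(3)]
best_val, best_p = qap_brute_force(F, D, C)
```

Enumeration is only viable for toy sizes; the surveyed methods exist precisely because n! growth makes this infeasible beyond roughly n = 12.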
Complete search in continuous global optimization and constraint satisfaction
 ACTA NUMERICA 13
, 2004
"... ..."
(Show Context)
Quadratic Optimization
, 1995
"... . Quadratic optimization comprises one of the most important areas of nonlinear programming. Numerous problems in real world applications, including problems in planning and scheduling, economies of scale, and engineering design, and control are naturally expressed as quadratic problems. Moreover, t ..."
Abstract

Cited by 54 (3 self)
Quadratic optimization comprises one of the most important areas of nonlinear programming. Numerous problems in real-world applications, including problems in planning and scheduling, economies of scale, and engineering design and control, are naturally expressed as quadratic problems. Moreover, the quadratic problem is known to be NP-hard, which makes this one of the most interesting and challenging classes of optimization problems. In this chapter, we review various properties of the quadratic problem, and discuss different techniques for solving various classes of quadratic problems. Some of the more successful algorithms for solving the special cases of bound-constrained and large-scale quadratic problems are considered. Examples of various applications of quadratic programming are presented. A summary of the available computational results for the algorithms to solve the various classes of problems is presented. Key words: Quadratic optimization, bilinear programming, concave pro...
Continuous Characterizations of the Maximum Clique Problem
 Math. Oper. Res
, 1996
"... Given a graph G whose adjacency matrix is A, the MotzkinStrauss formulation of the MaximumClique Problem is the quadratic program maxfx T Axjx T e = 1; x 0g. It is well known that the global optimum value of this QP is (1 \Gamma 1=!(G)), where !(G) is the clique number of G. Here, we characte ..."
Abstract

Cited by 45 (2 self)
Given a graph $G$ whose adjacency matrix is $A$, the Motzkin-Straus formulation of the Maximum Clique Problem is the quadratic program $\max\{x^T A x \mid x^T e = 1,\ x \geq 0\}$. It is well known that the global optimum value of this QP is $1 - 1/\omega(G)$, where $\omega(G)$ is the clique number of $G$. Here, we characterize the following: (1) first-order optimality, (2) second-order optimality, (3) local optimality, and (4) strict local optimality. These characterizations reveal interesting underlying discrete structures, and are polynomial-time verifiable. A parametrization of the Motzkin-Straus QP is then introduced and its properties are investigated. Finally, an extension of the Motzkin-Straus formulation is provided for the weighted clique number of a graph, and this is used to derive a maximin characterization of perfect graphs.

1 Introduction
1.1 The Problem of Interest
Let $A$ be the adjacency matrix of a graph $G$. Consider the Motzkin-Straus formulation (also called the Motzkin-Straus QP) of the Maximum Clique Pro...
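The Motzkin-Straus identity is easy to check numerically on a small graph: placing uniform weight $1/\omega$ on the vertices of a maximum clique attains the optimal value $1 - 1/\omega(G)$. A brute-force Python sketch (the example graph is illustrative, not from the paper):

```python
from itertools import combinations

def clique_number(adj):
    """omega(G) by brute force: size of the largest pairwise-adjacent subset."""
    n = len(adj)
    best = 1 if n else 0
    for k in range(2, n + 1):
        for S in combinations(range(n), k):
            if all(adj[u][v] for u, v in combinations(S, 2)):
                best = k
    return best

def ms_value(adj, x):
    """x^T A x, the Motzkin-Straus objective."""
    n = len(adj)
    return sum(adj[i][j] * x[i] * x[j] for i in range(n) for j in range(n))

# Illustrative graph: 4-cycle 0-1-2-3 with chord 0-2, so omega(G) = 3.
adj = [[0, 1, 1, 1],
       [1, 0, 1, 0],
       [1, 1, 0, 1],
       [1, 0, 1, 0]]
omega = clique_number(adj)
# Uniform weight 1/omega on the clique {0, 1, 2} attains 1 - 1/omega.
x = [1/3, 1/3, 1/3, 0.0]
```

Only the optimal value transfers cleanly; recovering the clique itself from an arbitrary maximizer is subtler, which is part of what motivates the local-optimality characterizations in the paper.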
Inertia-controlling methods for general quadratic programming
 SIAM Review
, 1991
"... Abstract. Activeset quadratic programming (QP) methods use a working set to define the search direction and multiplier estimates. In the method proposed by Fletcher in 1971, and in several subsequent mathematically equivalent methods, the working set is chosen to control the inertia of the reduced ..."
Abstract

Cited by 38 (3 self)
Abstract. Active-set quadratic programming (QP) methods use a working set to define the search direction and multiplier estimates. In the method proposed by Fletcher in 1971, and in several subsequent mathematically equivalent methods, the working set is chosen to control the inertia of the reduced Hessian, which is never permitted to have more than one nonpositive eigenvalue. (We call such methods inertia-controlling.) This paper presents an overview of a generic inertia-controlling QP method, including the equations satisfied by the search direction when the reduced Hessian is positive definite, singular, or indefinite. Recurrence relations are derived that define the search direction and Lagrange multiplier vector through equations related to the Karush-Kuhn-Tucker system. We also discuss connections with inertia-controlling methods that maintain an explicit factorization of the reduced Hessian matrix. Key words. Nonconvex quadratic programming, active-set methods, Schur complement, Karush
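The reduced Hessian in question is $Z^T H Z$, where the columns of $Z$ span the null space of the working-set constraint gradients, and its inertia (the counts of positive, zero, and negative eigenvalues) is the quantity an inertia-controlling method monitors. A NumPy sketch of computing that inertia (the matrices are illustrative data, not the paper's algorithm):

```python
import numpy as np

def reduced_hessian_inertia(H, A_w, tol=1e-8):
    """Inertia (n_plus, n_zero, n_minus) of Z^T H Z, where the columns of Z
    form an orthonormal basis for the null space of the working-set matrix A_w."""
    _, s, Vt = np.linalg.svd(A_w)
    rank = int(np.sum(s > tol))
    Z = Vt[rank:].T                      # null-space basis from the SVD
    evals = np.linalg.eigvalsh(Z.T @ H @ Z)
    n_pos = int(np.sum(evals > tol))
    n_neg = int(np.sum(evals < -tol))
    return n_pos, len(evals) - n_pos - n_neg, n_neg

# Illustrative data: an indefinite Hessian whose restriction to the working
# set {x1 + x2 + x3 = const} has exactly one nonpositive eigenvalue, which an
# inertia-controlling method would still permit.
H = np.diag([2.0, 1.0, -3.0])
A_w = np.array([[1.0, 1.0, 1.0]])
```

In a production QP solver the null-space basis is maintained by factorization updates rather than recomputed via the SVD; the SVD is used here only because it makes the definition explicit.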
User’s Guide For QPOPT 1.0: A Fortran Package For Quadratic Programming
, 1995
"... QPOPT is a set of Fortran subroutines for minimizing a general quadratic function subject to linear constraints and simple upper and lower bounds. QPOPT may also be used for linear programming and for finding a feasible point for a set of linear equalities and inequalities. If the quadratic function ..."
Abstract

Cited by 23 (3 self)
QPOPT is a set of Fortran subroutines for minimizing a general quadratic function subject to linear constraints and simple upper and lower bounds. QPOPT may also be used for linear programming and for finding a feasible point for a set of linear equalities and inequalities. If the quadratic function is convex (i.e., the Hessian is positive definite or positive semidefinite), the solution obtained will be a global minimizer. If the quadratic is nonconvex (i.e., the Hessian is indefinite), the solution obtained will be a local minimizer or a dead point. A two-phase active-set method is used. The first phase minimizes the sum of infeasibilities. The second phase minimizes the quadratic function within the feasible region, using a reduced Hessian to obtain search directions. The method is most efficient when many constraints or bounds are active at the solution. QPOPT is not intended for large sparse problems, but there is no fixed limit on problem size. The source code is suitable for all scientific machines with a Fortran 77
Canonical dual approach for solving 0-1 quadratic programming problems
 J. Industrial and Management Optimization
, 2007
"... Abstract. By using the canonical dual transformation developed recently, we derive a pair of canonical dual problems for 01 quadratic programming problems in both minimization and maximization form. Regardless convexity, when the canonical duals are solvable, no duality gap exists between the prima ..."
Abstract

Cited by 22 (12 self)
Abstract. By using the canonical dual transformation developed recently, we derive a pair of canonical dual problems for 0-1 quadratic programming problems in both minimization and maximization form. Regardless of convexity, when the canonical duals are solvable, no duality gap exists between the primal and corresponding dual problems. Both global and local optimality conditions are given. An algorithm is presented for finding global minimizers, even when the primal objective function is not convex. Examples are included to illustrate this new approach.
Piecewise Sequential Quadratic Programming For Mathematical Programs With . . .
"... We describe some first and secondorder optimality conditions for mathematical programs with equilibrium constraints (MPEC). Mathematical programs with parametric nonlinear complementarity constraints are the focus. Of interest is the result that under a linear independence assumption that is stand ..."
Abstract

Cited by 19 (9 self)
We describe some first- and second-order optimality conditions for mathematical programs with equilibrium constraints (MPEC). Mathematical programs with parametric nonlinear complementarity constraints are the focus. Of interest is the result that, under a linear independence assumption that is standard in nonlinear programming, the otherwise combinatorial problem of checking whether a point is stationary for an MPEC is reduced to checking stationarity of a single nonlinear program. We also present a piecewise sequential quadratic programming (PSQP) algorithm for solving MPECs. Local quadratic convergence is shown under the linear independence assumption and a second-order sufficient condition. Some computational results are given. KEY WORDS: MPEC, bilevel program, nonlinear complementarity problem, nonlinear program, first- and second-order optimality conditions, linear independence constraint qualification, sequential quadratic programming, quadratic convergence.
Second-Order Sufficient Optimality Conditions for Local and Global Nonlinear Programming
, 1994
"... This paper presents a new approach to the sufficient conditions of nonlinear programming. Main result is a sufficient condition for the global optimality of a KuhnTucker point. This condition can be verified constructively, using a novel convexity test based on interval analysis, and is guaranteed ..."
Abstract

Cited by 13 (4 self)
This paper presents a new approach to the sufficient conditions of nonlinear programming. The main result is a sufficient condition for the global optimality of a Kuhn-Tucker point. This condition can be verified constructively, using a novel convexity test based on interval analysis, and is guaranteed to prove global optimality of strong local minimizers for sufficiently narrow bounds. Hence it is expected to be a useful tool within branch-and-bound algorithms for global optimization.
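The paper's own convexity test is not reproduced in this snippet. One simple instance of an interval-analysis convexity certificate is a Gershgorin check on an interval enclosure of the Hessian over a box: if every matrix in the enclosure is strictly diagonally dominant with positive diagonal, the function is convex there. A hedged Python sketch (the enclosures below are illustrative, not derived from any particular function):

```python
def interval_hessian_pd(H_lo, H_hi):
    """Sufficient (not necessary) positive-definiteness test for an interval
    Hessian [H_lo, H_hi]: every matrix in the interval must be strictly
    diagonally dominant with positive diagonal (Gershgorin circle theorem)."""
    n = len(H_lo)
    for i in range(n):
        radius = sum(max(abs(H_lo[i][j]), abs(H_hi[i][j]))
                     for j in range(n) if j != i)
        if H_lo[i][i] <= radius:
            return False
    return True

# Illustrative enclosure of a Hessian over a box: convexity is certified.
H_lo = [[2.0, -0.3], [-0.3, 1.5]]
H_hi = [[3.0, 0.3], [0.3, 2.5]]
```

The test is one-sided, matching the abstract's claim: it can only succeed when the bounds are narrow enough, which is exactly the regime where it certifies global optimality of a strong local minimizer.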