Results 1–10 of 21
Global minimization using an Augmented Lagrangian method with variable lower-level constraints
, 2007
"... A novel global optimization method based on an Augmented Lagrangian framework is introduced for continuous constrained nonlinear optimization problems. At each outer iteration k the method requires the εkglobal minimization of the Augmented Lagrangian with simple constraints, where εk → ε. Global c ..."
Abstract

Cited by 21 (1 self)
 Add to MetaCart
A novel global optimization method based on an Augmented Lagrangian framework is introduced for continuous constrained nonlinear optimization problems. At each outer iteration k the method requires the εk-global minimization of the Augmented Lagrangian with simple constraints, where εk → ε. Global convergence to an ε-global minimizer of the original problem is proved. The subproblems are solved using the αBB method. Numerical experiments are presented.
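The outer loop this abstract describes can be sketched in a few lines. This is a minimal toy, not the paper's method: a dense grid search stands in for the αBB subproblem solver, and the objective `f`, constraint `h`, box, and penalty parameter are my own illustrative choices.

```python
# Toy Augmented Lagrangian outer loop for min f(x) s.t. h(x) = 0 on a box.
# Hypothetical example; the paper solves the epsilon_k-global subproblems
# with alphaBB, for which a dense grid search is a crude stand-in here.
def augmented_lagrangian(f, h, lam=0.0, rho=10.0, iters=20):
    # Box [-2, 2] discretized to step 0.001 (the "simple constraints").
    grid = [-2.0 + 4.0 * i / 4000 for i in range(4001)]
    x = 0.0
    for _ in range(iters):
        # (Approximate) global minimization of the Augmented Lagrangian.
        aug = lambda x: f(x) + lam * h(x) + 0.5 * rho * h(x) ** 2
        x = min(grid, key=aug)
        lam += rho * h(x)        # first-order multiplier update
    return x

# min x^2 subject to x - 1 = 0; the constrained minimizer is x* = 1.
x_star = augmented_lagrangian(lambda x: x * x, lambda x: x - 1.0)
```

Each outer iteration shifts the multiplier toward the value that makes the unconstrained minimizer of the Augmented Lagrangian feasible, so the iterates drift onto the constraint.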
REFORMULATIONS IN MATHEMATICAL PROGRAMMING: DEFINITIONS AND SYSTEMATICS
, 2008
"... A reformulation of a mathematical program is a formulation which shares some properties with, but is in some sense better than, the original program. Reformulations are important with respect to the choice and efficiency of the solution algorithms; furthermore, it is desirable that reformulations c ..."
Abstract

Cited by 19 (14 self)
 Add to MetaCart
A reformulation of a mathematical program is a formulation which shares some properties with, but is in some sense better than, the original program. Reformulations are important with respect to the choice and efficiency of the solution algorithms; furthermore, it is desirable that reformulations can be carried out automatically. Reformulation techniques are very common in mathematical programming but, interestingly, they have never been studied under a common framework. This paper takes some steps in this direction. We define a framework for storing and manipulating mathematical programming formulations, and give several fundamental definitions categorizing reformulations into essentially four types (opt-reformulations, narrowings, relaxations and approximations). We establish some theoretical results and give reformulation examples for each type.
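As a concrete instance of the "relaxation" type in this taxonomy (my own example, not one from the paper): the McCormick envelope is a standard relaxation that replaces the nonconvex bilinear equation w = x·y over a box with four linear inequalities, so every point feasible for the original program remains feasible for the relaxed one.

```python
# McCormick relaxation of w = x*y over [xL, xU] x [yL, yU]: a classic
# "relaxation" reformulation (illustrative sketch, not the paper's code).
def mccormick_lower(x, y, xL, xU, yL, yU):
    """Tightest lower bound on x*y implied by the two McCormick
    underestimating inequalities at the point (x, y)."""
    return max(xL * y + x * yL - xL * yL,
               xU * y + x * yU - xU * yU)

# The relaxation is valid: its bound never exceeds the true product.
xL, xU, yL, yU = 0.0, 2.0, -1.0, 3.0
x, y = 1.5, 2.0
lb = mccormick_lower(x, y, xL, xU, yL, yU)   # lower bound on x*y = 3.0
```

The gap between `lb` and `x*y` is exactly what a spatial Branch-and-Bound solver shrinks by subdividing the box.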
Reformulations in Mathematical Programming: A Computational Approach
"... Summary. Mathematical programming is a language for describing optimization problems; it is based on parameters, decision variables, objective function(s) subject to various types of constraints. The present treatment is concerned with the case when objective(s) and constraints are algebraic mathema ..."
Abstract

Cited by 18 (13 self)
 Add to MetaCart
(Show Context)
Summary. Mathematical programming is a language for describing optimization problems; it is based on parameters, decision variables, and objective function(s) subject to various types of constraints. The present treatment is concerned with the case when objective(s) and constraints are algebraic mathematical expressions of the parameters and decision variables, and therefore excludes optimization of black-box functions. A reformulation of a mathematical program P is a mathematical program Q obtained from P via symbolic transformations applied to the sets of variables, objectives and constraints. We present a survey of existing reformulations interpreted along these lines, some example applications, and describe the implementation of a software framework for reformulation and optimization.
Comparison of Deterministic and Stochastic Approaches to Global Optimization
"... In this paper we compare two different approaches to nonconvex global optimization. The first one is a deterministic spatial BranchandBound algorithm (sBB), whereas the second approach is a quasi Monte Carlo (QMC) variant of a stochastic multi level single linkage (MLSL) algorithm. Both algorithms ..."
Abstract

Cited by 8 (3 self)
 Add to MetaCart
In this paper we compare two different approaches to nonconvex global optimization. The first one is a deterministic spatial Branch-and-Bound algorithm (sBB), whereas the second approach is a quasi Monte Carlo (QMC) variant of a stochastic Multi Level Single Linkage (MLSL) algorithm. Both algorithms apply to problems in a very general form and are not dependent on problem structure. The test suite we chose is fairly extensive in scope, in that it includes constrained and unconstrained problems, continuous and mixed-integer problems. The conclusion of the tests is that in general the QMC variant of the MLSL algorithm is more efficient, although in some instances the Branch-and-Bound algorithm is capable of locating the global optimum of hard problems in just one iteration.
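The stochastic side of this comparison can be sketched as follows. This is a drastic simplification of MLSL, and the test function, sample size, and descent routine are my own: real MLSL starts local searches only from selected sample points, whereas this sketch starts one from every quasi-random point.

```python
import math

# Simplified multistart sketch in the spirit of QMC + MLSL (not the
# paper's algorithm): quasi-random sampling followed by local descent.
def van_der_corput(n, base=2):
    """n-th element of the base-2 quasi-random (QMC) sequence in [0, 1)."""
    q, bk = 0.0, 1.0 / base
    while n > 0:
        n, r = divmod(n, base)
        q += r * bk
        bk /= base
    return q

def local_descent(f, x, lo, hi, step=0.1):
    """Crude derivative-free descent: halve the step whenever it stalls."""
    while step > 1e-6:
        for cand in (x - step, x + step):
            if lo <= cand <= hi and f(cand) < f(x):
                x = cand
                break
        else:
            step /= 2.0
    return x

# Multimodal test function on [0, 4]; its global minimum is near x = 1.58.
f = lambda x: math.sin(3 * x) + 0.1 * (x - 2) ** 2
starts = [4.0 * van_der_corput(i + 1) for i in range(16)]
best = min((local_descent(f, s, 0.0, 4.0) for s in starts), key=f)
```

The low-discrepancy sequence spreads the starting points more evenly over the box than pseudo-random sampling, which is the rationale for the QMC variant studied in the paper.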
Reformulation in mathematical programming: an application to quantum chemistry
 DISCRETE APPLIED MATHEMATICS, ACCEPTED FOR PUBLICATION
, 2007
"... ..."
(Show Context)
Reformulation-Linearization Methods for Global Optimization
 Journal of Global Optimization
, 1991
"... ..."
(Show Context)
Mathematical programming-based approach to scheduling of communicating tasks
, 2004
"... We present a MILP mathematical programming formulation for static scheduling of dependent tasks onto homogeneous multiprocessor system of an arbitrary architecture with communication delays. We reduce the number of constraints by applying a Reduction Constraint reformulation to the model. We solve s ..."
Abstract

Cited by 4 (1 self)
 Add to MetaCart
We present a MILP mathematical programming formulation for static scheduling of dependent tasks onto a homogeneous multiprocessor system of arbitrary architecture with communication delays. We reduce the number of constraints by applying a Reduction Constraint reformulation to the model. We solve several small-scale instances of the reformulated problem by using CPLEX 8.1. Upper bounds are computed with the Variable Neighborhood Search metaheuristic applied directly to the graph-based formulation of the problem, whereas lower bounds are obtained by solving linear relaxations of the MILP formulation, further tightened by using load balancing and critical path method arguments.
Extending a CIP framework to solve MIQCPs
, 2010
"... This paper discusses how to build a solver for mixed integer quadratically constrained programs (MIQCPs) by extending a framework for constraint integer programming (CIP). The advantage of this approach is that we can utilize the full power of advanced MILP and CP technologies, in particular for th ..."
Abstract

Cited by 3 (2 self)
 Add to MetaCart
This paper discusses how to build a solver for mixed integer quadratically constrained programs (MIQCPs) by extending a framework for constraint integer programming (CIP). The advantage of this approach is that we can utilize the full power of advanced MILP and CP technologies, in particular for the linear relaxation and the discrete components of the problem. We use an outer approximation generated by linearization of convex constraints and linear underestimation of nonconvex constraints to relax the problem. Further, we give an overview of the reformulation, separation, and propagation techniques that are used to handle the quadratic constraints efficiently. We implemented these methods in the branch-cut-and-price framework SCIP. Computational experiments indicating the potential of the approach and evaluating the impact of the algorithmic components are provided.
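The linearization step mentioned in this abstract can be illustrated with a generic gradient cut (a textbook construction, sketched here with a toy one-dimensional constraint of my own choosing, not SCIP code): for a convex g, the tangent g(x̂) + g′(x̂)(x − x̂) underestimates g everywhere, so setting it ≤ 0 gives a valid linear outer approximation of the region g(x) ≤ 0.

```python
# Gradient cut for a convex constraint g(x) <= 0 (toy example: g(x) = x^2 - 4,
# feasible region [-2, 2]); this is the generic linearization idea behind the
# outer approximation, not an excerpt from SCIP.
g  = lambda x: x * x - 4.0          # convex quadratic constraint function
dg = lambda x: 2.0 * x              # its derivative

def gradient_cut(xhat):
    """Return (a, b) such that a*x + b <= 0 is valid for all x with g(x) <= 0,
    obtained by linearizing g at xhat."""
    return dg(xhat), g(xhat) - dg(xhat) * xhat

a, b = gradient_cut(3.0)            # cut generated at an infeasible point
# Every feasible point satisfies the cut (no violations expected):
violations = [x for x in (-2.0, 0.0, 1.5, 2.0) if a * x + b > 1e-9]
```

The cut 6x − 13 ≤ 0 separates the infeasible point x̂ = 3 (where 6·3 − 13 = 5 > 0) while keeping the whole feasible interval, which is exactly the role such cuts play in the relaxation.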
On Interval-subgradient and Nogood Cuts
 OPERATIONS RESEARCH LETTERS
, 2010
"... Intervalgradient cuts are (nonlinear) valid inequalities for nonconvex NLPs defined for constraints g(x) ≤ 0 with g being continuously differentiable in a box [x, ¯x]. In this paper we define intervalsubgradient cuts, a generalization to the case of nondifferentiable g, and show that nogood cuts ..."
Abstract

Cited by 1 (1 self)
 Add to MetaCart
(Show Context)
Interval-gradient cuts are (nonlinear) valid inequalities for nonconvex NLPs defined for constraints g(x) ≤ 0 with g being continuously differentiable in a box [x, x̄]. In this paper we define interval-subgradient cuts, a generalization to the case of nondifferentiable g, and show that nogood cuts (which have the form ‖x − x̂‖ ≥ ε for some norm and positive constant ε) are a special case of interval-subgradient cuts whenever the 1-norm is used. We then briefly discuss what happens if other norms are used.
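A quick numeric sketch of the nogood cut itself (my own illustration with arbitrary numbers, not the paper's interval-subgradient machinery): with the 1-norm, the cut ‖x − x̂‖₁ ≥ ε excludes exactly the points inside the 1-norm ball of radius ε around x̂ and nothing else.

```python
# Nogood cut ||x - xhat||_1 >= eps: a point is cut off iff it lies strictly
# inside the 1-norm ball around xhat (hypothetical numbers for illustration).
def nogood_violated(x, xhat, eps):
    """True if x violates the cut, i.e. its 1-norm distance to xhat is < eps."""
    return sum(abs(xi - xh) for xi, xh in zip(x, xhat)) < eps

xhat, eps = (1.0, 0.0), 0.5
inside  = nogood_violated((1.1, 0.2), xhat, eps)   # 1-norm distance 0.3
outside = nogood_violated((0.0, 0.0), xhat, eps)   # 1-norm distance 1.0
```

The paper's result says this 1-norm inequality arises as a special case of an interval-subgradient cut; other norms give different (and, per the abstract, separately discussed) behavior.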