Results 1–10 of 15
Global minimization using an Augmented Lagrangian method with variable lower-level constraints
, 2007
"... A novel global optimization method based on an Augmented Lagrangian framework is introduced for continuous constrained nonlinear optimization problems. At each outer iteration k the method requires the εkglobal minimization of the Augmented Lagrangian with simple constraints, where εk → ε. Global c ..."
Abstract

Cited by 21 (1 self)
A novel global optimization method based on an Augmented Lagrangian framework is introduced for continuous constrained nonlinear optimization problems. At each outer iteration k the method requires the ε_k-global minimization of the Augmented Lagrangian with simple constraints, where ε_k → ε. Global convergence to an ε-global minimizer of the original problem is proved. The subproblems are solved using the αBB method. Numerical experiments are presented.
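The outer loop this abstract describes can be sketched as follows. This is a hedged toy illustration, not the authors' αBB-based implementation: the ε_k-global subproblem solve is mimicked by brute-force grid search on an invented 1-D example, and all names and parameter values here are assumptions for illustration.

```python
import numpy as np

# Toy sketch of an augmented-Lagrangian outer loop (illustration only, not
# the paper's αBB-based solver): minimize f(x) = x^2 subject to
# h(x) = x - 1 = 0 over the box [-2, 2].
def f(x):
    return x**2

def h(x):
    return x - 1.0

def solve_subproblem(lam, rho, grid):
    # Stand-in for the ε_k-global minimization of the augmented Lagrangian
    # over the box: brute-force evaluation on a fine grid.
    L = f(grid) + lam * h(grid) + 0.5 * rho * h(grid)**2
    return grid[np.argmin(L)]

grid = np.linspace(-2.0, 2.0, 100001)  # the "simple constraints": a box
lam, rho = 0.0, 10.0
for k in range(10):                    # outer iterations
    x = solve_subproblem(lam, rho, grid)
    lam += rho * h(x)                  # first-order multiplier update
    rho *= 2.0                         # tighten the penalty
print(x)  # close to the constrained minimizer x* = 1
```

The multiplier update drives h(x) to zero while the growing penalty keeps the subproblem minimizers feasible; a real implementation would replace the grid search by a deterministic global solver such as αBB.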
GloptLab, a configurable framework for the rigorous global solution of quadratic constraint satisfaction problems
"... solution of quadratic constraint satisfaction problems ..."
Abstract

Cited by 7 (5 self)
 Add to MetaCart
solution of quadratic constraint satisfaction problems
Validated linear relaxations and preprocessing: Some experiments
 SIAM J. Optim., accepted for publication
, 2003
"... Abstract. Based on work originating in the early 1970s, a number of recent global optimization algorithms have relied on replacing an original nonconvex nonlinear program by convex or linear relaxations. Such linear relaxations can be generated automatically through an automatic differentiation proc ..."
Abstract

Cited by 5 (3 self)
Abstract. Based on work originating in the early 1970s, a number of recent global optimization algorithms have relied on replacing an original nonconvex nonlinear program by convex or linear relaxations. Such linear relaxations can be generated automatically through an automatic differentiation process. This process decomposes the objective and constraints (if any) into convex and nonconvex unary and binary operations. The convex operations can be approximated arbitrarily well by appending additional constraints, while the domain must somehow be subdivided (in an overall branch-and-bound process or in some other local process) to handle nonconvex constraints. In general, a problem can be hard if even a single nonconvex term appears. However, certain nonconvex terms lead to easier-to-solve problems than others. Recently, Neumaier, Lebbah, Michel, ourselves, and others have paved the way to utilizing such techniques in a validated context. In this paper, we present a symbolic preprocessing step that provides a measure of the intrinsic difficulty of a problem. Based on this step, one of two methods can be chosen to relax nonconvex terms. This preprocessing step is similar to a method previously proposed by Epperly and Pistikopoulos [J. Global Optim., 11 (1997), pp. 287–311] for determining subspaces in which to branch, but we present it from a different point of view that is amenable to simplification of the problem presented to the linear programming solver, and within a validated context. Besides an illustrative example, we have implemented general relaxations in a validated context, as well as the preprocessing technique, and we present experiments on a standard test set. Finally, we present conclusions.
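A standard concrete instance of the linear relaxation of a nonconvex binary operation is the McCormick envelope of a bilinear term. The sketch below is illustrative only (it is not the paper's automatic-differentiation machinery, and the box bounds are invented): it checks numerically that the four classical linear cuts sandwich w = x·y over a box.

```python
import numpy as np

# McCormick linear relaxation of the nonconvex bilinear term w = x*y
# over the box [xl, xu] x [yl, yu] (illustrative bounds, chosen here).
xl, xu, yl, yu = -1.0, 2.0, 0.5, 3.0

def mccormick_bounds(x, y):
    under = max(xl*y + x*yl - xl*yl, xu*y + x*yu - xu*yu)  # convex underestimators
    over  = min(xu*y + x*yl - xu*yl, xl*y + x*yu - xl*yu)  # concave overestimators
    return under, over

# Verify on a grid that the linear cuts really enclose x*y over the box.
ok = True
for x in np.linspace(xl, xu, 50):
    for y in np.linspace(yl, yu, 50):
        lo, hi = mccormick_bounds(x, y)
        ok &= (lo <= x*y + 1e-12) and (x*y <= hi + 1e-12)
print(ok)  # True: the relaxation is valid over the whole box
```

In a branch-and-bound solver these cuts are appended as linear constraints; subdividing the box tightens them, which is one reason the subdivision step mentioned in the abstract is needed for nonconvex terms.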
Transposition theorems and qualification-free optimality conditions
 SIAM J. Optimization
"... Abstract. New theorems of the alternative for polynomial constraints (based on the Positivstellensatz from real algebraic geometry) and for linear constraints (generalizing the transposition theorems of Motzkin and Tucker) are proved. Based on these, two KarushJohn optimality conditions – holding w ..."
Abstract

Cited by 4 (2 self)
Abstract. New theorems of the alternative for polynomial constraints (based on the Positivstellensatz from real algebraic geometry) and for linear constraints (generalizing the transposition theorems of Motzkin and Tucker) are proved. Based on these, two Karush-John optimality conditions – holding without any constraint qualification – are proved for single- or multi-objective constrained optimization problems. The first condition applies to polynomial optimization problems only, and gives for the first time necessary and sufficient global optimality conditions for polynomial problems. The second condition applies to smooth local optimization problems and strengthens known local conditions. If some linear or concave constraints are present, the new version reduces the number of constraints for which a constraint qualification is needed to get the Kuhn-Tucker conditions.
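For orientation, the classical Fritz John conditions are the prototype of such qualification-free conditions (the Karush-John conditions of the paper refine them); for min f(x) s.t. g(x) ≤ 0, h(x) = 0 they read:

```latex
% Fritz John conditions: hold at any local minimizer without any
% constraint qualification; a qualification lets one take \kappa_0 = 1,
% which recovers the Kuhn-Tucker (KKT) conditions.
\exists\, \kappa_0 \ge 0,\ \lambda \ge 0,\ \mu,\quad (\kappa_0,\lambda,\mu)\ne 0:
\qquad
\kappa_0 \nabla f(x^*) + \sum_i \lambda_i \nabla g_i(x^*)
  + \sum_j \mu_j \nabla h_j(x^*) = 0,
\qquad
\lambda_i\, g_i(x^*) = 0 \ \ \forall i.
```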
Improved and simplified validation of feasible points: Inequality and equality constrained problems
 Mathematical Programming, submitted
, 2005
"... Abstract. In validated branch and bound algorithms for global optimization, upper bounds on the global optimum are obtained by evaluating the objective at an approximate optimizer; the upper bounds are then used to eliminate subregions of the search space. For constrained optimization, in general, a ..."
Abstract

Cited by 3 (1 self)
Abstract. In validated branch and bound algorithms for global optimization, upper bounds on the global optimum are obtained by evaluating the objective at an approximate optimizer; the upper bounds are then used to eliminate subregions of the search space. For constrained optimization, in general, a small region must be constructed within which existence of a feasible point can be proven, and an upper bound on the objective over that region is obtained. We had previously proposed a perturbation technique for constructing such a region. In this work, we propose a much simplified and improved technique, based on an orthogonal decomposition of the normal space to the constraints. In purely inequality constrained problems, a point, rather than a region, can be used, and, for equality and inequality constrained problems, the region lies in a smaller-dimensional subspace, giving rise to sharper upper bounds. Numerical experiments on published test sets for global optimization provide evidence of the superiority of the new approach within our GlobSol environment.
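The role of the normal space can be illustrated by a (non-validated) feasibility-restoration step: corrections to an approximate optimizer are confined to range(h'(x)ᵀ), the normal space of the equality constraints. The constraint and starting point below are invented for illustration; the paper's actual procedure additionally proves existence of a feasible point with interval arithmetic.

```python
import numpy as np

# Sketch only: pull an approximate optimizer back toward the feasible set
# by Newton steps in the normal space range(h'(x)^T) of the constraints.
def h(x):                      # toy equality constraint: unit circle
    return np.array([x[0]**2 + x[1]**2 - 1.0])

def jac(x):                    # Jacobian h'(x), a 1x2 matrix here
    return np.array([[2*x[0], 2*x[1]]])

x = np.array([0.8, 0.7])       # approximate optimizer, slightly infeasible
for _ in range(5):
    A = jac(x)
    # minimum-norm correction A^T (A A^T)^{-1} h(x): lies in the normal space
    x = x - A.T @ np.linalg.solve(A @ A.T, h(x))
print(h(x))  # residual reduced essentially to roundoff
```

Because the correction has no tangential component, it does not drift along the constraint surface, which is what makes the resulting small region (or point) usable for a sharp upper bound.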
The Optimization Test Environment
"... Testing is a crucial part of software development in general, and hence also in mathematical programming. Unfortunately, it is often a time consuming and little exciting activity. This naturally motivated us to increase the e ciency in testing solvers for optimization problems and to automatize as m ..."
Abstract

Cited by 3 (3 self)
Testing is a crucial part of software development in general, and hence also in mathematical programming. Unfortunately, it is often a time-consuming and not very exciting activity. This naturally motivated us to increase the efficiency of testing solvers for optimization problems and to automate as much of the procedure as possible.
Keywords: test environment, optimization, solver benchmarking, solver comparison
The testing procedure typically consists of three basic tasks: a) organize test problem sets, also called test libraries; b) solve selected test problems with selected solvers; c) analyze, check and compare the results. The Test Environment is a graphical user interface (GUI) that enables the user to manage tasks a) and b) interactively, and task c) automatically. The Test Environment is particularly designed for users who seek to 1. adjust solver parameters, or 2. compare solvers on single problems, or 3. evaluate solvers on suitable test sets.
Improving interval enclosures
, 2009
"... This paper serves as background information for the Vienna proposal for interval standardization, explaining what is needed in practice to make competent use of the interval arithmetic provided by an implementation of the standard to be. Discussed are methods to improve the quality of interval encl ..."
Abstract

Cited by 2 (0 self)
This paper serves as background information for the Vienna proposal for interval standardization, explaining what is needed in practice to make competent use of the interval arithmetic provided by an implementation of the standard-to-be. Discussed are methods to improve the quality of interval enclosures of the range of a function over a box, considerations of possible hardware support facilitating the implementation of such methods, and the results of a simple interval challenge that I had posed to the reliable computing mailing list on November 26, 2008. Also given is an example of a bound constrained global optimization problem in 4 variables that has a 2-dimensional continuum of global minimizers. This makes standard branch and bound codes extremely slow, and therefore may serve as a useful degenerate test problem.
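One of the simplest enclosure-improving methods is subdivision: naive interval evaluation overestimates the range because each occurrence of a variable is treated independently, and splitting the box shrinks that overestimation. The sketch below is an invented minimal example, not code from the proposal.

```python
# Minimal interval-arithmetic sketch (intervals as (lo, hi) tuples):
# naive evaluation of f(x) = x*(1 - x) on [0, 1] vs. a subdivision-based
# enclosure. The true range is [0, 0.25].
def imul(a, b):            # interval product [a]*[b]
    ps = [a[0]*b[0], a[0]*b[1], a[1]*b[0], a[1]*b[1]]
    return (min(ps), max(ps))

def isub(c, a):            # c - [a] for a scalar c
    return (c - a[1], c - a[0])

def f_enc(x):              # natural interval extension of f(x) = x*(1-x)
    return imul(x, isub(1.0, x))

def split_enc(x, n):       # union of enclosures over n subintervals
    lo, hi = x
    w = (hi - lo) / n
    parts = [f_enc((lo + i*w, lo + (i+1)*w)) for i in range(n)]
    return (min(p[0] for p in parts), max(p[1] for p in parts))

print(f_enc((0.0, 1.0)))         # (0.0, 1.0): crude overestimate
print(split_enc((0.0, 1.0), 8))  # noticeably tighter enclosure
```

Centered forms, slopes, and the other techniques the paper discusses tighten enclosures further without the exponential cost of fine subdivision in many variables.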
Capabilities of Constraint Programming in Safe Global Optimization
"... We investigate the capabilities of constraints programming techniques in rigorous global optimization methods. We introduce different constraint programming techniques to reduce the gap between efficient but unsafe systems like Baron 1, and safe but slow global optimization approaches. We show how c ..."
Abstract

Cited by 2 (0 self)
We investigate the capabilities of constraint programming techniques in rigorous global optimization methods. We introduce different constraint programming techniques to reduce the gap between efficient but unsafe systems like Baron, and safe but slow global optimization approaches. We show how constraint programming filtering techniques can be used to implement optimality-based reduction in a safe and efficient way, and thus to take advantage of the known bounds of the objective function to reduce the domains of the variables, and to speed up the search for a global optimum. We describe an efficient strategy to compute very accurate approximations of feasible points. This strategy takes advantage of the Newton method for under-constrained systems of equalities and inequalities to compute a promising upper bound efficiently. Experiments on the COCONUT benchmarks demonstrate that these different techniques drastically improve performance.
* This paper is an extended version of [17]. Preliminary results have been published in [10] and [5].
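The idea behind optimality-based reduction can be shown on a toy problem: any part of a variable's domain where a lower bound on the objective already exceeds the best known upper bound cannot contain a global optimum and may be filtered out. This sketch is an invented 1-D illustration, not the paper's constraint-programming filtering.

```python
# Optimality-based domain reduction, toy version: f(x) = (x - 3)^2 on
# the domain [0, 10], with a known upper bound f_ub on the optimum.
def f_lower(lo, hi):          # interval lower bound of f on [lo, hi]
    if lo <= 3.0 <= hi:
        return 0.0
    return min((lo - 3.0)**2, (hi - 3.0)**2)

f_ub = 1.0                    # e.g. f evaluated at some feasible point
boxes = [(i*0.5, (i+1)*0.5) for i in range(20)]   # [0, 10] split into 20
kept = [b for b in boxes if f_lower(*b) <= f_ub]  # discard hopeless boxes
print(kept[0][0], kept[-1][1])  # reduced domain: [1.5, 4.5], down from [0, 10]
```

The safety concern the paper addresses is that such reductions must remain rigorous under rounding: the lower bounds have to be computed with outward-rounded (interval) arithmetic, otherwise a box containing the optimum could be wrongly discarded.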
Global Nonlinear Programming with possible infeasibility and finite termination
, 2012
"... In a recent paper, Birgin, Floudas and Martínez introduced an augmented Lagrangian method for global optimization. In their approach, augmented Lagrangian subproblems are solved using the αBB method and convergence to global minimizers was obtained assuming feasibility of the original problem. In th ..."
Abstract

Cited by 1 (0 self)
In a recent paper, Birgin, Floudas and Martínez introduced an augmented Lagrangian method for global optimization. In their approach, augmented Lagrangian subproblems are solved using the αBB method and convergence to global minimizers was obtained assuming feasibility of the original problem. In the present research, the algorithm mentioned above will be improved in several crucial aspects. On the one hand, feasibility of the problem will not be required. Possible infeasibility will be detected in finite time by the new algorithms and optimal infeasibility results will be proved. On the other hand, finite termination results that guarantee optimality and/or feasibility up to any required precision will be provided. An adaptive modification in which subproblem tolerances depend on current feasibility and complementarity will also be given. The adaptive algorithm allows the augmented Lagrangian subproblems to be solved without requiring unnecessarily high precisions in the intermediate steps of the method, which improves the overall efficiency. Experiments showing how the new algorithms and results are related to practical computations will be given.
A modeling system for mathematics
"... This project aims at the development of a flexible modeling system for the specification of models for largescale numerical work in optimization, data analysis, and partial differential equations. Its input should be provided in a form natural for the working mathematician, while the choice of the ..."
Abstract
This project aims at the development of a flexible modeling system for the specification of models for large-scale numerical work in optimization, data analysis, and partial differential equations. Its input should be provided in a form natural for the working mathematician, while the choice of the numerical solvers and the transformation to the format required by the solvers is done by the interface system. The input format should combine the simplicity of LaTeX source code with the semantic conciseness and modularity of current modeling languages such as AMPL, and it should be as close as possible to the mathematical language people use to explain and communicate their models in publications and lectures. For the system to be useful for the intended applications, interfaces translating a model formulated in the proposed system into the input required by current state-of-the-art solvers, and into the dominant current modeling languages, are needed and shall be provided. Moreover, certain shortcomings of the current generation of modeling languages, such as the lack of support for the correct treatment of uncertainties and rounding errors, shall be overcome. The experience gained in this project will be useful in future work in the more general context