Results 1–8 of 8

GloptLab, a configurable framework for the rigorous global solution of quadratic constraint satisfaction problems
Cited by 12 (8 self)
Improving interval enclosures, 2009
Cited by 3 (0 self)
Abstract: This paper serves as background information for the Vienna proposal for interval standardization, explaining what is needed in practice to make competent use of the interval arithmetic provided by an implementation of the standard-to-be. Discussed are methods to improve the quality of interval enclosures of the range of a function over a box, considerations of possible hardware support facilitating the implementation of such methods, and the results of a simple interval challenge that I had posed to the reliable computing mailing list on November 26, 2008. Also given is an example of a bound-constrained global optimization problem in 4 variables that has a 2-dimensional continuum of global minimizers. This makes standard branch and bound codes extremely slow, and it may therefore serve as a useful degenerate test problem.
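The dependency problem mentioned in this abstract is the main reason interval enclosures overestimate the true range: multiple occurrences of the same variable are treated as independent. A minimal sketch (illustration only; the `Interval` class and the example function are my own, and a rigorous implementation would need directed rounding as discussed in the Vienna proposal):

```python
# Minimal interval arithmetic sketch (illustrative; a rigorous
# implementation needs outward/directed rounding).
class Interval:
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def _coerce(self, o):
        return o if isinstance(o, Interval) else Interval(o, o)

    def __add__(self, o):
        o = self._coerce(o)
        return Interval(self.lo + o.lo, self.hi + o.hi)

    def __sub__(self, o):
        o = self._coerce(o)
        return Interval(self.lo - o.hi, self.hi - o.lo)

    def __mul__(self, o):
        o = self._coerce(o)
        ps = [self.lo * o.lo, self.lo * o.hi, self.hi * o.lo, self.hi * o.hi]
        return Interval(min(ps), max(ps))

# Naive evaluation of f(x) = x*x - x over x in [0, 1] overestimates the
# range, because the two occurrences of x are treated independently.
x = Interval(0.0, 1.0)
naive = x * x - x                      # encloses [-1, 1]
# Rewriting as x*(x - 1), with a single occurrence pattern per factor,
# tightens the enclosure:
better = x * (x - Interval(1.0, 1.0))  # encloses [-1, 0]
# The true range of x^2 - x on [0, 1] is [-0.25, 0].
```

Both results are valid enclosures; rewriting the expression simply trades overestimation for none of the rigor, which is one of the enclosure-improving methods the paper discusses.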
Computer-assisted proofs, in IEEE SCAN 2006 proceedings, 2007
Cited by 2 (1 self)
Abstract: This paper discusses the problem of what makes a computer-assisted proof trustworthy, the quest for an algorithmic support system for computer-assisted proofs, relations to global optimization, an analysis of some recent proofs, and some current challenges which appear to be amenable to computer-assisted treatment.
Necessary Global Optimality Conditions for Nonlinear Programming Problems with Polynomial Constraints
Abstract: In this paper, we develop necessary conditions for global optimality that apply to nonlinear programming problems with polynomial constraints, which cover a broad range of optimization problems arising in applications of continuous as well as discrete optimization. In particular, we show that our optimality conditions readily apply to problems where the objective function is the difference of a polynomial and a convex function over polynomial constraints, and to classes of fractional programming problems. Our necessary conditions also become sufficient for global optimality for polynomial programming problems. Our approach makes use of polynomial overestimators and a powerful theorem of the alternative for a system of polynomials from real algebraic geometry. We discuss numerical examples to illustrate the significance of our optimality conditions.
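To see what a necessary condition for *global* (as opposed to local) optimality must rule out, here is a brute-force sketch, not the paper's algebraic conditions: a trivial necessary condition for a candidate x* is that no sampled feasible point has a strictly smaller objective value. The problem instance and function names below are my own for illustration.

```python
# Brute-force sketch (NOT the paper's polynomial-overestimator
# conditions): if any sampled feasible point beats the candidate,
# global optimality is refuted.
def refute_global_min(f, feasible, candidates, x_star, tol=1e-12):
    """Return a feasible sample strictly better than x_star, or None."""
    fx = f(x_star)
    for x in candidates:
        if feasible(x) and f(x) < fx - tol:
            return x
    return None

# Hypothetical polynomial problem: minimize x^4 - 2*x^2 s.t. x^2 <= 4.
f = lambda x: x**4 - 2 * x**2
feasible = lambda x: x**2 <= 4
grid = [i / 100.0 for i in range(-200, 201)]

# x* = 1 survives the sampled test (f(1) = -1 is the global minimum,
# attained also at x = -1), while the stationary point x* = 0 is
# refuted even though it satisfies the local first-order conditions.
assert refute_global_min(f, feasible, grid, 1.0) is None
assert refute_global_min(f, feasible, grid, 0.0) is not None
```

Sampling can only refute, never certify, global optimality; the point of algebraic conditions like those in the paper is to give checkable certificates instead.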
Differential conditions for constrained nonlinear programming via Pareto optimization
Constraint aggregation for rigorous global optimization
Ferenc Domes
Abstract: In rigorous constrained global optimization, upper bounds on the objective function help to reduce the search space. Their construction requires finding a narrow box around an approximately feasible solution, verified to contain a feasible point. Approximations are easily found by local optimization, but the verification often fails. In this paper we show that even if the verification of an approximate feasible point fails, the information extracted from the local optimization can still be used in many cases to reduce the search space. This is done by a rigorous filtering technique called constraint aggregation. It forms an aggregated redundant constraint, based on approximate Lagrange multipliers or on a vector-valued measure of constraint violation. Using the optimality conditions, two-sided linear relaxations, the Gauss-Jordan algorithm and a directed modified Cholesky factorization, the information in the redundant constraint is turned into powerful bounds on the feasible set. Constraint aggregation is especially useful since it also works in a tiny neighborhood of the global optimizer, thereby reducing the cluster effect.
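The core observation behind the aggregated redundant constraint can be sketched in a few lines, under stated assumptions: for constraints g_i(x) ≤ 0 and nonnegative weights λ_i (e.g. approximate Lagrange multipliers), every feasible point also satisfies Σ λ_i g_i(x) ≤ 0. The constraints, points, and multiplier values below are hypothetical; the paper's actual method additionally exploits relaxations and factorizations to propagate the aggregated constraint.

```python
# Toy sketch of constraint aggregation (not the paper's full algorithm):
# any nonnegative combination of constraints g_i(x) <= 0 yields a
# redundant constraint sum_i lam[i]*g_i(x) <= 0 that every feasible
# point must satisfy.
def aggregate(constraints, lam):
    """Return the aggregated constraint x -> sum_i lam[i] * g_i(x)."""
    assert all(l >= 0 for l in lam), "multipliers must be nonnegative"
    return lambda x: sum(l * g(x) for l, g in zip(lam, constraints))

# Hypothetical example: g1 = unit disk, g2 = half-plane x <= y,
# with made-up multipliers standing in for a local solver's output.
g1 = lambda p: p[0] ** 2 + p[1] ** 2 - 1.0   # x^2 + y^2 - 1 <= 0
g2 = lambda p: p[0] - p[1]                   # x - y <= 0
G = aggregate([g1, g2], [1.0, 2.0])

# G gives a necessary condition: feasible points satisfy G(x) <= 0,
# while G(x) > 0 certifies infeasibility, so a box on which G is
# provably positive can be discarded from the search.
print(G((0.0, 0.5)))    # feasible point, value <= 0
print(G((1.0, -1.0)))   # infeasible point, value > 0
```

Collapsing many constraints into one is what makes the filter cheap to apply near the optimizer, where the cluster effect would otherwise force many box subdivisions.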