Results 1-10 of 10
Complete search in continuous global optimization and constraint satisfaction
Acta Numerica 13, 2004
"... A chapter for ..."
Interval Analysis on Directed Acyclic Graphs for Global Optimization
J. Global Optimization, 2004
Cited by 39 (8 self)
Abstract: A directed acyclic graph (DAG) representation of optimization problems represents each variable, each operation, and each constraint in the problem formulation by a node of the DAG, with edges representing the flow of the computation.
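The DAG idea described in this abstract can be sketched in a few lines of Python. This is an illustrative toy (the class and method names are invented here, not the paper's representation): each variable, constant, operation, and constraint is a node, and shared subexpressions make the structure a DAG rather than a tree.

```python
# Minimal sketch of a DAG for  min x*y  s.t.  x + y <= 4  (illustrative only):
# every variable, operation, and constraint is a node; edges are the children.

class Node:
    def __init__(self, op, children=(), value=None):
        self.op = op                 # 'var', 'const', '+', '*', '<='
        self.children = list(children)
        self.value = value           # name for variables, number for constants

    def eval(self, env):
        if self.op == 'var':
            return env[self.value]
        if self.op == 'const':
            return self.value
        a, b = (c.eval(env) for c in self.children)
        if self.op == '+':
            return a + b
        if self.op == '*':
            return a * b
        if self.op == '<=':
            return a <= b
        raise ValueError(self.op)

x = Node('var', value='x')
y = Node('var', value='y')           # x and y are shared nodes: a DAG, not a tree
objective = Node('*', (x, y))
constraint = Node('<=', (Node('+', (x, y)), Node('const', value=4)))

env = {'x': 1.0, 'y': 2.0}
print(objective.eval(env))           # 2.0
print(constraint.eval(env))          # True
```

Because the variable nodes are shared by the objective and the constraint, a propagation or interval-evaluation pass can visit each node once, which is the point of the representation.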
Snobfit – Stable Noisy Optimization by Branch and Fit
Cited by 25 (5 self)
Abstract: This paper produces a user-specified number of suggested evaluation points in each step; proceeds by successive partitioning of the box (branch) and building local quadratic models (fit); combines local and global search and allows the user to determine which of both should be emphasized; handles local search from the best point with the aid of trust regions; allows for hidden constraints and assigns to such points a function value based on the function values of nearby feasible points.
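The "fit" half of branch-and-fit can be illustrated by its simplest one-dimensional analogue: interpolate a quadratic through three sampled points and propose the model minimizer as the next evaluation point. This is only a toy sketch of the idea, not the Snobfit algorithm itself.

```python
# Toy 1-D "fit" step (illustrative, not Snobfit): interpolate a quadratic
# through three sampled (x, f(x)) pairs via divided differences and suggest
# its minimizer as the next evaluation point.

def quadratic_minimizer(pts):
    (x0, f0), (x1, f1), (x2, f2) = pts
    d1 = (f1 - f0) / (x1 - x0)
    d2 = ((f2 - f1) / (x2 - x1) - d1) / (x2 - x0)
    a = d2                          # leading coefficient of the model
    b = d1 - d2 * (x0 + x1)         # linear coefficient of the model
    if a <= 0:
        return None                 # no interior minimizer; branch instead
    return -b / (2 * a)

# Samples of f(x) = (x - 1)^2 + 3; the quadratic model recovers x* = 1 exactly.
pts = [(0.0, 4.0), (2.0, 4.0), (3.0, 7.0)]
print(quadratic_minimizer(pts))     # 1.0
```

When the fitted model is not convex (a <= 0), a method like this falls back to branching, which is exactly the interplay of local and global search the abstract describes.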
Comparing Partial Consistencies
1999
Cited by 22 (4 self)
Abstract: Global search algorithms have been widely used in the constraint programming framework to solve constraint systems over continuous domains. This paper precisely states the relations among the different partial consistencies which are the main emphasis of these algorithms. The ...
Taylor Forms – Use and Limits
Reliable Computing, 2002
Cited by 6 (0 self)
Abstract: This review is a response to recent discussions on the reliable computing mailing list, and to continuing uncertainties about the properties and merits of Taylor forms, multivariate higher-degree generalizations of centered forms. They were invented around 1980 by Lanford, documented in detail in 1984 by Eckmann, Koch and Wittwer, and independently studied and popularized since 1996 by Berz, Makino and Hoefkens. A highlight is their application to the verified integration of asteroid dynamics in the solar system in 2001, although the details given are not sufficient to check the validity of their claims.
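A centered form, of which Taylor forms are higher-degree generalizations, can be shown in miniature with the first-order mean-value form f(c) + f'(X)(X - c). The sketch below (helper names are invented for illustration) compares it with naive interval evaluation of f(x) = x - x^2 on [0, 1], where the two occurrences of x cause the naive enclosure to overestimate.

```python
# Sketch: naive interval evaluation vs. the first-order centered (mean-value)
# form for f(x) = x - x^2 on X = [0, 1]. Intervals are (lo, hi) tuples;
# directed rounding is omitted for brevity.

def imul(a, b):
    ps = [a[0]*b[0], a[0]*b[1], a[1]*b[0], a[1]*b[1]]
    return (min(ps), max(ps))

def isub(a, b):
    return (a[0] - b[1], a[1] - b[0])

X = (0.0, 1.0)

# Naive evaluation treats the two occurrences of x independently.
naive = isub(X, imul(X, X))                      # (-1.0, 1.0)

# Mean-value form around the midpoint c = 0.5: f(c) + f'(X) * (X - c).
c = 0.5
fc = c - c*c                                     # 0.25
dfX = isub((1.0, 1.0), imul((2.0, 2.0), X))      # f'(x) = 1 - 2x over X
Xc = isub(X, (c, c))                             # (-0.5, 0.5)
lo, hi = imul(dfX, Xc)
centered = (fc + lo, fc + hi)                    # (-0.25, 0.75)

print(naive, centered)   # centered is tighter; the true range is [0, 0.25]
```

Higher-degree Taylor forms carry more terms of the expansion and enclose the remainder, shrinking the overestimation further on narrow boxes.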
Constraint propagation on quadratic constraints
2008
Cited by 6 (2 self)
Abstract: This paper considers constraint propagation methods for continuous constraint satisfaction problems consisting of linear and quadratic constraints. All methods can be applied after suitable preprocessing to arbitrary algebraic constraints. The basic new techniques consist in eliminating bilinear entries from a quadratic constraint, and solving the resulting separable quadratic constraints by means of a sequence of univariate quadratic problems. Care is taken to ensure that all methods correctly account for rounding errors in the computations.
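The univariate step this abstract mentions can be sketched directly: given a separable constraint a*x^2 + b*x + c <= 0, intersect the current domain of x with the solution interval of the quadratic. The function name and structure below are invented for illustration, and the outward rounding that the paper insists on is deliberately omitted.

```python
import math

# Sketch of one univariate narrowing step (illustrative; no directed rounding):
# intersect [lo, hi] with {x : a*x^2 + b*x + c <= 0} for a > 0, where the
# solution set is the interval between the two real roots.

def narrow(a, b, c, lo, hi):
    disc = b*b - 4*a*c
    if disc < 0:
        return None                        # quadratic never <= 0: domain empty
    r = math.sqrt(disc)
    x1 = (-b - r) / (2*a)
    x2 = (-b + r) / (2*a)
    lo2, hi2 = max(lo, x1), min(hi, x2)
    return (lo2, hi2) if lo2 <= hi2 else None

# x^2 - 5x + 6 <= 0 holds exactly on [2, 3]; propagation shrinks [0, 10] to it.
print(narrow(1.0, -5.0, 6.0, 0.0, 10.0))   # (2.0, 3.0)
```

A rigorous implementation would round the computed roots outward so that no feasible point is ever cut off, which is the rounding-error care the abstract refers to.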
An interval partitioning approach for continuous constrained optimization
Models and Algorithms in Global Optimization, 2006
Cited by 3 (3 self)
Abstract: Constrained Optimization Problems (COPs) are encountered in many scientific fields concerned with industrial applications such as kinematics, chemical process optimization, molecular design, etc. When nonlinear relationships among variables are defined by problem constraints resulting in nonconvex feasible sets, the problem of identifying feasible solutions may become very hard. Consequently, finding the location of the global optimum in the COP is more difficult as compared to bound-constrained global optimization problems. This chapter proposes a new interval partitioning method for solving the COP. The proposed approach involves a new subdivision direction selection method as well as an adaptive search tree framework where nodes (boxes defining different variable domains) are explored using a restricted hybrid depth-first and best-first branching strategy. This hybrid approach is also used for activating local search in boxes with the aim of identifying different feasible stationary points. The proposed search tree management approach improves the convergence speed of the interval partitioning method, which is also supported by the new parallel subdivision direction selection rule.
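The skeleton that all such interval partitioning methods share, bisect boxes and discard those whose interval lower bound exceeds the best known upper bound, fits in a few lines. This is a bare unconstrained sketch for intuition only, not the chapter's method (which adds constraint handling, subdivision direction rules, and hybrid tree search).

```python
# Minimal interval branch-and-bound sketch (illustrative only): minimize
# f(x) = x^2 - x on [-1, 2] by bisecting boxes and pruning those whose
# interval lower bound exceeds the best upper bound found so far.

def bounds(lo, hi):
    # Naive interval enclosure of x^2 - x on [lo, hi].
    if lo <= 0.0 <= hi:
        sq = (0.0, max(lo*lo, hi*hi))
    else:
        sq = (min(lo*lo, hi*hi), max(lo*lo, hi*hi))
    return sq[0] - hi, sq[1] - lo      # lower/upper bound of x^2 - x

best_ub = float('inf')
work = [(-1.0, 2.0)]
while work:
    lo, hi = work.pop()
    f_lo, _ = bounds(lo, hi)
    if f_lo > best_ub:
        continue                        # box cannot contain the global minimum
    mid = 0.5 * (lo + hi)
    best_ub = min(best_ub, mid*mid - mid)   # f(midpoint) is a valid upper bound
    if hi - lo > 1e-6:
        work += [(lo, mid), (mid, hi)]

print(round(best_ub, 4))               # -0.25, the true minimum at x = 0.5
```

The choices the chapter studies, which box to split, in which direction, and whether to explore depth-first or best-first, all plug into this loop and determine how fast the pruning takes effect.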
The NOP2 modeling language
1998
Cited by 1 (1 self)
Abstract: This paper describes the modeling language NOP2 for specifying general optimization problems, including constrained local or global nonlinear programs and constrained stochastic single- and multistage programs. The proposed language is specifically designed to represent the internal (partially separable, repetitive, or sparse) structure of the problem. Thus it enables the model builder to step back somewhat from the pure meaning of the problem, to focus on the mathematical content, and to rewrite it in a form that is better adapted to global and large-scale optimization algorithms that can exploit such structure. Models described in NOP2 do not depend on the computer architecture; models developed on personal computers can later be solved on high-speed workstations or supercomputers. The language also has features for checking the correct specification of NOP2 files. In contrast to the SIF input format (cf. Section 4.1 below) proposed by Conn, Gould, and Toint [3] for their LANCELOT package, the amount of overhead in the formulation of smaller problems is very small: for example, Rosenbrock's function (whose SIF description takes almost a page) can be represented in a few lines in such a way that the least-squares structure is visible in the representation. Together with planned and partly finished interfaces to the modeling languages AMPL (Fourer, Gay & Kernighan [6]) and GAMS (Brooke, Kendrick, Meeraus [1]), to our new global optimization code GLOPT [4], and to the optimization package MINOS (Murtagh & Saunders [11]), this is a promising tool for the formulation and solution of various types of optimization problems. Each NOP2 file consists of a sequence of records describing an optimization problem of one of the following forms. (i) Nonlinear programs: min f(x) ...
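The least-squares structure of Rosenbrock's function that the abstract says NOP2 can expose is easy to see when the function is written as a sum of squared residuals. The sketch below is plain Python, not NOP2 syntax (which only the paper itself defines); it merely shows the structure such a format makes visible.

```python
# Not NOP2 syntax; this only exhibits the least-squares structure of
# Rosenbrock's function: f(x1, x2) = r1^2 + r2^2 with residuals
# r1 = 10*(x2 - x1^2) and r2 = 1 - x1.

def residuals(x1, x2):
    return (10.0 * (x2 - x1 * x1), 1.0 - x1)

def rosenbrock(x1, x2):
    r1, r2 = residuals(x1, x2)
    return r1 * r1 + r2 * r2          # equals 100*(x2 - x1^2)^2 + (1 - x1)^2

print(rosenbrock(1.0, 1.0))           # 0.0 -- the global minimum at (1, 1)
```

A solver that sees the residual structure, rather than the expanded quartic, can exploit it (e.g. Gauss-Newton steps), which is why a format preserving that structure matters.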
NOP – a compact input format for nonlinear optimization problems
1996
Abstract: This paper defines a compact format for specifying general constrained nonlinear optimization problems. The proposed format is a nonlinear analogue of an explicit representation of sparse matrices by means of index lists and values of the corresponding matrix entries. Thus the format abstracts from the meaning of the problem and hence does not allow names for variables or parameters, but it explicitly displays the internal structure of the problem. This is a very useful feature for global or large-scale local optimization.
Key words: large-scale optimization, global optimization, nonlinear programming, test problems, input format
1991 MSC codes: 90C30
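The sparse-matrix representation that NOP generalizes, index lists plus the corresponding values, is the familiar coordinate (COO) scheme. A minimal sketch (variable names are illustrative):

```python
# Coordinate (COO) sparse representation: a matrix is stored as parallel
# index lists plus the nonzero values -- the scheme the NOP format is a
# nonlinear analogue of.

rows = [0, 0, 2]
cols = [0, 2, 1]
vals = [3.0, 1.5, -2.0]      # A[0,0] = 3, A[0,2] = 1.5, A[2,1] = -2 (3x3)

def matvec(rows, cols, vals, x, n):
    """Multiply the COO matrix by vector x, touching only the nonzeros."""
    y = [0.0] * n
    for i, j, v in zip(rows, cols, vals):
        y[i] += v * x[j]
    return y

print(matvec(rows, cols, vals, [1.0, 1.0, 1.0], 3))   # [4.5, 0.0, -2.0]
```

Just as this stores only where the nonzeros are and what they equal, the NOP format stores only which variables enter each nonlinear term and with what data, exposing the problem's internal structure without naming anything.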