Results 1–10 of 40
Global optimization by multilevel coordinate search
 J. Global Optimization
, 1999
Abstract

Cited by 73 (11 self)
Abstract. Inspired by a method by Jones et al. (1993), we present a global optimization algorithm based on multilevel coordinate search. It is guaranteed to converge if the function is continuous in the neighborhood of a global minimizer. By starting a local search from certain good points, an improved convergence result is obtained. We discuss implementation details and give some numerical results.
Complete search in continuous global optimization and constraint satisfaction
 Acta Numerica 13
, 2004
"... A chapter for ..."
On the Selection of Subdivision Directions in Interval Branch-and-Bound Methods for Global Optimization
 J. Global Optimization
, 1995
Abstract

Cited by 30 (13 self)
This paper investigates the influence of the interval subdivision selection rule on the convergence of interval branch-and-bound algorithms for global optimization. For the class of rules that allows convergence, we study the effects of the rules on a model algorithm with special list ordering. Four different rules are investigated in theory and in practice. A wide spectrum of test problems is used for numerical tests indicating that there are substantial differences between the rules with respect to the required CPU time, the number of function and derivative evaluations, and the necessary storage space. Two rules can provide considerable improvements in efficiency for our model algorithm.
Keywords: global optimization, interval arithmetic, branch-and-bound, interval subdivision
1. Introduction
The investigated class of interval branch-and-bound methods for global optimization [7], [8], [19] addresses the problem of finding guaranteed and reliable solutions of global optimization...
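The model algorithm studied in this paper is not reproduced in the abstract, but the generic interval branch-and-bound loop it builds on can be sketched in a few lines. The toy below is our own illustration, not the paper's code: a 1-D best-first loop with a hand-derived inclusion function for one fixed objective (the names f, F and interval_bb are ours). In n dimensions, the bisection step at the end is exactly where a subdivision-direction selection rule would plug in.

```python
import heapq

def f(x):
    """Objective: f(x) = x^2 - 2x, with global minimum f(1) = -1."""
    return x * x - 2.0 * x

def F(lo, hi):
    """Natural interval extension of f on [lo, hi]: an enclosure of its range."""
    sq_lo, sq_hi = min(lo * lo, hi * hi), max(lo * lo, hi * hi)
    if lo <= 0.0 <= hi:          # x^2 attains 0 inside the interval
        sq_lo = 0.0
    return sq_lo - 2.0 * hi, sq_hi - 2.0 * lo

def interval_bb(lo, hi, tol=1e-8):
    """Best-first interval branch-and-bound; returns (midpoint, lower bound)."""
    best_ub = f(0.5 * (lo + hi))             # upper bound from a sampled point
    heap = [(F(lo, hi)[0], lo, hi)]          # boxes ordered by their lower bound
    while heap:
        flo, a, b = heapq.heappop(heap)
        if flo > best_ub:                    # cut-off test: box cannot contain the minimum
            continue
        m = 0.5 * (a + b)
        best_ub = min(best_ub, f(m))
        if b - a < tol:                      # smallest lower bound on a tiny box: done
            return m, flo
        for c, d in ((a, m), (m, b)):        # bisection; in n dimensions a subdivision
            heapq.heappush(heap, (F(c, d)[0], c, d))   # rule picks the coordinate to split
```

The subdivision rules compared in the paper differ in how they rank coordinate directions (e.g. by width or by estimated effect on the enclosure); the list ordering corresponds to the heap policy above.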
A Review Of Techniques In The Verified Solution Of Constrained Global Optimization Problems
, 1996
Abstract

Cited by 25 (6 self)
Elements and techniques of state-of-the-art automatically verified constrained global optimization algorithms are reviewed, including a description of ways of rigorously verifying feasibility for equality constraints and a careful consideration of the role of active inequality constraints. Previously developed algorithms and general work on the subject are also listed. Limitations of present knowledge are mentioned, and advice is given on which techniques to use in various contexts. Applications are discussed.
1 INTRODUCTION, BASIC IDEAS AND LITERATURE
We consider the constrained global optimization problem
minimize φ(X) subject to c_i(X) = 0, i = 1, …, m, (1.1)
a_{i_j} ≤ x_{i_j} ≤ b_{i_j}, j = 1, …, q,
where X = (x_1, …, x_n)^T. A general constrained optimization problem, including inequality constraints g(X) ≤ 0, can be put into this form by introducing slack variables s, replacing g(X) ≤ 0 by s + g(X) = 0, and appending the bound constraint 0 ≤ s < ∞; see §2.2. ...
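The slack-variable transformation mentioned above is mechanical enough to show directly. A minimal sketch under our own naming (the helper to_equality and the example constraint are illustrative, not from the review): an inequality g(X) ≤ 0 becomes the equality g(X) + s = 0 together with the bound 0 ≤ s < ∞ on the new variable.

```python
import math

def to_equality(g):
    """Turn the inequality g(X) <= 0 into the equality h(X, s) = g(X) + s = 0
    plus the bound 0 <= s < inf on the new slack variable s."""
    def h(X, s):
        return g(X) + s
    return h, (0.0, math.inf)

# Example: g(X) = X - 3 <= 0 holds at X = 2, with slack s = -g(2) = 1.
g = lambda X: X - 3.0
h, (s_lo, s_hi) = to_equality(g)
assert h(2.0, 1.0) == 0.0 and s_lo <= 1.0 < s_hi
```

A solver that handles only equality constraints plus bound constraints, as in problem (1.1), can then treat the transformed inequality uniformly.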
Efficient solving of quantified inequality constraints over the real numbers
 ACM Transactions on Computational Logic
Abstract

Cited by 25 (7 self)
Let a quantified inequality constraint over the reals be a formula in the first-order predicate language over the structure of the real numbers, where the allowed predicate symbols are ≤ and <. Solving such constraints is an undecidable problem when function symbols such as sin or cos are allowed. In the paper we give an algorithm that terminates with a solution for all but very special, pathological inputs. We ensure the practical efficiency of this algorithm by employing constraint programming techniques.
Global Optimization in Parameter Estimation of Nonlinear Algebraic Models via the Error-in-Variables Approach
 Ind. Eng. Chem. Res
, 1998
Abstract

Cited by 17 (3 self)
The estimation of parameters in nonlinear algebraic models through the error-in-variables method has been widely studied from a computational standpoint. The method involves the minimization of a weighted sum of squared errors subject to the model equations. Due to the nonlinear nature of the models used, the resulting formulation is nonconvex and may contain several local minima in the region of interest. Current methods tailored for this formulation, although computationally efficient, can only attain convergence to a local solution. In this paper, a global optimization approach based on a branch-and-bound framework and convexification techniques for general twice-differentiable nonlinear optimization problems is proposed for the parameter estimation of nonlinear algebraic models. The proposed convexification techniques exploit the mathematical properties of the formulation. Classical nonlinear estimation problems were solved and will be used to illustrate the various theoretical an...
GLOPT – A Program for Constrained Global Optimization
 Developments in Global Optimization
, 1996
Abstract

Cited by 15 (7 self)
GLOPT is a Fortran 77 program for global minimization of a block-separable objective function subject to bound constraints and block-separable constraints. It finds a nearly globally optimal point that is near a true local minimizer. Unless there are several local minimizers that are nearly global, we thus find a good approximation to the global minimizer. GLOPT uses a branch-and-bound technique to split the problem recursively into subproblems that are either eliminated or reduced in size. This is done by extensive use of the block-separable structure of the optimization problem. In this paper we discuss a new reduction technique for boxes and new ways of generating feasible points of constrained nonlinear programs. These are implemented as the first stage of our GLOPT project. The current implementation of GLOPT uses neither derivatives nor simultaneous information about several constraints. Numerical results are already encouraging. Work on an extension using curvature inf...
A new multisection technique in interval methods for global optimization
 Computing
, 2000
Abstract

Cited by 14 (8 self)
A new multisection technique in interval methods for global optimization is investigated, and numerical tests demonstrate that the efficiency of the underlying global optimization method can be improved substantially. The heuristic rule is based on experience suggesting that the current subinterval should be subdivided into a larger number of pieces only if it is located in the neighbourhood of a minimizer point. An estimator of the proximity of a subinterval to the region of attraction of a minimizer point is utilized. According to the numerical study, the new multisection strategies seem to be indispensable and can improve both the computational and the memory complexity substantially.
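As an illustration of the adaptive idea only (not the paper's tuned rule), a box can be cut into more pieces when some proximity indicator flags it as close to a minimizer, and merely bisected otherwise. The function name and the piece counts 2 and 4 below are arbitrary choices of ours.

```python
def multisect(lo, hi, near_minimizer):
    """Split [lo, hi] into 4 equal pieces near a suspected minimizer,
    otherwise plain bisection into 2 pieces."""
    k = 4 if near_minimizer else 2   # more pieces only near a minimizer
    w = (hi - lo) / k
    return [(lo + i * w, lo + (i + 1) * w) for i in range(k)]

# far from a minimizer: ordinary bisection
print(multisect(0.0, 1.0, False))   # [(0.0, 0.5), (0.5, 1.0)]
```

The payoff described in the abstract comes from spending the extra subdivisions only where they help, so the total number of boxes stays manageable.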
A New Verified Optimization Technique For The "Packing Circles In A Unit Square" Problems
 SIAM Journal on Optimization
, 1994
Abstract

Cited by 13 (4 self)
The paper presents a new verified optimization method for the problem of finding the densest packing of non-overlapping equal circles within a square. In order to provide reliable numerical results, the developed algorithm is based on interval analysis. As one of the most efficient parts of the algorithm, an interval-based version of a previous elimination procedure is introduced. This method represents the remaining areas still of interest as polygons calculated fully in a reliable way. The most promising strategy for finding optimal circle packing configurations is currently the partitioning of the original problem into subproblems. However, as a result of the rapidly increasing number of subproblems, earlier computer-aided methods were not able to solve problem instances where the number of circles was greater than 27. The present paper provides a carefully developed technique that resolves this difficulty by eliminating large groups of subproblems together. As a demonstration of the capabilities of the new algorithm, the problems of packing 28, 29, and 30 circles were solved within very tight tolerance values. Our verified procedure decreased the uncertainty in the location of the optimal packings by more than 700 orders of magnitude in all cases.
A Heuristic Rejection Criterion in Interval Global Optimization Algorithms
, 1999
Abstract

Cited by 11 (7 self)
This paper investigates the properties of the inclusion functions on subintervals while a branch-and-bound algorithm is solving global optimization problems. It has been found that the relative position of the global minimum value within the inclusion interval of the objective function on the current subinterval largely indicates whether the given interval is close to a minimizer point. This information is used in a heuristic interval rejection rule that can save a large amount of computation. Illustrative examples are discussed and a numerical study completes the investigation.
AMS subject classification: 65K, 90C.
Key words: global optimization, branch-and-bound algorithm, inclusion function.
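One plausible form of the indicator described above can be sketched as the relative position of the best known function value f_best inside the range enclosure [F_lo, F_hi] of the objective on a subinterval. The function names, the rejection direction, and the cut-off value 0.1 below are illustrative assumptions of ours, not the paper's calibrated rule.

```python
def relative_place(f_best, F_lo, F_hi):
    """Relative position of the best known value f_best within the
    range enclosure [F_lo, F_hi] of the objective on a subinterval."""
    if F_hi == F_lo:
        return 0.0
    return (f_best - F_lo) / (F_hi - F_lo)

def reject(f_best, F_lo, F_hi, threshold=0.1):
    """Heuristic rejection (assumed direction and threshold): discard a box
    when f_best sits in the lowest tenth of its enclosure, i.e. the enclosure
    reaches far above the best value found so far."""
    return relative_place(f_best, F_lo, F_hi) < threshold

# wide enclosure reaching far above f_best: indicator small, box discarded
print(reject(-1.0, -1.1, 0.9))   # True
```

Such a rule is heuristic: unlike the classical cut-off test (discard when F_lo exceeds the best known value), it may in principle discard a box containing the minimizer, which is the trade-off the paper's numerical study examines.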