Results 1–10 of 22
Complete search in continuous global optimization and constraint satisfaction
 Acta Numerica 13
, 2004
"... A chapter for ..."
A Global Optimization Algorithm (GOP) for Certain Classes of Nonconvex NLPs: II. Application of Theory and Test Problems
 Engng
, 1990
"... In Part I (Floudas and Visweswaran, 1990), a deterministic global optimization approach was proposed for solving certain classes of nonconvex optimization problems. An algorithm, GOP, was presented for the rigorous solution of the problem through a series of primal and relaxed dual problems until th ..."
Abstract

Cited by 54 (21 self)
In Part I (Floudas and Visweswaran, 1990), a deterministic global optimization approach was proposed for solving certain classes of nonconvex optimization problems. An algorithm, GOP, was presented for the rigorous solution of the problem through a series of primal and relaxed dual problems until the upper and lower bounds from these problems converged to an ε-global optimum. In this paper, theoretical results are presented for several classes of mathematical programming problems that include: (i) the general quadratic programming problem, (ii) quadratic programming problems with quadratic constraints, (iii) pooling and blending problems, and (iv) unconstrained and constrained optimization problems with polynomial terms in the objective function and/or constraints. For each class, a few examples are presented illustrating the approach. Keywords: Global Optimization, Quadratic Programming, Quadratic Constraints, Polynomial Functions, Pooling and Blending Problems. Author to whom...
A Primal-Relaxed Dual Global Optimization Approach
, 1993
"... A deterministic global optimization approach is proposed for nonconvex constrained nonlinear programming problems. Partitioning of the variables, along with the introduction of transformation variables, if necessary, convert the original problem into primal and relaxed dual subproblems that provide ..."
Abstract

Cited by 41 (19 self)
A deterministic global optimization approach is proposed for nonconvex constrained nonlinear programming problems. Partitioning of the variables, along with the introduction of transformation variables, if necessary, converts the original problem into primal and relaxed dual subproblems that provide valid upper and lower bounds, respectively, on the global optimum. Theoretical properties are presented which allow for a rigorous solution of the relaxed dual problem. Proofs of finite ε-convergence and ε-global optimality are provided. The approach is shown to be particularly suited to (a) quadratic programming problems, (b) quadratically constrained problems, and (c) unconstrained and constrained optimization of polynomial and rational polynomial functions. The theoretical approach is illustrated through a few example problems. Finally, some further developments in the approach are briefly discussed.
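The bounding scheme these abstracts describe, a sequence of upper bounds from primal problems and lower bounds from relaxations, iterated until the gap closes to ε, can be illustrated with a generic branch-and-bound sketch. This is not the GOP algorithm itself (GOP derives its lower bounds from relaxed dual subproblems); the Lipschitz-based box bound below is an illustrative stand-in, and all names are hypothetical.

```python
import heapq

def global_min_bounds(f, box_lower_bound, a, b, eps=1e-3):
    """Generic bounding loop: keep an upper bound (best point found so far)
    and a lower bound (smallest box relaxation) until they are within eps."""
    best_ub = min(f(a), f((a + b) / 2), f(b))
    heap = [(box_lower_bound(a, b), a, b)]
    while heap:
        lb, lo, hi = heapq.heappop(heap)        # box with the smallest lower bound
        if lb > best_ub - eps:                  # global bounds within eps: converged
            return best_ub, lb
        mid = (lo + hi) / 2
        best_ub = min(best_ub, f(mid))          # primal side: tighten the upper bound
        for l, h in ((lo, mid), (mid, hi)):     # branch and re-bound the children
            child_lb = box_lower_bound(l, h)
            if child_lb < best_ub - eps:        # prune boxes that cannot improve
                heapq.heappush(heap, (child_lb, l, h))
    return best_ub, best_ub - eps               # everything pruned: gap is at most eps

# Nonconvex test function with two local minima on [-2, 2].
f = lambda x: x**4 - 3*x**2 + x
# A crude but valid box lower bound: midpoint value minus a Lipschitz term
# (|f'(x)| <= 45 on [-2, 2], so 50 is a safe Lipschitz constant).
box_lb = lambda lo, hi: f((lo + hi) / 2) - 50.0 * (hi - lo) / 2
ub, lb = global_min_bounds(f, box_lb, -2.0, 2.0)  # global minimum is about -3.514
```

The key invariant, shared with the papers above, is that `best_ub` only decreases and the smallest heap entry only increases, so the pair brackets the global optimum at every iteration.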
Interval Analysis on Directed Acyclic Graphs for Global Optimization
 J. Global Optimization
, 2004
"... A directed acyclic graph (DAG) representation of optimization problems represents each variable, each operation, and each constraint in the problem formulation by a node of the DAG, with edges representing the ow of the computation. ..."
Abstract

Cited by 40 (8 self)
A directed acyclic graph (DAG) representation of optimization problems represents each variable, each operation, and each constraint in the problem formulation by a node of the DAG, with edges representing the flow of the computation.
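As a concrete illustration of that representation, here is a minimal sketch (hypothetical names, '+' and '*' only) in which a shared subexpression is a single DAG node and interval bounds are propagated along the flow of the computation:

```python
# Each node of the DAG is a variable, constant, or operation; a shared
# subexpression appears exactly once. Forward interval evaluation then
# propagates bounds along the edges, visiting each node a single time.
class Node:
    def __init__(self, op, children=(), interval=None):
        self.op, self.children, self.interval = op, tuple(children), interval

def add(a, b): return Node('+', (a, b))
def mul(a, b): return Node('*', (a, b))

def eval_interval(node, cache=None):
    cache = {} if cache is None else cache
    if id(node) in cache:                        # shared subexpression: reuse result
        return cache[id(node)]
    if node.op in ('var', 'const'):
        lo, hi = node.interval
    else:
        (a_lo, a_hi), (b_lo, b_hi) = (eval_interval(c, cache) for c in node.children)
        if node.op == '+':
            lo, hi = a_lo + b_lo, a_hi + b_hi
        else:                                    # '*': min/max over endpoint products
            prods = (a_lo*b_lo, a_lo*b_hi, a_hi*b_lo, a_hi*b_hi)
            lo, hi = min(prods), max(prods)
    cache[id(node)] = (lo, hi)
    return lo, hi

# x in [-1, 2]; the subterm x*x is built once and shared by both of its uses.
x = Node('var', interval=(-1.0, 2.0))
sq = mul(x, x)                                   # shared node
expr = add(sq, mul(sq, x))                       # x^2 + x^2 * x
```

Note that naive interval multiplication ignores the dependence between operands (here `x*x` gets the bound [-2, 4] rather than [0, 4]); tightening such bounds is exactly where DAG-based interval analysis improves on expression trees.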
Rigorous Convex Underestimators for General Twice-Differentiable Problems
 Journal of Global Optimization
, 1996
"... . In order to generate valid convex lower bounding problems for nonconvex twicedifferentiable optimization problems, a method that is based on second order information of general twicedifferentiable functions is presented. Using interval Hessian matrices, valid lower bounds on the eigenvalues ..."
Abstract

Cited by 35 (15 self)
In order to generate valid convex lower bounding problems for nonconvex twice-differentiable optimization problems, a method that is based on second-order information of general twice-differentiable functions is presented. Using interval Hessian matrices, valid lower bounds on the eigenvalues of such functions are obtained and used in constructing convex underestimators. By solving several nonlinear example problems, it is shown that the lower bounds are sufficiently tight to ensure satisfactory convergence of the αBB, a branch-and-bound algorithm which relies on this underestimation procedure [3]. Key words: convex underestimators; twice-differentiable; interval analysis; eigenvalues 1. Introduction The mathematical description of many physical phenomena, such as phase equilibrium, or of chemical processes generally requires the introduction of nonconvex functions. As the number of local solutions to a nonconvex optimization problem cannot be predicted a priori, the identifi...
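The construction described above can be sketched in a few lines: a Gershgorin-style lower bound on the eigenvalues of an interval Hessian gives a shift α, and the separable quadratic α Σ (x_i^L − x_i)(x_i^U − x_i), which is nonpositive on the box and zero at its corners, convexifies the function. This is a minimal illustration of the αBB idea under the assumption of a Gershgorin eigenvalue bound, not the paper's exact bounding procedure; the function names are hypothetical.

```python
import numpy as np

def alpha_from_interval_hessian(H_lo, H_hi):
    """Gershgorin lower bound on the smallest eigenvalue over the interval
    Hessian matrix [H_lo, H_hi], then alpha = max(0, -lambda_min / 2)."""
    H_lo, H_hi = np.asarray(H_lo, float), np.asarray(H_hi, float)
    lam_min = np.inf
    for i in range(H_lo.shape[0]):
        off = sum(max(abs(H_lo[i, j]), abs(H_hi[i, j]))
                  for j in range(H_lo.shape[0]) if j != i)
        lam_min = min(lam_min, H_lo[i, i] - off)
    return max(0.0, -0.5 * lam_min)

def underestimator(f, alpha, xL, xU):
    """Convex underestimator L(x) = f(x) + alpha * sum (xL-x)*(xU-x)."""
    xL, xU = np.asarray(xL, float), np.asarray(xU, float)
    return lambda x: f(x) + alpha * np.sum((xL - x) * (xU - x))

# f(x, y) = x*y on [-1, 1]^2: the Hessian is constantly [[0, 1], [1, 0]],
# whose smallest eigenvalue is -1, so alpha = 0.5 suffices.
f = lambda x: x[0] * x[1]
alpha = alpha_from_interval_hessian([[0, 1], [1, 0]], [[0, 1], [1, 0]])
L = underestimator(f, alpha, [-1, -1], [1, 1])
```

With these numbers the underestimator works out to ½(x + y)² − 1: convex, below x·y everywhere on the box, and equal to it at the corners.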
New Properties and Computational Improvement of the GOP Algorithm For Problems With Quadratic Objective Function and Constraints
 Journal of Global Optimization
, 1993
"... In Floudas and Visweswaran (1990, 1992), a deterministic global optimization approach was proposed for solving certain classes of nonconvex optimization problems. An algorithm, GOP, was presented for the solution of the problem through a series of primal and relaxed dual problems that provide valid ..."
Abstract

Cited by 20 (10 self)
In Floudas and Visweswaran (1990, 1992), a deterministic global optimization approach was proposed for solving certain classes of nonconvex optimization problems. An algorithm, GOP, was presented for the solution of the problem through a series of primal and relaxed dual problems that provide valid upper and lower bounds, respectively, on the global solution. The algorithm was proved to have finite convergence to an ε-global optimum. In this paper, new theoretical properties are presented that help to enhance the computational performance of the GOP algorithm applied to problems of special structure. The effect of the new properties is illustrated through application of the GOP algorithm to a difficult indefinite quadratic problem, a multiperiod tankage quality problem that occurs frequently in the modeling of refinery processes, and a set of pooling/blending problems from the literature. In addition, extensive computational experience is reported for randomly generated concave and in...
Crest-factor minimization using nonlinear Chebyshev approximation methods
 IEEE Trans. on Inst. and Meas
, 1991
"... AbstractLow crestfactor of excitation and response signals is desirable in transfer function measurements, since this allows the maximization of the signaltonoise ratios (SNR’s) for given allowable amplitude ranges of the signals. The paper presents a new crestfactor minimization algorithm for ..."
Abstract

Cited by 15 (2 self)
Abstract: Low crest-factor of excitation and response signals is desirable in transfer function measurements, since this allows the maximization of the signal-to-noise ratios (SNRs) for given allowable amplitude ranges of the signals. The paper presents a new crest-factor minimization algorithm for periodic signals with prescribed power spectrum. The algorithm is based on approximation of the nondifferentiable Chebyshev (minimax) norm by lp-norms with increasing values of p, and the calculations are accelerated by using FFTs. Several signals related by linear systems can also be compressed simultaneously. The resulting crest-factors are significantly better than those provided by earlier methods. Moreover, it is shown that the peak value of a signal can be further decreased by allowing some extra energy at additional frequencies. Keywords: Crest-factor, multisine, optimal excitation.
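Two ingredients of that abstract are easy to demonstrate: the crest factor itself (peak value over RMS value), and the fact that a normalized p-norm approaches the Chebyshev norm as p grows, which is what makes the smooth p-norm a usable surrogate for the nondifferentiable minimax norm. The sketch below is illustrative only; it compares classical Schroeder phases against zero phases rather than running the paper's own optimization algorithm.

```python
import numpy as np

def crest_factor(x):
    """Peak value divided by RMS value."""
    return np.max(np.abs(x)) / np.sqrt(np.mean(x**2))

def scaled_pnorm(x, p):
    """(mean |x|^p)^(1/p), computed stably; tends to max|x| as p grows."""
    m = np.max(np.abs(x))
    return m * np.mean((np.abs(x) / m) ** p) ** (1.0 / p)

N, K = 4096, 31                                  # samples, harmonics (flat spectrum)
t = np.arange(N) / N
# Two multisines with identical (flat) power spectrum but different phases:
zero_phase = sum(np.cos(2 * np.pi * k * t) for k in range(1, K + 1))
schroeder = sum(np.cos(2 * np.pi * k * t + np.pi * k * (k - 1) / K)
                for k in range(1, K + 1))
```

Zero phases pile all harmonics up at t = 0 (crest factor about √(2K)), while Schroeder phases spread the energy out in time; and by the power-mean inequality the scaled p-norm increases monotonically toward the peak value as p grows, which is the smoothing principle exploited above.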
GLOPT – A Program for Constrained Global Optimization
 Developments in Global Optimization
, 1996
"... . GLOPT is a Fortran77 program for global minimization of a blockseparable objective function subject to bound constraints and blockseparable constraints. It finds a nearly globally optimal point that is near a true local minimizer. Unless there are several local minimizers that are nearly global, ..."
Abstract

Cited by 15 (7 self)
GLOPT is a Fortran 77 program for global minimization of a block-separable objective function subject to bound constraints and block-separable constraints. It finds a nearly globally optimal point that is near a true local minimizer. Unless there are several local minimizers that are nearly global, we thus find a good approximation to the global minimizer. GLOPT uses a branch and bound technique to split the problem recursively into subproblems that are either eliminated or reduced in their size. This is done by an extensive use of the block-separable structure of the optimization problem. In this paper we discuss a new reduction technique for boxes and new ways for generating feasible points of constrained nonlinear programs. These are implemented as the first stage of our GLOPT project. The current implementation of GLOPT uses neither derivatives nor simultaneous information about several constraints. Numerical results are already encouraging. Work on an extension using curvature inf...
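Block separability, the structural property GLOPT exploits, is simple to state: when f(x) = Σ_k f_k over disjoint variable blocks, minimization (or box bounding) over the full space decomposes into independent, much smaller per-block problems. A minimal grid-search sketch with hypothetical block functions, not GLOPT's actual reduction technique:

```python
import itertools

# f(x) = f1(x1, x2) + f2(x3, x4): two blocks sharing no variables, so the
# minimum of f over a box is the sum of the per-block minima. Searching
# block-by-block costs 2 * g^2 evaluations instead of g^4 on a g-point grid.
f1 = lambda x1, x2: (x1 - 0.3)**2 + x1 * x2
f2 = lambda x3, x4: abs(x3) + (x4 + 0.5)**2

g = [i / 10 - 1.0 for i in range(21)]            # grid on [-1, 1] per variable

# Joint search over the full 4-dimensional grid: 21**4 evaluations.
joint = min(f1(a, b) + f2(c, d) for a, b, c, d in itertools.product(g, repeat=4))

# Block-wise search: 2 * 21**2 evaluations, same value.
blockwise = min(f1(a, b) for a, b in itertools.product(g, repeat=2)) \
          + min(f2(c, d) for c, d in itertools.product(g, repeat=2))
```

The same decomposition applies to rigorous box bounds, which is why splitting and reducing subproblems block by block stays cheap even when the total variable count is large.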