Results 1–10 of 12
A Global Optimization Method, αBB, for General Twice-Differentiable Constrained NLPs: I. Theoretical Advances
, 1997
Cited by 52 (3 self)
In this paper, the deterministic global optimization algorithm αBB (α-based Branch and Bound) is presented. This algorithm offers mathematical guarantees of convergence to a point arbitrarily close to the global minimum for the large class of twice-differentiable NLPs. The key idea is the construction of a converging sequence of upper and lower bounds on the global minimum through convex relaxation of the original problem. This relaxation is obtained by (i) replacing all nonconvex terms of special structure (i.e., bilinear, trilinear, fractional, fractional trilinear, univariate concave) with customized tight convex lower bounding functions and (ii) utilizing α parameters, as defined by Maranas and Floudas (1994b), to generate valid convex underestimators for nonconvex terms of generic structure. In most cases, the calculation of appropriate values for the α parameters is a challenging task. A number of approaches are proposed, which rigorously generate a set of α par...
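For nonconvex terms of generic structure, the α-based underestimator the abstract refers to takes the standard αBB form; the notation below is sketched from the general literature, not copied from this paper:

```latex
% Convex underestimator of a generic nonconvex f over the box [x^L, x^U]
\mathcal{L}(x) \;=\; f(x) \;+\; \alpha \sum_{i=1}^{n} (x_i^L - x_i)(x_i^U - x_i),
\qquad
\alpha \;\ge\; \max\Bigl\{0,\; -\tfrac{1}{2}\,\min_{x^L \le x \le x^U}
\lambda_{\min}\bigl(\nabla^2 f(x)\bigr)\Bigr\}.
```

The quadratic perturbation is nonpositive on the box and vanishes at its vertices, so L underestimates f and matches it at corner points; the condition on α adds enough positive curvature (2αI) to dominate the most negative eigenvalue of the Hessian, making L convex.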
Optimal design of a CMOS op-amp via geometric programming
 IEEE Transactions on Computer-Aided Design
, 2001
Cited by 51 (10 self)
We describe a new method for determining component values and transistor dimensions for CMOS operational amplifiers (op-amps). We observe that a wide variety of design objectives and constraints have a special form, i.e., they are posynomial functions of the design variables. As a result the amplifier design problem can be expressed as a special form of optimization problem called geometric programming, for which very efficient global optimization methods have been developed. As a consequence we can efficiently determine globally optimal amplifier designs, or globally optimal tradeoffs among competing performance measures such as power, open-loop gain, and bandwidth. Our method therefore yields completely automated synthesis of (globally) optimal CMOS amplifiers, directly from specifications. In this paper we apply this method to a specific, widely used operational amplifier architecture, showing in detail how to formulate the design problem as a geometric program. We compute globally optimal tradeoff curves relating performance measures such as power dissipation, unity-gain bandwidth, and open-loop gain. We show how the method can be used to synthesize robust designs, i.e., designs guaranteed to meet the specifications for a ...
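The posynomial-to-convex transformation that geometric programming relies on can be illustrated on a toy problem; this example and its variable names are illustrative and have nothing to do with the op-amp formulation itself:

```python
import numpy as np
from scipy.optimize import minimize

# Toy geometric program (far smaller than any op-amp design problem):
#   minimize   x^-1 * y^-1
#   subject to x^2 + y^2 <= 1,  x, y > 0.
# The substitution u = log x, v = log y maps posynomials to convex
# log-sum-exp form, so a local solver reaches the global optimum.
def objective(z):
    u, v = z
    return np.exp(-u - v)

def constraint(z):
    u, v = z
    return 1.0 - (np.exp(2 * u) + np.exp(2 * v))  # feasible when >= 0

res = minimize(objective, x0=[-1.0, -1.0], method="SLSQP",
               constraints=[{"type": "ineq", "fun": constraint}])
x, y = np.exp(res.x)
print(round(x, 4), round(y, 4))  # symmetric optimum at x = y = 1/sqrt(2)
```

By symmetry the constraint is tight with x = y = 1/√2; after the log transform any KKT point of the convex problem is globally optimal, which is exactly why GP-based circuit sizing can claim global optimality.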
Global minimization using an Augmented Lagrangian method with variable lower-level constraints
, 2007
Cited by 21 (1 self)
A novel global optimization method based on an Augmented Lagrangian framework is introduced for continuous constrained nonlinear optimization problems. At each outer iteration k the method requires the εk-global minimization of the Augmented Lagrangian with simple constraints, where εk → ε. Global convergence to an ε-global minimizer of the original problem is proved. The subproblems are solved using the αBB method. Numerical experiments are presented.
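The outer loop of an augmented Lagrangian method can be sketched as follows. This is a minimal illustration on a convex toy problem, so a local solver stands in for the εk-global αBB subproblem solves the paper actually requires:

```python
import numpy as np
from scipy.optimize import minimize

# Schematic augmented Lagrangian outer loop for  min f(x)  s.t.  h(x) = 0.
f = lambda x: x[0] ** 2 + x[1] ** 2          # objective
h = lambda x: x[0] + x[1] - 1.0              # equality constraint

lam, rho = 0.0, 10.0                         # multiplier and penalty weight
x = np.zeros(2)
for k in range(20):
    L = lambda x: f(x) + lam * h(x) + 0.5 * rho * h(x) ** 2
    x = minimize(L, x).x                     # subproblem solve (local here)
    lam += rho * h(x)                        # first-order multiplier update
    rho = min(rho * 2.0, 1e4)                # tighten the penalty

print(np.round(x, 4))  # constrained minimizer of the toy problem is (0.5, 0.5)
```

The multiplier update drives h(x) → 0 without sending ρ to infinity; in the paper's setting each subproblem is instead solved to εk-global optimality, which is what yields convergence to an ε-global minimizer.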
Mixed-Integer Nonlinear Optimization in Process Synthesis
, 1998
Cited by 7 (0 self)
The use of networks allows the representation of a variety of important engineering problems. The treatment of a particular class of network applications, the process synthesis problem, is presented in this paper. Process synthesis seeks to systematically develop process flowsheets that convert raw materials into desired products. In recent years, the optimization approach to process synthesis has shown promise in tackling this challenge. It requires the development of a network of interconnected units, the process superstructure, that represents the alternative process flowsheets. The mathematical model of the superstructure has a mixed set of binary and continuous variables and results in a mixed-integer optimization model. Due to the nonlinearity of chemical models, these problems are generally classified as Mixed-Integer Nonlinear Programming (MINLP) problems. A number of local optimization algorithms, developed for the solution of this class of problems, are presented in this pap...
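The flavor of a superstructure MINLP can be shown on a deliberately tiny example. Real MINLP algorithms branch on the binaries rather than enumerate them, but with only two unit-selection binaries plain enumeration over NLP subproblems already exposes the structure; the model below is invented for illustration, not taken from the paper:

```python
from itertools import product
import numpy as np
from scipy.optimize import minimize

# Tiny illustrative MINLP: binaries y1, y2 decide which candidate units to
# build (fixed costs 5 and 8, capacities 4 and 6); continuous throughput x
# must meet a demand of 3 at minimum nonlinear operating cost.
#   minimize  5*y1 + 8*y2 + (x - 3)**2
#   s.t.      3 <= x <= 4*y1 + 6*y2,   y1, y2 in {0, 1}
best = (np.inf, None, None)
for y1, y2 in product([0, 1], repeat=2):
    cap = 4 * y1 + 6 * y2
    if cap < 3:                            # demand cannot be met: infeasible
        continue
    res = minimize(lambda x: (x[0] - 3) ** 2, [3.0],
                   bounds=[(3.0, cap)])    # continuous NLP subproblem
    cost = 5 * y1 + 8 * y2 + res.fun
    if cost < best[0]:
        best = (cost, (y1, y2), res.x[0])

print(best)  # cheapest design builds unit 1 only, at total cost 5
```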
On a modified subgradient algorithm for dual problems via sharp augmented Lagrangian
 Journal of Global Optimization
, 2006
Cited by 3 (1 self)
We study convergence properties of a modified subgradient algorithm applied to the dual problem defined by the sharp augmented Lagrangian. The primal problem we consider is nonconvex and nondifferentiable, with equality constraints. We obtain primal and dual convergence results, as well as a condition for the existence of a dual solution. Using a practical selection of the step-size parameters, we demonstrate the algorithm and its advantages on test problems, including an integer programming problem and an optimal control problem. Key words: nonconvex programming; nonsmooth optimization; augmented Lagrangian; sharp Lagrangian; subgradient optimization.
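One plausible shape of the dual iteration on the sharp augmented Lagrangian L(x, u, c) = f(x) + c·|h(x)| − u·h(x) can be sketched as below; the step-size selection analyzed in the paper is more careful than the fixed step used here, and the toy problem is illustrative:

```python
import numpy as np

# Schematic modified-subgradient step on the sharp augmented Lagrangian.
# Toy primal: min x^2 over the compact set [-2, 2], s.t. h(x) = x - 1 = 0.
f = lambda x: x ** 2
h = lambda x: x - 1.0                      # equality constraint

grid = np.linspace(-2.0, 2.0, 4001)       # grid search = global inner solve
u, c = 0.0, 0.0
for k in range(200):
    L = f(grid) + c * np.abs(h(grid)) - u * h(grid)
    x = grid[np.argmin(L)]                # minimizer of the sharp Lagrangian
    if abs(h(x)) < 1e-3:                  # feasible: dual solution reached
        break
    s = 0.1                               # fixed step (illustrative only)
    u -= s * h(x)                         # multiplier moves along -h(x_k)
    c += 2 * s * abs(h(x))                # sharp penalty weight grows
print(round(x, 3))
```

The inner minimization is nonsmooth (the |h| term), which is why the algorithm only needs a global minimizer of L over a compact set rather than derivatives; here the crude grid search plays that role.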
Optimization Framework for the Synthesis of Chemical Reactor Networks
, 1998
Cited by 2 (1 self)
The reactor network synthesis problem involves determining the type, size, and interconnections of the reactor units, optimal concentration and temperature profiles, and the heat load requirements of the process. A general framework is presented for the synthesis of optimal chemical reactor networks via an optimization approach. The possible design alternatives are represented via a process superstructure which includes continuous stirred tank reactors and cross flow reactors along with mixers and splitters that connect the units. The superstructure is mathematically modeled using differential and algebraic constraints and the resulting problem is formulated as an optimal control problem. The solution methodology for addressing the optimal control formulation involves the application of a control parameterization approach where the selected control variables are discretized in terms of time invariant parameters. The dynamic system is decoupled from the optimization and solved as a func...
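The control parameterization idea described above (discretize the controls into time-invariant parameters, integrate the dynamic system separately from the optimizer, and let an NLP solver adjust the parameters) can be sketched on a toy scalar system; the dynamics and weights here are invented for illustration and are not a reactor model:

```python
import numpy as np
from scipy.optimize import minimize

# Control parameterization: u(t) on [0, 1] becomes N piecewise-constant
# values; the ODE x' = u is integrated forward by explicit Euler, decoupled
# from the optimization.  Goal: drive x(1) toward 0 from x(0) = 1 while
# penalizing control effort.
N, steps = 5, 100
dt = 1.0 / steps

def simulate(u_pieces):
    x, cost = 1.0, 0.0
    for i in range(steps):
        u = u_pieces[i * N // steps]      # active piecewise-constant control
        x += dt * u                       # Euler step of x' = u
        cost += dt * 0.1 * u ** 2         # running control penalty
    return cost + x ** 2                  # terminal-state penalty

res = minimize(simulate, np.zeros(N))     # NLP over the N control parameters
print(np.round(res.x, 2), round(res.fun, 4))
```

Because the simulation is just a function of the N parameters, any gradient-based NLP solver applies; refining N trades accuracy of the control profile against the size of the resulting NLP.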
Global Nonlinear Programming with possible infeasibility and finite termination
, 2012
Cited by 1 (0 self)
In a recent paper, Birgin, Floudas and Martínez introduced an augmented Lagrangian method for global optimization. In their approach, augmented Lagrangian subproblems are solved using the αBB method, and convergence to global minimizers was obtained assuming feasibility of the original problem. In the present research, the algorithm mentioned above will be improved in several crucial aspects. On the one hand, feasibility of the problem will not be required. Possible infeasibility will be detected in finite time by the new algorithms, and optimal infeasibility results will be proved. On the other hand, finite termination results that guarantee optimality and/or feasibility up to any required precision will be provided. An adaptive modification in which subproblem tolerances depend on current feasibility and complementarity will also be given. The adaptive algorithm allows the augmented Lagrangian subproblems to be solved without requiring unnecessarily high precision in the intermediate steps of the method, which improves the overall efficiency. Experiments showing how the new algorithms and results relate to practical computations will be given.
An Inexact Modified Subgradient Algorithm for Nonconvex Optimization
, 2008
Cited by 1 (0 self)
We propose and analyze an inexact version of the modified subgradient (MSG) algorithm, which we call the IMSG algorithm, for nonsmooth and nonconvex optimization over a compact set. We prove that under an approximate, i.e. inexact, minimization of the sharp augmented Lagrangian, the main convergence properties of the MSG algorithm are preserved for the IMSG algorithm. Inexact minimization may make it possible to solve problems with less computational effort. We illustrate this through test problems, including an optimal bang–bang control problem, under several different inexactness schemes.
Interaction of Design and Control: Optimization with Dynamic Models
, 1997
Process design is usually approached by considering the steady-state performance of the process based on an economic objective. Only after the process design is determined are the operability aspects of the process considered. This sequential treatment of the process design problem neglects the fact that the dynamic controllability of the process is an inherent property of its design. This work considers a systematic approach where the interaction between the steady-state design and the dynamic controllability is analyzed by simultaneously considering both economic and controllability criteria. This method follows a process synthesis approach where a process superstructure is used to represent the set of structural alternatives. This superstructure is modeled mathematically by a set of differential and algebraic equations which contains both continuous and integer variables. Two objectives representing the steady-state design and dynamic controllability of the process are considered. T...
Global Optimization with Non-Analytical Constraints
This paper presents an approach for the global optimization of constrained nonlinear programming problems in which some of the constraints are non-analytical (non-factorable), defined by a computational model for which no explicit analytical representation is available.