## Global Optimization For Constrained Nonlinear Programming (2001)

Citations: 12 (2 self)

### BibTeX

```bibtex
@MISC{Wang01globaloptimization,
  author = {Tao Wang},
  title  = {Global Optimization For Constrained Nonlinear Programming},
  year   = {2001}
}
```


### Abstract

In this thesis, we develop constrained simulated annealing (CSA), a global optimization algorithm that asymptotically converges to constrained global minima (CGM_dn) with probability one, for solving discrete constrained nonlinear programming problems (NLPs). The algorithm is based on the necessary and sufficient condition for constrained local minima (CLM_dn) in the theory of discrete constrained optimization using Lagrange multipliers developed in our group. The theory proves the equivalence between the set of discrete saddle points and the set of CLM_dn, leading to the first-order necessary and sufficient condition for CLM_dn. To find ...
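The idea in the abstract can be illustrated with a toy sketch: probabilistic descent of a discrete Lagrangian in the variable subspace and probabilistic ascent in the multiplier subspace, under Metropolis acceptance. This is NOT the thesis's actual CSA procedure, only a hedged illustration; the names `csa`, `neighbors_x`, and `lam_step`, the geometric cooling, and the Lagrangian form are all assumptions of this sketch.

```python
import math
import random

def csa(L, neighbors_x, x0, lam0, lam_step=0.1,
        T0=1.0, alpha=0.995, iters=5000, seed=0):
    """Toy CSA-style sketch: descend L(x, lam) in x, ascend in lam,
    both moves filtered by Metropolis acceptance at temperature T."""
    rng = random.Random(seed)
    x, lam, T = x0, lam0, T0
    for _ in range(iters):
        if rng.random() < 0.5:
            # perturb x: accept downhill moves, uphill with prob e^(-d/T)
            x_new = rng.choice(neighbors_x(x))
            d = L(x_new, lam) - L(x, lam)
            if d <= 0 or rng.random() < math.exp(-d / T):
                x = x_new
        else:
            # perturb lam (kept nonnegative): ascent direction is favored
            lam_new = max(0.0, lam + rng.choice([-lam_step, lam_step]))
            d = L(x, lam_new) - L(x, lam)
            if d >= 0 or rng.random() < math.exp(d / T):
                lam = lam_new
        T *= alpha  # geometric cooling, an assumption of this sketch
    return x, lam
```

For a discrete problem such as minimizing x² subject to x = 1 over the integers, L(x, λ) = x² + λ|x − 1| is one possible Lagrangian to feed this sketch.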

### Citations

7342 | Genetic Algorithms and
- Goldberg, Holland
- 1988
Citation Context: ...methods focus too much on improving current solutions, they have small chances to overcome rugged search terrains and deep local minima and get stuck at local minima easily. Genetic algorithms (GAs) [85, 71, 127, 167, 140, 147, 94] maintain a population of points in every generation and use genetic operators, such as crossovers and mutations, to generate new points. These old and new points compete for survival in the ne...
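The GA mechanics described in this snippet (a population, crossover and mutation operators, old and new points competing for survival) can be sketched as a toy real-coded GA. This is a generic sketch, not the specific algorithms of the cited references; `ga_minimize` and all its parameter values are invented for illustration.

```python
import random

def ga_minimize(f, bounds, pop_size=30, gens=60,
                cx_rate=0.9, mut_rate=0.1, seed=0):
    """Toy real-coded GA: blend crossover plus Gaussian mutation,
    with truncation survival among parents and offspring together."""
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [[rng.uniform(lo, hi) for _ in range(2)] for _ in range(pop_size)]
    for _ in range(gens):
        children = []
        for _ in range(pop_size):
            a, b = rng.sample(pop, 2)
            if rng.random() < cx_rate:            # blend crossover
                w = rng.random()
                child = [w * ai + (1 - w) * bi for ai, bi in zip(a, b)]
            else:
                child = list(a)
            if rng.random() < mut_rate:           # Gaussian mutation
                i = rng.randrange(len(child))
                child[i] += rng.gauss(0.0, 0.1 * (hi - lo))
            children.append(child)
        # old and new points compete for survival in the next generation
        pop = sorted(pop + children, key=f)[:pop_size]
    return pop[0]
```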

3529 | Optimization by simulated annealing
- Kirkpatrick, Gelatt, et al.
- 1983
Citation Context: ...constrained global optimization today and complements simulated annealing (SA) in nonlinear unconstrained global optimization. In the following, we first outline the differences between traditional SA [1, 118] and CSA developed in this thesis. Targeted problems: SA was developed for solving unconstrained NLPs, whereas CSA is for solving constrained NLPs. In addition to minimizing the objective function f(...

1966 | Genetic Algorithms + Data Structures = Evolution Programs
- Michalewicz
- 1992
Citation Context: ...ence, reduce oscillations, and greatly speed up convergence. ... Active research in the past four decades has produced a variety of methods for solving general constrained NLPs [185, 106, 69, 91, 127]. They fall into one of two general formulations, direct solution or transformation-based. The former aims to directly solve constrained NLP (1.1) by searching its feasible regions, while the latter f...

1894 | Numerical Optimization
- Nocedal, Wright
- 2000
Citation Context: ...differentiable or non-differentiable. In some applications, variables are restricted to take prespecified values. According to the values that variable x takes, we have three classes of constrained NLPs [141]: Discrete problems: Variable x is a vector of discrete variables, where component x_i takes discrete and finite values, such as integers. Although variable space X at this time is finite (because v...

1579 | Orthonormal bases of compactly supported wavelets
- Daubechies
Citation Context: ...ing, filter bank design [188, 66, 103] tries to achieve the PR condition and approximate an ideal filter at every frequency. But in wavelet theory, filter design emphasizes smoothness near ω = 0 or ω = π [12, 17, 52, 47]. Here, we formulate regularity as simple equations in (6.6) for 2-order regularity. Coding Gain: Coding gain measures energy compaction, and high coding gains correlate consistently with high...

1371 | A simplex method for function minimization
- Nelder, Mead
- 1965
Citation Context: ...y from one point to another, making it difficult for gradient-based methods to converge. It is also difficult for these methods to get out of deep local minima after getting stuck in them. Simplex methods [138, 155] utilize a set of sample points to form a simplex, and iterate such a simplex by the reflection, expansion, and contraction operators until its volume is sufficiently small. Every time, the worst point i...
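The reflection, expansion, and contraction loop described in this snippet can be sketched as a minimal textbook Nelder-Mead. This is not the exact method of the cited references; the function name, the shrink step, and the coefficient values are standard defaults chosen for illustration.

```python
def nelder_mead(f, simplex, tol=1e-10, max_iter=500,
                alpha=1.0, gamma=2.0, rho=0.5, sigma=0.5):
    """Minimal Nelder-Mead: iterate reflection, expansion, contraction,
    and shrink until the function spread over the simplex is tiny."""
    n = len(simplex) - 1
    for _ in range(max_iter):
        simplex.sort(key=f)
        best, worst = simplex[0], simplex[-1]
        if abs(f(worst) - f(best)) < tol:
            break
        # centroid of all points except the worst
        cen = [sum(p[i] for p in simplex[:-1]) / n for i in range(n)]
        refl = [cen[i] + alpha * (cen[i] - worst[i]) for i in range(n)]
        if f(refl) < f(best):
            exp = [cen[i] + gamma * (refl[i] - cen[i]) for i in range(n)]
            simplex[-1] = exp if f(exp) < f(refl) else refl
        elif f(refl) < f(simplex[-2]):
            simplex[-1] = refl
        else:
            con = [cen[i] + rho * (worst[i] - cen[i]) for i in range(n)]
            if f(con) < f(worst):
                simplex[-1] = con
            else:  # shrink every point toward the best one
                simplex = [best] + [
                    [best[i] + sigma * (p[i] - best[i]) for i in range(n)]
                    for p in simplex[1:]
                ]
    simplex.sort(key=f)
    return simplex[0]
```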

1248 | Embedded image coding using zerotrees of wavelet coefficients
- Shapiro
- 1993
Citation Context: ...basic idea is to exploit the relationship of these coefficients across subbands; for example, some significant coefficients are similar in shape and location across different subbands. Embedded zero-tree [176] combines this idea with the notion of coding zeros jointly by encoding and decoding an image progressively, whereas set partitioning in hierarchical trees (SPIHT) [168] further enhances its implementa...

1217 | Monte Carlo sampling methods using Markov chains and their applications
- Hastings
Citation Context: ...g to the Metropolis acceptance probability (3.11). Besides the Metropolis rule, we also evaluate three other possible acceptance rules studied in SA: the logistic acceptance rule [1, 164], Hastings' rule [95, 78], and Tsallis' rule [8, 93]. All these acceptance rules lead to asymptotic convergence [163], although they differ in solution quality when applied under a finite cooling schedule. Because CSA carries...
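The Metropolis rule mentioned in this snippet (equation (3.11) itself is not reproduced on this page) has the standard form below; the function name and the `rng` hook are invented for the sketch.

```python
import math
import random

def metropolis_accept(delta, T, rng=random.random):
    """Metropolis rule: always accept improvements (delta <= 0);
    accept an uphill move of size delta > 0 with prob exp(-delta / T)."""
    if delta <= 0:
        return True
    return rng() < math.exp(-delta / T)
```

At high temperature almost any uphill move is accepted; as T shrinks, the rule degenerates into pure descent.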

1040 | Linear and Nonlinear Programming
- Luenberger
- 1984
Citation Context: ...hen I_d ∪ I_c = {1, 2, ..., n} and I_d ∩ I_c = ∅. Variable space X is infinite because of continuous variables. 1.2 Basic Concepts: To characterize the solutions sought, we introduce some basic concepts [31, 125, 145, 69] on neighborhoods, feasible solutions, and constrained local and global minima here. Definition 1.1: N(x), the neighborhood of point x in variable space X, is a set of points x′ ∈ X such that x′...

882 | A new, fast and efficient image codec based on set partitioning in hierarchical trees - Said, Pearlman - 1996

871 | The JPEG Still Picture Compression Standard
- Wallace
- 1991
Citation Context: ...ransforms can be either linear or nonlinear, but are usually linear due to their invertibility. Two commonly used linear transforms are the discrete cosine transform (DCT) and the subband transform. JPEG [207], as an image coding standard, partitions images into nonoverlapping blocks. These image blocks are then transformed by DCT independently. When the transform coefficients are coded under low-bit-rate me...

851 | Multirate Systems and Filter Banks
- Vaidyanathan
- 1993
Citation Context: ...ed peak signal-to-noise ratio (PSNR) for compressed images. Because it is very time-consuming to directly compute this image-dependent PSNR, many heuristic objectives have been used in the literature [12, 54, 66, 188]. These include coding gain, frequency selectivity, perfect reconstruction (PR), linear phase (LP), and wavelet regularity. Hence, filter designs become multi-objective optimization problems. We study...

814 | Ant system: optimization by a colony of cooperating agents
- Dorigo, Maniezzo, et al.
- 1996
Citation Context: ...points. Stochastic global-search approaches consist of random multistarts [171, 169, 96, 177], adaptive multi-starts [34], tabu search [82, 25], guided local search (GLS) [195], ant colony system [60, 59], population-based incremental learning [20] and its enhancements [36, 21, 22], as well as Bayesian methods [134, 196, 135]. Multi-start [171, 169, 96, 177] restarts a search by randomly generating ne...

710 | Image coding using wavelet transform
- Antonini, Barlaud, et al.
- 1992
Citation Context: ...age, image coding (or compression) has been a subject of great interest in both academia and industry. Subband/wavelet image coding has recently become a cutting-edge technology for image compression [12, 54, 97, 192]. In this chapter, we address the filter design issue in subband image coding, whose goal is to achieve a better objective measure called peak signal-to-noise ratio (PSNR) for compressed images. Because i...

633 | Ant colony system: a cooperative learning approach to the travelling salesman problem
- Dorigo, Gambardella
- 1997
Citation Context: ...points. Stochastic global-search approaches consist of random multistarts [171, 169, 96, 177], adaptive multi-starts [34], tabu search [82, 25], guided local search (GLS) [195], ant colony system [60, 59], population-based incremental learning [20] and its enhancements [36, 21, 22], as well as Bayesian methods [134, 196, 135]. Multi-start [171, 169, 96, 177] restarts a search by randomly generating ne...

601 | Tabu search
- Glover, Laguna
- 1993
Citation Context: ...ion about complicated search terrains and leads to poor starting points. Stochastic global-search approaches consist of random multistarts [171, 169, 96, 177], adaptive multi-starts [34], tabu search [82, 25], guided local search (GLS) [195], ant colony system [60, 59], population-based incremental learning [20] and its enhancements [36, 21, 22], as well as Bayesian methods [134, 196, 135]. Multi-start...

496 | Constrained Optimization and Lagrange Multiplier Methods
- Bertsekas
- 1982
Citation Context: ...hen I_d ∪ I_c = {1, 2, ..., n} and I_d ∩ I_c = ∅. Variable space X is infinite because of continuous variables. 1.2 Basic Concepts: To characterize the solutions sought, we introduce some basic concepts [31, 125, 145, 69] on neighborhoods, feasible solutions, and constrained local and global minima here. Definition 1.1: N(x), the neighborhood of point x in variable space X, is a set of points x′ ∈ X such that x′...

474 | Wavelets and Subband Coding
- Vetterli, Kovačević
- 1995
Citation Context: ...age, image coding (or compression) has been a subject of great interest in both academia and industry. Subband/wavelet image coding has recently become a cutting-edge technology for image compression [12, 54, 97, 192]. In this chapter, we address the filter design issue in subband image coding, whose goal is to achieve a better objective measure called peak signal-to-noise ratio (PSNR) for compressed images. Because i...

465 | Global Optimization Using Interval Analysis
- Hansen
- 1992
Citation Context: ...ence, reduce oscillations, and greatly speed up convergence. ... Active research in the past four decades has produced a variety of methods for solving general constrained NLPs [185, 106, 69, 91, 127]. They fall into one of two general formulations, direct solution or transformation-based. The former aims to directly solve constrained NLP (1.1) by searching its feasible regions, while the latter f...

465 | Primal-Dual Interior-Point Methods
- Wright
- 1997
Citation Context: ...ible points that may be as difficult as solving the original problem when there are nonlinear constraints. They only work for inequality constraints but not equality constraints. Interior-point methods [214, 215, 141] extend the barrier methods by keeping non-negative variables strictly positive while allowing the search outside feasible regions. This gives them freedom to start and proceed in infeasible regions...

393 | Random Perturbations of Dynamical Systems
- Freidlin, Wentzell
- 1984
Citation Context: ...where V(i, j) = (f(j) − f(i))⁺ used in SA, we have the local balance equation q(i, j)Q_T(i, j) = q(j, i)Q_T(j, i). In the following, we quote the notion of A-graph and virtual energy as defined in [72, 186, 187]. Definition 3.2 (A-Graph; Definition 2.4 in [72, 186, 187]): Let A ⊆ E. A set g of arrows i → j in A^c × E is an A-graph, where j ∈ N(i), if: a) for each i ∈ A^c, there exists a unique j ∈ E such that...

390 | Differential Evolution - A Simple and Efficient Heuristic for Global Optimization over Continuous Spaces - Storn, Price - 1997

326 | An overview of evolutionary algorithms for parameter optimization - Back, Schwefel - 1993

315 | Multiple Criteria Optimization: Theory, Computation, and Application
- Steuer
Citation Context: ...tural optimization, engineering design, computer-aided design (CAD) for VLSI, database design and processing, nuclear power plant design and operation, mechanical design, and chemical process control [68, 145, 180]. Due to the availability of many unconstrained optimization algorithms, many real applications that are inherently nonlinear and constrained have been solved in various unconstrained forms. Optim...

313 | A thermodynamical approach to the traveling salesman problem: An efficient simulation algorithm - Černý - 1985

248 | Global Optimization: Deterministic Approaches (3rd edition)
- Horst, Tuy
- 2003
Citation Context: ...d carries its original meaning. However, one may also choose the neighborhood to contain "far away" points. For a continuous problem, neighborhood N_cn(x) is well-defined and application-independent [31, 125, 106]. It includes those points that are sufficiently close to x, i.e., N_cn(x) is a set of points x′ such that ||x′ − x|| ≤ ε for some small ε > 0. For a mixed-integer problem, neighborhood N_mn(x) can be de...

226 | A survey of evolution strategies
- Back, Hoffmeister, et al.
- 1991
Citation Context: ...le region may lead to poor solutions. In addition, it is very difficult or expensive to project a trajectory into feasible regions for nonlinear constraints. Global Search: Rejecting/discarding methods [110, 14, 160, 154] are stochastic procedures. They iteratively generate random points and only accept feasible points, while dropping infeasible points during their search. Although they are simple and easy to implemen...

225 | Evolutionary Algorithms for Constrained Parameter Optimization Problems. Evolutionary Computation 4(1):1-32, 1996
- Michalewicz, Schoenauer
Citation Context: ...each of which has its own penalty values. This method is very problem-dependent and cannot be generalized to other optimization problems. Generation-based dynamic penalties [112], annealing penalties [130], and adaptive penalties [27, 89, 151] can be viewed as approximate implementations of the dynamic-penalty formulation (2.2). Although they differ in their ways of modifying the penalties, all of them adjus...
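The penalty transform these methods approximate (formulation (2.2) itself is not shown on this page) can be sketched generically: fold constraint violations into the objective with a weight that dynamic-penalty methods grow over time. `make_penalized` and the weight `lam` are invented names, and the exact penalty form in the thesis may differ.

```python
def make_penalized(f, ineq=(), eq=(), lam=1.0):
    """Return the unconstrained surrogate
    L(x) = f(x) + lam * (sum_i max(0, g_i(x)) + sum_j |h_j(x)|)
    for g_i(x) <= 0 and h_j(x) = 0; dynamic-penalty methods re-solve
    this surrogate while increasing lam between rounds."""
    def L(x):
        violation = (sum(max(0.0, g(x)) for g in ineq)
                     + sum(abs(h(x)) for h in eq))
        return f(x) + lam * violation
    return L
```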

224 | The Reactive Tabu Search
- Battiti, Tecchiolli
- 1994
Citation Context: ...ion about complicated search terrains and leads to poor starting points. Stochastic global-search approaches consist of random multistarts [171, 169, 96, 177], adaptive multi-starts [34], tabu search [82, 25], guided local search (GLS) [195], ant colony system [60, 59], population-based incremental learning [20] and its enhancements [36, 21, 22], as well as Bayesian methods [134, 196, 135]. Multi-start...

220 | Partitioning Procedures for Solving Mixed-Variables Programming Problems
- Benders
- 1962
Citation Context: ...at, after fixing a subset of the variables, the resulting subproblem is convex and can be solved easily. There are three methods to implement this approach. Generalized Benders decomposition (GBD) [67, 77, 29] computes at each iteration an upper bound on the solution sought by solving a primal problem and a lower bound on a master problem. The primal problem corresponds to the original problem with fixed d...

219 | Cooling schedules for optimal annealing
- Hajek
- 1988
Citation Context: ...P_T(i, j) if i′ ≠ i, 0 otherwise (3.3), and the corresponding transition matrix is P_T = [P_T(i, i′)]. It is assumed that, by choosing neighborhood S_i properly, the Markov chain is irreducible [1, 90], meaning that for each pair of solutions i and j, there is a positive probability of reaching j from i in a finite number of steps. Consider the sequence of temperatures {T_k, k = 0, 1, 2, ...}, where...

211 | Evolutionary programming made faster
- Yao, Liu, et al.
Citation Context: ...vector σ, where σ_i denotes the maximum possible perturbation along dimension i. 4.2 Generation of Trial Points: Three general distributions used in SA to generate trial points include uniform [49], Gaussian [45, 222], and Cauchy [51]. Examples of other methods are logarithmic explorations [50] and tree annealing [33, 32] that organize neighborhoods in a tree. Such methods only work well for problems with specific...
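The three trial-point distributions listed in this snippet can be sketched as perturbation generators around the current point. The function names and per-dimension scale vectors are invented for illustration; the Cauchy sampler uses the standard inverse-CDF construction tan(π(u − 1/2)).

```python
import math
import random

def uniform_trial(x, step):
    """Uniform perturbation: each coordinate in [x_i - s_i, x_i + s_i]."""
    return [xi + random.uniform(-s, s) for xi, s in zip(x, step)]

def gaussian_trial(x, sigma):
    """Gaussian perturbation with per-dimension standard deviation."""
    return [xi + random.gauss(0.0, s) for xi, s in zip(x, sigma)]

def cauchy_trial(x, scale):
    """Heavy-tailed Cauchy perturbation: occasional long jumps help
    the search escape deep local minima."""
    return [xi + s * math.tan(math.pi * (random.random() - 0.5))
            for xi, s in zip(x, scale)]
```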

208 | Global Optimization
- Torn, Zilinskas
- 1989
Citation Context: ...ence, reduce oscillations, and greatly speed up convergence. ... Active research in the past four decades has produced a variety of methods for solving general constrained NLPs [185, 106, 69, 91, 127]. They fall into one of two general formulations, direct solution or transformation-based. The former aims to directly solve constrained NLP (1.1) by searching its feasible regions, while the latter f...

195 | Modelling a genetic algorithm with Markov chains
- Nix, Vose
- 1992
Citation Context: ...methods focus too much on improving current solutions, they have small chances to overcome rugged search terrains and deep local minima and get stuck at local minima easily. Genetic algorithms (GAs) [85, 71, 127, 167, 140, 147, 94] maintain a population of points in every generation and use genetic operators, such as crossovers and mutations, to generate new points. These old and new points compete for survival in the ne...

182 | An introduction to simulated evolutionary optimization
- Fogel
- 1994
Citation Context: ...methods focus too much on improving current solutions, they have small chances to overcome rugged search terrains and deep local minima and get stuck at local minima easily. Genetic algorithms (GAs) [85, 71, 127, 167, 140, 147, 94] maintain a population of points in every generation and use genetic operators, such as crossovers and mutations, to generate new points. These old and new points compete for survival in the ne...

181 | Very fast simulated re-annealing
- Ingber
- 1989
Citation Context: ...There are many cooling schedules developed for SA. These include logarithmic annealing schedules [1, 90, 28], schedules inversely proportional to annealing steps [183], simulated quenching scheduling [109, 111], geometric cooling schedules [?, 153], constant annealing [30], arithmetic annealing [136, 159], polynomial-time cooling [2, 1], and adaptive temperature scheduling based on the acceptance ratio of bad moves...
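Two of the cooling schedules named in this snippet can be written down directly. The constants are illustrative; the logarithmic form follows the Hajek-style schedule only in spirit, with the requirement on c stated as an assumption.

```python
import math

def geometric_schedule(T0, alpha, k):
    """Geometric cooling: T_k = T0 * alpha**k with 0 < alpha < 1."""
    return T0 * alpha ** k

def logarithmic_schedule(c, k):
    """Logarithmic annealing: T_k = c / log(k + 2). With c at least the
    largest barrier depth, SA converges asymptotically (Hajek-style)."""
    return c / math.log(k + 2)
```

Geometric schedules cool fast enough for practical runs; logarithmic schedules are the ones backed by asymptotic-convergence guarantees but are far too slow to follow exactly in practice.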

174 | Global Optimization
- Horst, Tuy
- 1993
Citation Context: ...starting from a starting point x(0) in the search space, a search procedure generates a sequence of iterative points x(1), x(2), ..., x(k) in the search space until some stopping conditions hold. The procedure is called deterministic [105, 225] if x(k) is generated deterministically; otherwise, it is called probabilistic or stochastic. Finding the CGM of (1.1) is challenging as well as difficult. First, f(x), h(x), and g(x) may be non-...

173 | Convergence analysis of canonical genetic algorithm
- Rudolph
- 1994

164 | Minimizing multimodal functions of continuous variables with the "Simulated Annealing" algorithm
- Corana, Marchesi, et al.
- 1987
Citation Context: ...ed in SA allows trial points to differ from the current point in one variable, because it has a higher probability of accepting trial points than those neighborhoods with more than one variable changed [49]. Here we adopt the same strategy in both the variable and the Lagrange-multiplier subspaces. In our implementation, we choose a simple neighborhood N¹_dn(x) as the set of points x′ that differ from...

152 | Space-frequency Quantization for Wavelet Image Coding
- Xiong, Ramchandran, et al.
- 1997
Citation Context: ...n image progressively, whereas set partitioning in hierarchical trees (SPIHT) [168] further enhances its implementation. The rate-distortion of bit allocation can also be incorporated into a zero-tree [220] to get better performance. Instead of using a priori knowledge about transform coefficients, an optimized significance tree (or map) can be obtained from a set of training images [53]. Note that al...

144 | Wavelet filter evaluation for image compression
- Villasenor, Belzer, et al.
- 1995
Citation Context: ...age, any tree structure, and any quantization method, and (c) that if filter design criteria are carefully chosen, the filters will perform well in general, such as Daubechies' 9/7 wavelet filter [12, 193]. The disadvantage is that such filters may not perform the best for any given image. The second design strategy is to optimize a design by jointly designing subband filters, splitting tree structure,...

140 | An outer-approximation algorithm for a class of mixed-integer nonlinear programs
- Duran, Grossmann
- 1986
Citation Context: ...y applicable to a class of MINLPs with restrictions on their variable space, such as a nonempty and convex continuous subspace with convex objective and constraint functions. Outer approximation (OA) [62, 61] solves constrained MINLPs by a sequence of approximations where each approximated subproblem contains the original feasible region. OA is similar to GBD except that the master problem is formulated b...

136 |
Some Guidelines for Genetic Algorithms with Penalty Functions
- Richardson, Palmer, et al.
- 1989
Citation Context: ...le region may lead to poor solutions. In addition, it is very difficult or expensive to project a trajectory into feasible regions for nonlinear constraints. Global Search: Rejecting/discarding methods [110, 14, 160, 154] are stochastic procedures. They iteratively generate random points and only accept feasible points, while dropping infeasible points during their search. Although they are simple and easy to implemen...

130 | Mimic: Finding optima by estimating probability densities
- Bonet, Jr, et al.
- 1997
Citation Context: ...[171, 169, 96, 177], adaptive multi-starts [34], tabu search [82, 25], guided local search (GLS) [195], ant colony system [60, 59], population-based incremental learning [20] and its enhancements [36, 21, 22], as well as Bayesian methods [134, 196, 135]. Multi-start [171, 169, 96, 177] restarts a search by randomly generating new starting points after it gets trapped at local minima. This method may not f...

125 | Nonlinear Optimization: Complexity Issues
- Vavasis
- 1991
Citation Context: ...and methods developed for solving continuous problems. Last, there may be a large number of CLM, trapping trajectories that only utilize local information. Constrained global optimization is NP-hard [105, 191], because it takes exponential time to verify whether a feasible solution is optimal or not for a general constrained NLP. This is true even for quadratic programming problems with linear or box const...

123 | CUTEr, a constrained and unconstrained testing environment, revisited
- Toint
- 2003
Citation Context: ...We have chosen three sets of continuous constrained benchmarks: (1) ten problems G1-G10 [133, 119], (2) a collection of optimization benchmarks [68], and (3) selected problems from CUTE [37], a constrained and unconstrained testing environment. These problems have objective functions of various types (linear, quadratic, cubic, polynomial, and nonlinear) and linear/nonlinear constraints o...

122 | Generalized Benders decomposition - Geoffrion - 1972

121 | A collection of test problems for constrained global optimization algorithms
- Floudas, Pardalos
- 1990
Citation Context: ...Performance comparison of DONLP2 (SQP) and CSA in solving derived discrete constrained NLPs from Floudas and Pardalos' continuous constrained benchmarks [68]. CSA is based on (Cauchy_1, S-uniform, M) and α = 0.8. All times are in seconds on a Pentium-III 500-MHz computer running Solaris 7. '-' stands for no feasible solution found for the specified solut...

120 | ODEPACK, a systematized collection of ODE solvers
- Hindmarsh
- 1983
Citation Context: ...(d/dt)x(t) = −∇_x L_o(x(t), λ(t)), (d/dt)λ(t) = ∇_λ L_o(x(t), λ(t)) (A.6). Starting from an initial point (x(t=0), λ(t=0)), we solve (A.6) using the ordinary differential equation solver LSODE [98] and obtain a search trajectory (x(t), λ(t)). When a CLM_cn is on the boundary of a feasible region, the dynamic equations (A.6) approach it from both the inside and outside of the feasible region. We...

115 | Neural Networks for Optimization and Signal Processing
- Cichocki, Unbehauen
- 1994
Citation Context: ...ent conditions is a subset of the set of saddle points. Global Search: Various Lagrangian methods aim to locate CLM_cn based on solving the first-order necessary conditions (2.6). First-order methods [125, 145, 69, 224, 46, 189] do both gradient descent in the original-variable subspace and gradient ascent in the Lagrange-multiplier subspace, and can be written as a dynamic system that consists of a set of ordinary differenti...
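The descent/ascent dynamics described in this snippet can be sketched for a single equality constraint, min f(x) subject to h(x) = 0 with L(x, λ) = f(x) + λ·h(x). The function name, the forward-Euler discretization, and the step size are choices made for this illustration, not the methods of the cited references.

```python
def lagrangian_descent_ascent(f_grad, h, h_grad, x0, lam0,
                              eta=0.01, steps=5000):
    """First-order Lagrangian sketch: gradient descent on x,
    gradient ascent on the multiplier lam (dL/dlam = h(x))."""
    x, lam = x0, lam0
    for _ in range(steps):
        gx = f_grad(x) + lam * h_grad(x)   # dL/dx
        x = x - eta * gx                   # descent in the variable
        lam = lam + eta * h(x)             # ascent in the multiplier
    return x, lam
```

For min x² subject to x − 1 = 0, the saddle point is x* = 1 with λ* = −2, and the iteration settles there for a small enough step size.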