Results 11–20 of 91
A Box-Constrained Optimization Algorithm With Negative Curvature Directions and Spectral Projected Gradients
, 2001
Abstract

Cited by 28 (5 self)
A practical algorithm for box-constrained optimization is introduced. The algorithm combines an active-set strategy with spectral projected gradient iterations. In the interior of each face a strategy that deals efficiently with negative curvature is employed. Global convergence results are given. Numerical results are presented. Keywords: box constrained minimization, active set methods, spectral projected gradients, dogleg path methods. AMS Subject Classification: 49M07, 49M10, 65K, 90C06, 90C20.
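The basic move behind spectral projected gradient iterations, step along the negative gradient and project the trial point back onto the box, can be sketched in a few lines. This is a minimal illustration with a fixed steplength and hypothetical function names; the actual method chooses the steplength spectrally and adds a nonmonotone line search.

```python
import numpy as np

def project_box(x, lo, hi):
    """Project x onto the box [lo, hi], componentwise."""
    return np.clip(x, lo, hi)

def spg_trial_point(x, grad, alpha, lo, hi):
    """One projected-gradient trial point: step along the negative
    gradient, then project back onto the feasible box."""
    return project_box(x - alpha * grad, lo, hi)

# Minimize f(x) = 0.5 * ||x - c||^2 over the box [0, 1]^2, with c outside it.
c = np.array([2.0, -1.0])
x = np.array([0.5, 0.5])
for _ in range(50):
    g = x - c                        # gradient of f at x
    x = spg_trial_point(x, g, 1.0, 0.0, 1.0)
# x converges to the projection of c onto the box.
```

Note that the fixed point of the iteration is exactly a stationary point of the box-constrained problem, which is why projected-gradient residuals are a natural stopping test.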
A new active set algorithm for box constrained optimization
 SIAM Journal on Optimization
, 2006
Abstract

Cited by 26 (6 self)
Abstract. An active set algorithm (ASA) for box constrained optimization is developed. The algorithm consists of a nonmonotone gradient projection step, an unconstrained optimization step, and a set of rules for branching between the two steps. Global convergence to a stationary point is established. For a nondegenerate stationary point, the algorithm eventually reduces to unconstrained optimization without restarts. Similarly, for a degenerate stationary point, where the strong second-order sufficient optimality condition holds, the algorithm eventually reduces to unconstrained optimization without restarts. A specific implementation of the ASA is given which exploits the recently developed cyclic Barzilai–Borwein (CBB) algorithm for the gradient projection step and the recently developed conjugate gradient algorithm CG DESCENT for unconstrained optimization. Numerical experiments are presented using box constrained problems in the CUTEr and MINPACK2 test problem libraries. Key words. nonmonotone gradient projection, box constrained optimization, active set algorithm,
On the Barzilai-Borwein method
, 2001
Abstract

Cited by 21 (1 self)
A review is given of the underlying theory and recent developments in regard to the Barzilai-Borwein steepest descent method for large scale unconstrained optimization. One aim is to assess why the method seems to be comparable in practical efficiency to conjugate gradient methods. The importance of using a nonmonotone line search is stressed, although some suggestions are made as to why the modification proposed by Raydan [22] often does not perform well for an ill-conditioned problem. Extensions for box constraints are discussed. A number of interesting open questions are put forward. Keywords: Barzilai-Borwein method, steepest descent, elliptic systems, unconstrained optimization.
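The steplength that makes the method "spectral" is cheap to compute from two successive iterates and gradients: with s = x_k - x_{k-1} and y = g_k - g_{k-1}, BB1 takes alpha_k = (s^T s)/(s^T y). A minimal sketch on a convex quadratic follows (function name chosen here for illustration; a practical solver would add the nonmonotone safeguard the review stresses):

```python
import numpy as np

def bb_gradient_method(grad, x0, n_iter=200):
    """Barzilai-Borwein gradient method with the BB1 steplength
    alpha_k = (s^T s) / (s^T y), where s = x_k - x_{k-1} and
    y = g_k - g_{k-1}. No line search: every step is accepted as-is."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    alpha = 1.0                       # initial steplength (a common default)
    for _ in range(n_iter):
        x_new = x - alpha * g
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        sty = s @ y
        if sty > 0:                   # safeguard: keep alpha when s^T y <= 0
            alpha = (s @ s) / sty
        x, g = x_new, g_new
    return x

# Quadratic f(x) = 0.5 x^T A x with condition number 100; minimizer is 0.
A = np.diag([1.0, 10.0, 100.0])
x_star = bb_gradient_method(lambda x: A @ x, [1.0, 1.0, 1.0])
```

On quadratics the objective typically decreases nonmonotonically from step to step, which is why nonmonotone (rather than Armijo-monotone) line searches pair naturally with this steplength.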
Restricted optimization: a clue to a fast and accurate implementation of the Common Reflection Surface Stack method
 JOURNAL OF APPLIED GEOPHYSICS
, 1999
Abstract

Cited by 21 (8 self)
For a fixed, central ray in an isotropic elastic or acoustic medium, traveltime moveouts of rays in its vicinity can be described in terms of a certain number of parameters that refer to the central ray only. The determination of these parameters from multi-coverage data leads to very powerful algorithms that can be used for several imaging and inversion processes. Assuming two-dimensional propagation, the traveltime expressions depend on three parameters directly related to the geometry of the unknown model in the vicinity of the central ray. We present a new method to extract these parameters by coherency analysis applied directly to the data. It uses (a) fast one-parameter searches on different sections extracted from the multi-coverage data to derive initial values of the section parameters, and (b) the application of a recently introduced Spectral Projected Gradient optimization algorithm for the final parameter estimation. Application of the method to a synthetic example shows an excellent performance of the algorithm both in accuracy and efficiency. The results obtained so far indicate that the algorithm may be a feasible option to solve the corresponding, harder, full three-dimensional problem, in which eight parameters, instead of three, are required.
Gradient projection methods for quadratic programs and applications in training support vector machines
 Optim. Methods Softw
, 2005
Abstract

Cited by 20 (4 self)
Gradient projection methods based on the Barzilai-Borwein spectral steplength choices are considered for quadratic programming problems with simple constraints. Well-known nonmonotone spectral projected gradient methods and variable projection methods are discussed. For both approaches the behavior of different combinations of the two spectral steplengths is investigated. A new adaptive steplength alternating rule is proposed that becomes the basis for a generalized version of the variable projection method (GVPM). Convergence results are given for the proposed approach and its effectiveness is shown by means of an extensive computational study on several test problems, including the special quadratic programs arising in training support vector machines. Finally, the GVPM behavior as inner QP solver in decomposition techniques for large-scale support vector machines is also evaluated.
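The two spectral steplengths being combined are BB1 = (s^T s)/(s^T y) and BB2 = (s^T y)/(y^T y). The sketch below applies them to a box-constrained quadratic program with a plain strict alternation; the paper's contribution is an adaptive switching rule, which this toy version (hypothetical names, simplified logic) does not implement:

```python
import numpy as np

def box_qp_projected_bb(A, b, lo, hi, x0, n_iter=200):
    """Projected gradient for min 0.5 x^T A x - b^T x, lo <= x <= hi,
    strictly alternating the two Barzilai-Borwein steplengths:
    BB1 = s^T s / s^T y and BB2 = s^T y / y^T y."""
    x = np.clip(np.asarray(x0, dtype=float), lo, hi)
    g = A @ x - b
    alpha = 1.0
    for k in range(n_iter):
        x_new = np.clip(x - alpha * g, lo, hi)    # projected step
        g_new = A @ x_new - b
        s, y = x_new - x, g_new - g
        if s @ y > 0:
            alpha = (s @ s) / (s @ y) if k % 2 == 0 else (s @ y) / (y @ y)
        x, g = x_new, g_new
    return x

# min 0.5 x^T diag(1, 2) x - (3, 3)^T x over [0, 1]^2; the solution is (1, 1).
A = np.diag([1.0, 2.0])
b = np.array([3.0, 3.0])
sol = box_qp_projected_bb(A, b, 0.0, 1.0, np.zeros(2))
```

For SVM training the constraints also include a single linear equality, so the projection is onto a box intersected with a hyperplane rather than the plain box shown here.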
Optimizing the Packing of Cylinders into a Rectangular Container: A Nonlinear Approach
, 2003
Abstract

Cited by 19 (1 self)
The container loading problem has important industrial and commercial applications. An increase in the number of items in a container leads to a decrease in cost. For this reason the related optimization problem is of economic importance. In this work, a procedure based on a nonlinear decision problem to solve the cylinder packing problem with identical diameters is presented. This formulation is based on the fact that the centers of the cylinders have to be inside the rectangular box defined by the base of the container (a radius far from the frontier) and far from each other at least one diameter. With this basic premise the procedure tries to find the maximum number of cylinder centers that satisfy these restrictions. The continuous nature of the problem is one of the reasons that motivated this study. A comparative study with other methods of the literature is presented and better results are achieved.
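The decision problem's two constraint families (each center at least one radius from the container frontier, and every pair of centers at least one diameter apart) are easy to state explicitly. A small feasibility check, with names chosen here for illustration:

```python
import numpy as np

def packing_feasible(centers, radius, width, height, tol=1e-9):
    """Check the two constraint families: each center at least one radius
    from every wall of the width x height base, and every pair of centers
    at least one diameter (2 * radius) apart."""
    c = np.asarray(centers, dtype=float)
    in_box = np.all((c[:, 0] >= radius - tol) & (c[:, 0] <= width - radius + tol) &
                    (c[:, 1] >= radius - tol) & (c[:, 1] <= height - radius + tol))
    for i in range(len(c)):
        for j in range(i + 1, len(c)):
            if np.linalg.norm(c[i] - c[j]) < 2 * radius - tol:
                return False          # cylinders i and j would overlap
    return bool(in_box)

# Two unit-radius cylinders in a 4 x 2 container, centers one diameter apart.
ok = packing_feasible([[1.0, 1.0], [3.0, 1.0]], radius=1.0, width=4.0, height=2.0)
bad = packing_feasible([[1.0, 1.0], [2.0, 1.0]], radius=1.0, width=4.0, height=2.0)
```

The continuous formulation then maximizes the number of centers satisfying these inequalities, rather than enumerating discrete placements.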
Structure Learning of Bayesian Networks using Constraints
Abstract

Cited by 18 (1 self)
This paper addresses exact learning of Bayesian network structure from data and expert knowledge based on score functions that are decomposable. First, it describes useful properties that strongly reduce the time and memory costs of many known methods such as hill-climbing, dynamic programming and sampling variable orderings. Second, a branch-and-bound algorithm is presented that integrates parameter and structural constraints with data in a way that guarantees global optimality with respect to the score function. It is an anytime procedure because, if stopped, it provides the best current solution and an estimate of how far it is from the global solution. We show empirically the advantages of the properties and the constraints, and the applicability of the algorithm to large data sets (up to one hundred variables) that cannot be handled by other current methods (limited to around 30 variables).
On the Asymptotic Behaviour of some New Gradient Methods
 Mathematical Programming
, 2003
Abstract

Cited by 15 (3 self)
The Barzilai-Borwein (BB) gradient method, and some other new gradient methods, have shown themselves to be competitive with conjugate gradient methods for solving large dimension nonlinear unconstrained optimization problems. Little is known about the asymptotic behaviour, even when applied to n-dimensional quadratic functions, except in the case that n = 2. We show in the quadratic case how it is possible to compute this asymptotic behaviour, and observe that as n increases there is a transition from superlinear to linear convergence at some value of n ≥ 4, depending on the method. By neglecting certain terms in the recurrence relations we define simplified versions of the methods, which are able to predict this transition. The simplified methods also predict that for larger values of n, the eigencomponents of the gradient vectors converge in modulus to a common value, which is similar to a property observed to hold in the real methods. Some unusual and interesting recurrence relations are analysed in the course of the study.
Practical active-set Euclidian trust-region method with spectral projected gradients for bound-constrained minimization
 Optimization, 54
, 2005
Abstract

Cited by 14 (2 self)
A practical active-set method for bound-constrained minimization is introduced. Within the current face the classical Euclidian trust-region method is employed. Spectral projected gradient directions are used to abandon faces. Numerical results are presented. Key words: Bound-constrained optimization, projected gradient, spectral gradient, trust regions.
A nonmonotone line search technique and its application to unconstrained optimization
 SIAM J. Optim
, 2004
Abstract

Cited by 14 (2 self)
Abstract. A new nonmonotone line search algorithm is proposed and analyzed. In our scheme, we require that an average of the successive function values decreases, while the traditional nonmonotone approach of Grippo, Lampariello, and Lucidi [SIAM J. Numer. Anal., 23 (1986), pp. 707–716] requires that a maximum of recent function values decreases. We prove global convergence for nonconvex, smooth functions, and R-linear convergence for strongly convex functions. For the L-BFGS method and the unconstrained optimization problems in the CUTE library, the new nonmonotone line search algorithm used fewer function and gradient evaluations, on average, than either the monotone or the traditional nonmonotone scheme.
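The averaged acceptance test described in this abstract can be written compactly: accept a step t when f(x + t d) <= C_k + delta t g^T d, then update Q_{k+1} = eta Q_k + 1 and C_{k+1} = (eta Q_k C_k + f_{k+1}) / Q_{k+1}, so C_k is a weighted average of past function values. A sketch with simple backtracking (the function name, halving factor, and default parameters are assumptions):

```python
def nonmonotone_backtrack(f, x, g, d, C, Q, eta=0.85, delta=1e-4):
    """Backtracking search accepting f(x + t*d) <= C + delta*t*g^T d,
    where C is a running weighted average of past function values.
    Returns the new point and the updated average pair (C, Q)."""
    gtd = sum(gi * di for gi, di in zip(g, d))    # directional derivative
    assert gtd < 0, "d must be a descent direction"
    t = 1.0
    while f([xi + t * di for xi, di in zip(x, d)]) > C + delta * t * gtd:
        t *= 0.5                                  # halve the step and retry
    x_new = [xi + t * di for xi, di in zip(x, d)]
    f_new = f(x_new)
    Q_new = eta * Q + 1.0
    C_new = (eta * Q * C + f_new) / Q_new         # weighted-average update
    return x_new, C_new, Q_new

# One step on f(x) = x^2 from x = 1 along the descent direction d = -2.
f = lambda v: sum(vi * vi for vi in v)
x_new, C_new, Q_new = nonmonotone_backtrack(f, [1.0], [2.0], [-2.0], C=1.0, Q=1.0)
```

With eta = 0 the test reduces to the ordinary monotone Armijo condition, and as eta approaches 1 it tolerates longer nonmonotone excursions, which is the knob the comparison in the abstract exploits.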