Results 1 - 5 of 5
On the convergence of the Newton/log-barrier method
 Preprint ANL/MCS-P681-0897, Mathematics and Computer Science Division, Argonne National Laboratory, Argonne, Ill.
, 1997
"... Abstract. In the Newton/logbarrier method, Newton steps are taken for the logbarrier function for a xed value of the barrier parameter until a certain convergence criterion is satis ed. The barrier parameter is then decreased and the Newton process is repeated. A naive analysis indicates that Newt ..."
Abstract

Cited by 12 (2 self)
Abstract. In the Newton/log-barrier method, Newton steps are taken for the log-barrier function for a fixed value of the barrier parameter until a certain convergence criterion is satisfied. The barrier parameter is then decreased and the Newton process is repeated. A naive analysis indicates that Newton's method does not exhibit superlinear convergence to the minimizer of each instance of the log-barrier function until it reaches a very small neighborhood of the minimizer. By partitioning according to the subspace of active constraint gradients, however, we show that this neighborhood is actually quite large, thus explaining why reasonably fast local convergence can be attained in practice. Moreover, we show that the overall convergence rate of the Newton/log-barrier algorithm is superlinear in the number of function/derivative evaluations, provided that the nonlinear program is formulated with a linear objective and that the schedule for decreasing the barrier parameter is related in a certain way to the convergence criterion for each Newton process.
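The abstract describes the method concretely: inner Newton iterations on the log-barrier function for a fixed barrier parameter, then a reduction of the parameter and a restart. A minimal one-dimensional sketch of that loop follows; the toy objective, the tolerance `inner_tol`, and the schedule `mu *= theta` are illustrative assumptions, not the paper's actual algorithm or its carefully related schedule/criterion pair:

```python
def newton_log_barrier(grad_f, hess_f, x0, mu0=1.0, theta=0.5,
                       inner_tol=0.5, mu_min=1e-8):
    """Newton/log-barrier sketch for min f(x) subject to x > 0 (scalar x).

    For each barrier parameter mu, the barrier function is
        phi(x) = f(x) - mu * log(x),
    and damped Newton steps are taken until |phi'(x)| <= inner_tol * mu
    (the inner convergence criterion); mu is then cut by the factor theta.
    """
    x, mu = x0, mu0
    while mu > mu_min:
        while True:
            g = grad_f(x) - mu / x          # phi'(x)
            if abs(g) <= inner_tol * mu:    # inner criterion met
                break
            h = hess_f(x) + mu / x ** 2     # phi''(x), positive for x > 0
            step = -g / h                   # Newton step on phi
            t = 1.0
            while x + t * step <= 0.0:      # damp to stay strictly feasible
                t *= 0.5
            x += t * step
        mu *= theta                         # tighten the barrier and repeat
    return x


# Toy problem: min x subject to x >= 0, solution x* = 0; the barrier
# minimizer for each mu is x = mu, so the iterates track mu downward.
x_star = newton_log_barrier(lambda x: 1.0, lambda x: 0.0, x0=1.0)
print(x_star)  # a small positive number near 0
```

Note that the linear objective here matches the abstract's assumption for the superlinear overall rate; the damping loop only enforces strict feasibility, not a full line search.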
Methods for nonlinear constraints in optimization calculations
 The State of the Art in Numerical Analysis
, 1996
"... ..."
Evolutionary computation techniques for nonlinear programming problems
 International Transactions of Operational Research
, 1994
"... zbyszek�mosaic.uncc.edu The paper presents several evolutionary computation techniques and discusses their applicability to nonlinear programming problems. On the basis of this presen� tation we discuss also a construction of a new hybrid optimization system � Genocop II � and present its experiment ..."
Abstract

Cited by 8 (0 self)
zbyszek@mosaic.uncc.edu The paper presents several evolutionary computation techniques and discusses their applicability to nonlinear programming problems. On the basis of this presentation, we also discuss the construction of a new hybrid optimization system, Genocop II, and present its experimental results on a few test cases (nonlinear programming problems). Keywords: evolutionary computation, genetic algorithm, random algorithm, optimization technique, nonlinear programming, constrained optimization.
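To make the paper's topic concrete, here is a minimal evolutionary sketch for a constrained nonlinear program. It uses a static quadratic penalty with truncation selection and Gaussian mutation; this is a deliberate simplification and stand-in, not Genocop II itself (which, per its authors, handles linear constraints exactly and treats nonlinear ones with a different mechanism). All parameter values and the test case are illustrative assumptions:

```python
import random

def penalty_ea(f, g, bounds, pop_size=30, gens=200, seed=0):
    """Minimal evolutionary sketch for min f(x) subject to g(x) <= 0,
    via a static quadratic penalty on the constraint violation."""
    rng = random.Random(seed)
    lo, hi = bounds

    def fitness(x):
        return f(x) + 1e3 * max(0.0, g(x)) ** 2  # penalized objective

    pop = [rng.uniform(lo, hi) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness)
        parents = pop[: pop_size // 2]            # truncation selection
        children = [min(hi, max(lo, p + rng.gauss(0.0, 0.1)))
                    for p in parents]             # Gaussian mutation
        pop = parents + children
    return min(pop, key=fitness)


# Hypothetical test case: min (x - 2)^2 subject to x <= 1; optimum x* = 1.
x_best = penalty_ea(lambda x: (x - 2) ** 2, lambda x: x - 1,
                    bounds=(-5.0, 5.0))
print(round(x_best, 2))  # near 1.0
```

The static penalty weight is the weak point of this sketch: too small and infeasible points win, too large and the search stalls at the boundary, which is one motivation for hybrid systems like Genocop II.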
Iterative Methods for Ill-Conditioned Linear Systems from Optimization
, 1998
"... Preconditioned conjugategradient methods are proposed for solving the illconditioned linear systems which arise in penalty and barrier methods for nonlinear minimization. The preconditioners are chosen so as to isolate the dominant cause of ill conditioning. The methods are stablized using a restr ..."
Abstract

Cited by 5 (1 self)
Preconditioned conjugate-gradient methods are proposed for solving the ill-conditioned linear systems which arise in penalty and barrier methods for nonlinear minimization. The preconditioners are chosen so as to isolate the dominant cause of ill conditioning. The methods are stabilized using a restricted form of iterative refinement. Numerical results illustrate the approaches considered. Email: n.gould@rl.ac.uk. Current reports available from http://www.rl.ac.uk/departments/ccd/numerical/reports/reports.html. Department for Computation and Information, Atlas Centre, Rutherford Appleton Laboratory, Oxfordshire OX11 0QX. August 26, 1998. 1 Introduction. Let A and H be, respectively, full-rank m by n (m <= n) and symmetric n by n real matrices. Suppose furthermore that any nonzero coefficients in this data are modest, that is, the data is O(1). (1) We consider the iterative solution of the linear system (H + A^T D^{-1} A) x = b (1.1), where b is modest an...
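The system (H + A^T D^{-1} A) x = b becomes ill-conditioned precisely when entries of D are small, as happens near a barrier-method solution. The snippet below is a textbook preconditioned conjugate-gradient sketch on such a system; the simple Jacobi (diagonal) preconditioner and the random test matrices are illustrative assumptions only, not the paper's preconditioners, which are built to isolate the dominant cause of the ill conditioning:

```python
import numpy as np

def pcg(matvec, b, precond, tol=1e-6, maxiter=500):
    """Textbook preconditioned conjugate gradients for K x = b, K SPD.

    `matvec` applies K; `precond` applies an approximate inverse of K.
    """
    x = np.zeros_like(b)
    r = b - matvec(x)
    z = precond(r)
    p = z.copy()
    rz = r @ z
    for _ in range(maxiter):
        Kp = matvec(p)
        alpha = rz / (p @ Kp)
        x += alpha * p
        r -= alpha * Kp
        if np.linalg.norm(r) < tol * np.linalg.norm(b):
            break
        z = precond(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p    # update the search direction
        rz = rz_new
    return x


# Barrier-type test system: K = H + A^T D^{-1} A with D small, so the
# second term dominates and K is ill-conditioned.
rng = np.random.default_rng(0)
n, m = 8, 3
H = np.eye(n)
A = rng.standard_normal((m, n))
K = H + A.T @ A / 1e-6               # D = 1e-6 * I, so D^{-1} = 1e6 * I
b = rng.standard_normal(n)
d = np.diag(K)                       # Jacobi preconditioner data
x = pcg(lambda v: K @ v, b, lambda r: r / d)
print(np.linalg.norm(K @ x - b) / np.linalg.norm(b))  # small rel. residual
```

On a problem this small CG terminates quickly regardless; the point of the paper's structured preconditioners is to keep the iteration count modest when n is large and D drives the conditioning.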
"... In this paper we discuss a construction of Genocop II, a hybrid optimization system for general nonlinear programming problems. We present the rst experimental results of the system on ve test cases. These include a variety of objective functions with nonlinear constraints. The results are encouragi ..."
Abstract
In this paper we discuss a construction of Genocop II, a hybrid optimization system for general nonlinear programming problems. We present the first experimental results of the system on five test cases. These include a variety of objective functions with nonlinear constraints. The results are encouraging.