Results 1 - 10 of 21
Minimizing a Quadratic Over a Sphere
 SIAM J. Optim.
, 2000
Abstract

Cited by 24 (2 self)
A new method, the sequential subspace method (SSM), is developed for the problem of minimizing a quadratic over a sphere. In our scheme, the quadratic is minimized over a subspace which is adjusted in successive iterations to ensure convergence to an optimum. When a sequential quadratic programming iterate is included in the subspace, convergence is locally quadratic. Numerical comparisons with other recent methods are given.
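The sphere-constrained quadratic this abstract refers to has a compact dense-case solution that makes the problem concrete. A minimal sketch, assuming the easy case (the linear term is not orthogonal to the bottom eigenvector); this is a small reference solver via the secular equation, not the SSM itself, and the name `sphere_qp` is hypothetical:

```python
import numpy as np

def sphere_qp(A, b, r, iters=200):
    """Minimize 0.5*x'Ax + b'x subject to ||x|| = r (small dense case).

    With A = V diag(w) V', the minimizer is x = -(A + lam*I)^{-1} b for
    the lam > -min(w) solving ||(A + lam*I)^{-1} b|| = r (easy case).
    """
    w, V = np.linalg.eigh(A)              # eigenvalues ascending
    c = V.T @ b                           # b in the eigenbasis
    norm = lambda lam: np.linalg.norm(c / (w + lam))
    lo = -w[0] + 1e-12                    # A + lam*I must stay PSD
    step, hi = 1.0, -w[0] + 1e-12 + 1.0
    while norm(hi) > r:                   # grow bracket until ||x(hi)|| < r
        step *= 2.0
        hi = lo + step
    for _ in range(iters):                # bisect the monotone secular eq.
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if norm(mid) > r else (lo, mid)
    return -V @ (c / (w + 0.5 * (lo + hi)))
```

The SSM avoids exactly this eigendecomposition at full dimension; the sketch is only a baseline for checking subspace methods on small instances.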
On the solution of the Tikhonov regularization of the total least squares problem
 SIAM J. Optim.
Abstract

Cited by 11 (6 self)
Abstract. Total least squares (TLS) is a method for treating an overdetermined system of linear equations Ax ≈ b, where both the matrix A and the vector b are contaminated by noise. Tikhonov regularization of the TLS (TRTLS) leads to an optimization problem of minimizing the sum of fractional quadratic and quadratic functions. As such, the problem is nonconvex. We show how to reduce the problem to a single variable minimization of a function G over a closed interval. Computing a value and a derivative of G consists of solving a single trust region subproblem. For the special case of regularization with a squared Euclidean norm we show that G is unimodal and provide an alternative algorithm, which requires only one spectral decomposition. A numerical example is given to illustrate the effectiveness of our method.
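The reduction described here ends in minimizing a single-variable function G over a closed interval, and for the squared-Euclidean-norm case G is shown to be unimodal. A generic sketch of that final step, assuming only unimodality; the real G would be evaluated by solving a trust region subproblem, while here G is any callable:

```python
import math

def golden_section(G, a, b, tol=1e-8):
    """Minimize a unimodal function G over [a, b] by golden-section search.

    Stand-in for the single-variable stage of the TRTLS reduction; each
    G evaluation there is itself a trust region subproblem solve.
    """
    inv_phi = (math.sqrt(5.0) - 1.0) / 2.0    # 1/phi ~ 0.618
    c = b - inv_phi * (b - a)
    d = a + inv_phi * (b - a)
    while b - a > tol:
        if G(c) < G(d):                       # minimizer lies in [a, d]
            b, d = d, c
            c = b - inv_phi * (b - a)
        else:                                 # minimizer lies in [c, b]
            a, c = c, d
            d = a + inv_phi * (b - a)
    return 0.5 * (a + b)
```

Golden-section needs no derivative of G; when a derivative per evaluation is available, as the abstract notes, a derivative-based 1-D method would converge faster.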
Global convergence of SSM for minimizing a quadratic over a sphere
 Math. Comp.
, 2004
Abstract

Cited by 6 (0 self)
Abstract. In an earlier paper [Minimizing a quadratic over a sphere, SIAM J. Optim., 12 (2001), 188–208], we presented the sequential subspace method (SSM) for minimizing a quadratic over a sphere. This method generates approximations to a minimizer by carrying out the minimization over a sequence of subspaces that are adjusted after each iterate is computed. We showed in this earlier paper that when the subspace contains a vector obtained by applying one step of Newton’s method to the first-order optimality system, SSM is locally, quadratically convergent, even when the original problem is degenerate with multiple solutions and with a singular Jacobian in the optimality system. In this paper, we prove (nonlocal) convergence of SSM to a global minimizer whenever each SSM subspace contains the following three vectors: (i) the current iterate, (ii) the gradient of the cost function evaluated at the current iterate, and (iii) an eigenvector associated with the smallest eigenvalue of the cost function Hessian. For nondegenerate problems, the convergence rate is at least linear when vectors (i)–(iii) are included in the SSM subspace.
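One iteration built from the three subspace vectors (i)–(iii) can be sketched as follows. This is an illustration under easy-case assumptions, not the authors' implementation: the bottom eigenvector comes from a dense eigendecomposition (a large-scale SSM code would estimate it iteratively), and the reduced 3-dimensional sphere problem is solved by bisection on its secular equation; the name `ssm_step` is hypothetical:

```python
import numpy as np

def ssm_step(A, b, r, x):
    """One illustrative SSM-style iteration for min 0.5*x'Ax + b'x, ||x|| = r.

    The subspace spans (i) the iterate x, (ii) the gradient A@x + b, and
    (iii) an eigenvector for the smallest eigenvalue of A.
    """
    g = A @ x + b
    v = np.linalg.eigh(A)[1][:, 0]                    # bottom eigenvector
    Q = np.linalg.qr(np.column_stack([x, g, v]))[0]   # orthonormal basis
    H, c = Q.T @ A @ Q, Q.T @ b                       # reduced quadratic
    w, V = np.linalg.eigh(H)
    d = V.T @ c
    norm = lambda lam: np.linalg.norm(d / (w + lam))
    lo, step = -w[0] + 1e-12, 1.0                     # H + lam*I must stay PSD
    hi = lo + step
    while norm(hi) > r:                               # bracket the secular root
        step *= 2.0
        hi = lo + step
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if norm(mid) > r else (lo, mid)
    y = -V @ (d / (w + 0.5 * (lo + hi)))              # reduced minimizer
    return Q @ y                                      # new iterate on the sphere
```

Because the current iterate lies in the subspace, the objective cannot increase from one iteration to the next, which is the mechanism behind the global convergence result.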
Iterative Linear Algebra for Constrained Optimization
, 2005
Abstract

Cited by 5 (2 self)
Each step of an interior point method for nonlinear optimization requires the solution of a symmetric indefinite linear system known as a KKT system, or more generally, a saddle point problem. As the problem size increases, direct methods become prohibitively expensive to use for solving these problems; this leads to iterative solvers being the only viable alternative. In this thesis we consider iterative methods for solving saddle point systems and show that a projected preconditioned conjugate gradient method can be applied to these indefinite systems. Such a method requires the use of a specific class of preconditioners, (extended) constraint preconditioners, which exactly replicate some parts of the saddle point system that we wish to solve. The standard method for using constraint preconditioners, at least in the optimization community, has been to choose the constraint
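The saddle point (KKT) system at the center of this thesis has a characteristic block structure. A minimal sketch that assembles and solves it directly, with a hypothetical helper name `kkt_solve`; the thesis's point is precisely that this direct route stops scaling, motivating projected preconditioned CG with constraint preconditioners:

```python
import numpy as np

def kkt_solve(H, A, g, c):
    """Assemble and directly solve the saddle point (KKT) system
        [H  A'] [x]   [-g]
        [A  0 ] [y] = [ c]
    for a small dense instance (H symmetric, A full row rank).
    """
    n, m = H.shape[0], A.shape[0]
    K = np.block([[H, A.T], [A, np.zeros((m, m))]])   # symmetric indefinite
    sol = np.linalg.solve(K, np.concatenate([-g, c]))
    return sol[:n], sol[n:]                            # primal step, multipliers
```

Note that K is symmetric but indefinite even when H is positive definite, which is why plain CG cannot be applied to it and a projected or constraint-preconditioned variant is needed.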
Solving the quadratic trust-region subproblem in a low-memory BFGS framework
 OPTIMIZATION METHODS AND SOFTWARE
, 2008
Abstract

Cited by 3 (3 self)
We present a new matrix-free method for the large-scale trust-region subproblem, assuming that the approximate Hessian is updated by the L-BFGS formula with m = 1 or 2. We determine via simple formulas the eigenvalues of these matrices and, at each iteration, we construct a positive definite matrix whose inverse can be expressed analytically, without using factorization. Consequently, a direction of negative curvature can be computed immediately by applying the inverse power method. The computation of the trial step is obtained by performing a sequence of inner products and vector summations. Furthermore, it immediately follows that the strong convergence properties of trust-region methods are preserved. Numerical results are also presented.
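The inverse power method mentioned here can be sketched generically. This is a stand-in, not the paper's method: in the L-BFGS (m = 1 or 2) setting the shifted inverse is available in closed form, whereas this sketch calls a dense solver each step; the name `neg_curvature_direction` is hypothetical:

```python
import numpy as np

def neg_curvature_direction(B, shift, iters=100):
    """Inverse power iteration on B - shift*I: converges to the eigenvector
    of the symmetric matrix B whose eigenvalue is nearest `shift`.

    With `shift` below lambda_min(B), this recovers a direction of
    most-negative curvature of B.
    """
    n = B.shape[0]
    M = B - shift * np.eye(n)
    v = np.ones(n) / np.sqrt(n)        # generic starting vector
    for _ in range(iters):
        v = np.linalg.solve(M, v)      # amplify the component nearest shift
        v /= np.linalg.norm(v)
    return v
```

Once such a direction v with v'Bv < 0 is in hand, moving along it to the trust-region boundary strictly decreases the quadratic model, which is why these directions matter for the subproblem.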
A Survey of the Trust Region Subproblem within a Semidefinite Framework
, 2000
Abstract

Cited by 3 (0 self)
Trust region subproblems arise within a class of unconstrained methods called trust region methods. The subproblems consist of minimizing a quadratic function subject to a norm constraint. This thesis is a survey of different methods developed to find an approximate solution to the subproblem. We study the well-known method of Moré and Sorensen [18] and two recent methods for large sparse subproblems: the so-called Lanczos method of Gould et al. [7] and the Rendl and Wolkowicz algorithm [31]. The common ground to explore these methods will be semidefinite programming. This approach has been used by Rendl and Wolkowicz [31] to explain their method and the Moré and Sorensen algorithm; we extend this work to the Lanczos method. The last chapter of this thesis is dedicated to some improvements done to the Rendl and Wolkowicz algorithm and to comparisons between the Lanczos method and the Rendl and Wolkowicz algorithm. In particular, we show some weakness of the Lanczos method and show that ...
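The Moré–Sorensen method surveyed here is a Newton iteration on the secular equation 1/Δ − 1/‖p(λ)‖ = 0. A minimal sketch of that core loop, assuming the easy, boundary-solution case with no hard-case safeguards (which the full algorithm supplies); the name `more_sorensen` is just a label for this sketch:

```python
import numpy as np

def more_sorensen(B, g, delta, iters=50, tol=1e-10):
    """Newton iteration on 1/delta - 1/||p(lam)|| = 0 for the trust-region
    subproblem min 0.5*p'Bp + g'p, ||p|| <= delta (easy case only).

    Each step Cholesky-factors B + lam*I = L L' and reuses the factor for
    both p(lam) and the Newton correction to lam.
    """
    n = B.shape[0]
    lam = max(0.0, -np.linalg.eigvalsh(B)[0] + 1e-8)   # keep B + lam*I PD
    p = np.zeros(n)
    for _ in range(iters):
        L = np.linalg.cholesky(B + lam * np.eye(n))
        p = np.linalg.solve(L.T, np.linalg.solve(L, -g))
        pn = np.linalg.norm(p)
        if pn <= delta and lam <= 1e-12:
            return p, 0.0                              # interior minimizer
        if abs(pn - delta) < tol * delta:
            break                                      # boundary root found
        q = np.linalg.solve(L, p)                      # q = L^{-1} p
        lam += (pn / np.linalg.norm(q)) ** 2 * (pn - delta) / delta
    return p, lam
```

Per factorization this costs one Cholesky plus triangular solves, which is exactly the expense that motivates the large-sparse alternatives (the Lanczos method and the Rendl–Wolkowicz algorithm) the survey compares.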
A SUBSPACE MINIMIZATION METHOD FOR THE TRUST-REGION STEP
Abstract

Cited by 1 (0 self)
Abstract. We consider methods for large-scale unconstrained minimization based on finding an approximate minimizer of a quadratic function subject to a two-norm trust-region constraint. The Steihaug-Toint method uses the conjugate-gradient (CG) algorithm to minimize the quadratic over a sequence of expanding subspaces until the iterates either converge to an interior point or cross the constraint boundary. However, if the CG method is used with a preconditioner, the Steihaug-Toint method requires that the trust-region norm be defined in terms of the preconditioning matrix. If a different preconditioner is used for each subproblem, the shape of the trust region can change substantially from one subproblem to the next, which invalidates many of the assumptions on which standard methods for adjusting the trust-region radius are based. In this paper we propose a method that allows the trust-region norm to be defined independently of the preconditioner. The method solves the inequality-constrained trust-region subproblem over a sequence of evolving low-dimensional subspaces. Each subspace includes an accelerator direction defined by a regularized Newton method for satisfying the optimality conditions of a primal-dual interior method. A crucial property of this direction is that it can be computed by applying the preconditioned CG method to a positive-definite system in both the primal and dual variables of the trust-region subproblem. Numerical experiments on problems from the CUTEr test collection indicate that the method can require significantly fewer function evaluations than other methods. In addition, experiments with general-purpose preconditioners show that it is possible to significantly reduce the number of matrix-vector products relative to those required without preconditioning.
Key words. Large-scale unconstrained optimization, trust-region methods, conjugate-gradient methods, Krylov methods, preconditioning.
AMS subject classifications.
49M37, 65F10, 65K05, 65K10, 90C06, 90C26, 90C30
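The Steihaug-Toint baseline that this paper builds on can be sketched compactly. An unpreconditioned sketch (the paper's contribution, a preconditioner-independent trust-region norm, is not reproduced here); `steihaug_cg` and `_boundary_tau` are hypothetical names:

```python
import numpy as np

def steihaug_cg(H, g, delta, tol=1e-10, max_iter=None):
    """Steihaug-Toint truncated CG for min 0.5*p'Hp + g'p, ||p|| <= delta.

    Runs CG on H p = -g from p = 0, stopping at the trust-region boundary
    if a step would leave it or negative curvature is encountered.
    """
    n = g.size
    max_iter = max_iter or 2 * n
    p = np.zeros(n)
    r = -g.astype(float)                      # residual of H p = -g at p = 0
    d = r.copy()
    if np.linalg.norm(r) < tol:
        return p
    for _ in range(max_iter):
        Hd = H @ d
        dHd = d @ Hd
        if dHd <= 0:                          # negative curvature: hit boundary
            return p + _boundary_tau(p, d, delta) * d
        alpha = (r @ r) / dHd
        p_next = p + alpha * d
        if np.linalg.norm(p_next) >= delta:   # step crosses the boundary
            return p + _boundary_tau(p, d, delta) * d
        r_next = r - alpha * Hd
        if np.linalg.norm(r_next) < tol:
            return p_next                     # interior (near-)minimizer
        beta = (r_next @ r_next) / (r @ r)
        d = r_next + beta * d
        p, r = p_next, r_next
    return p

def _boundary_tau(p, d, delta):
    """Positive tau with ||p + tau*d|| = delta (quadratic formula)."""
    a, b, c = d @ d, 2 * p @ d, p @ p - delta ** 2
    return (-b + np.sqrt(b * b - 4 * a * c)) / (2 * a)
```

In the preconditioned variant, the inner products above are taken in the preconditioner's metric, which is exactly why the trust-region norm becomes tied to the preconditioner, the drawback the paper addresses.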
Second-order-cone constraints for extended trust-region subproblems
, 2011
Abstract

Cited by 1 (1 self)
The classical trust-region subproblem (TRS) minimizes a nonconvex quadratic objective over the unit ball. In this paper, we consider extensions of TRS having extra constraints. When two parallel cuts are added to TRS, we show that the resulting nonconvex problem has an exact representation as a semidefinite program with additional linear and second-order-cone constraints. For the case where an additional ellipsoidal constraint is added to TRS, resulting in the “two trust-region subproblem” (TTRS), we provide a new relaxation including second-order-cone constraints that strengthens the usual SDP relaxation.
Training Deep and Recurrent Networks with Hessian-Free Optimization
, 2012
Abstract

Cited by 1 (1 self)
Hessian-free optimization (HF) is an approach for unconstrained minimization of real-valued smooth objective functions. Like standard Newton’s method, it uses local quadratic approximations to generate update proposals. It belongs to the broad class of approximate Newton methods that are practical for problems
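The kernel that makes such methods "Hessian-free" is the Hessian-vector product, computable without ever forming the Hessian. A minimal finite-difference sketch (practical HF implementations typically use exact automatic-differentiation products instead); the name `hess_vec` is hypothetical:

```python
import numpy as np

def hess_vec(grad, x, v, eps=1e-6):
    """Finite-difference Hessian-vector product:
    H @ v ~ (grad(x + eps*v) - grad(x)) / eps.

    This is all an inner CG solve on the local quadratic model needs,
    so the Hessian is never formed or stored.
    """
    return (grad(x + eps * v) - grad(x)) / eps
```

Each CG iteration on the quadratic model then costs one extra gradient evaluation, which is what makes the approach feasible for networks with millions of parameters.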