Results 1–10 of 29
A Singular Value Thresholding Algorithm for Matrix Completion
, 2008
"... This paper introduces a novel algorithm to approximate the matrix with minimum nuclear norm among all matrices obeying a set of convex constraints. This problem may be understood as the convex relaxation of a rank minimization problem, and arises in many important applications as in the task of reco ..."
Abstract

Cited by 204 (12 self)
 Add to MetaCart
This paper introduces a novel algorithm to approximate the matrix with minimum nuclear norm among all matrices obeying a set of convex constraints. This problem may be understood as the convex relaxation of a rank minimization problem, and arises in many important applications, as in the task of recovering a large matrix from a small subset of its entries (the famous Netflix problem). Off-the-shelf algorithms such as interior point methods are not directly amenable to large problems of this kind with over a million unknown entries. This paper develops a simple first-order and easy-to-implement algorithm that is extremely efficient at addressing problems in which the optimal solution has low rank. The algorithm is iterative and produces a sequence of matrices {X^k, Y^k}, and at each step mainly performs a soft-thresholding operation on the singular values of the matrix Y^k. There are two remarkable features making this attractive for low-rank matrix completion problems. The first is that the soft-thresholding operation is applied to a sparse matrix; the second is that the rank of the iterates {X^k} is empirically nondecreasing. Both these facts allow the algorithm to use very minimal storage space and keep the computational cost of each iteration low.
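The singular-value soft-thresholding operation this abstract describes can be sketched in a few lines of NumPy; the function name `svt` and the threshold parameter `tau` below are our illustrative choices, not the paper's notation:

```python
import numpy as np

def svt(Y, tau):
    """Soft-threshold the singular values of Y at level tau.

    Computes U * max(S - tau, 0) * V^T from the SVD Y = U * S * V^T;
    this shrinkage is the main per-iteration operation of the algorithm.
    """
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    s_shrunk = np.maximum(s - tau, 0.0)   # shrink; small singular values vanish
    return U @ np.diag(s_shrunk) @ Vt
```

Only singular values above `tau` survive, so the output rank drops, which is why low-rank solutions keep storage and per-step cost small.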
Nonmonotone spectral projected gradient methods on convex sets
 SIAM Journal on Optimization
, 2000
"... Abstract. Nonmonotone projected gradient techniques are considered for the minimization of differentiable functions on closed convex sets. The classical projected gradient schemes are extended to include a nonmonotone steplength strategy that is based on the Grippo–Lampariello–Lucidi nonmonotone lin ..."
Abstract

Cited by 135 (25 self)
 Add to MetaCart
Nonmonotone projected gradient techniques are considered for the minimization of differentiable functions on closed convex sets. The classical projected gradient schemes are extended to include a nonmonotone steplength strategy that is based on the Grippo–Lampariello–Lucidi nonmonotone line search. In particular, the nonmonotone strategy is combined with the spectral gradient choice of steplength to accelerate the convergence process. In addition to the classical projected gradient nonlinear path, the feasible spectral projected gradient is used as a search direction to avoid additional trial projections during the one-dimensional search process. Convergence properties and extensive numerical results are presented.
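As a rough sketch of the scheme just described, restricted to a box constraint for simplicity: the helper name `spg_box`, the tolerances, and the memory length `m` are our illustrative choices, combining the Barzilai–Borwein spectral steplength with the Grippo–Lampariello–Lucidi nonmonotone acceptance test.

```python
import numpy as np

def spg_box(f, grad, x0, lo, hi, iters=200, m=10):
    """Nonmonotone spectral projected gradient on a box (illustrative sketch)."""
    proj = lambda z: np.clip(z, lo, hi)
    x = proj(np.asarray(x0, dtype=float))
    g = grad(x)
    alpha = 1.0                          # initial spectral steplength
    hist = [f(x)]                        # recent f-values for the GLL reference
    for _ in range(iters):
        d = proj(x - alpha * g) - x      # feasible spectral projected direction
        if np.linalg.norm(d) < 1e-10:    # stationarity measure
            break
        lam = 1.0
        fref = max(hist[-m:])            # nonmonotone (GLL) reference value
        while f(x + lam * d) > fref + 1e-4 * lam * (g @ d):
            lam *= 0.5                   # backtrack along the feasible path
        s = lam * d
        x = x + s
        g_new = grad(x)
        y = g_new - g
        sy = s @ y
        alpha = (s @ s) / sy if sy > 1e-12 else 1.0  # Barzilai-Borwein steplength
        g = g_new
        hist.append(f(x))
    return x
```

Using the projected direction `d` as the search direction is exactly what avoids additional trial projections during the backtracking loop.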
Two-metric projection methods for constrained optimization
 SIAM Journal on Control and Optimization
, 1984
"... Abstract. This paper is concerned with the problem min {f(x)lx X} where X is a convex subset of a linear space H, and f is a smooth realvalued function on H. We propose the class of methods Xk+l P(xk akgk), where P denotes projection on X with respect to a Hilbert space norm II ’ [I, gk denotes th ..."
Abstract

Cited by 44 (2 self)
 Add to MetaCart
This paper is concerned with the problem min {f(x) | x ∈ X}, where X is a convex subset of a linear space H and f is a smooth real-valued function on H. We propose the class of methods x_{k+1} = P(x_k − a_k g_k), where P denotes projection on X with respect to a Hilbert space norm ‖·‖, g_k denotes the Fréchet derivative of f at x_k with respect to another Hilbert space norm ‖·‖_k on H, and a_k is a positive scalar stepsize. We thus remove an important restriction in the original proposal of Goldstein and Levitin and Poljak [2], where the norms ‖·‖ and ‖·‖_k must be the same. It is therefore possible to match the norm ‖·‖ to the structure of X so that the projection operation is simplified, while at the same time reserving the option to choose ‖·‖_k on the basis of approximations to the Hessian of f so as to attain a typically superlinear rate of convergence. The resulting methods are particularly attractive for large-scale problems with specially structured constraint sets, such as optimal control and nonlinear multicommodity network flow problems. The latter class of problems is discussed in some detail. Key words: constrained optimization, gradient projection, convergence analysis, multicommodity flow problems, large-scale optimization.
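For a box constraint the idea admits a one-line sketch; this is our illustrative code with a diagonal Hessian approximation `diag_H`, and the full method additionally restricts the scaling on nearly active indices to guarantee convergence:

```python
import numpy as np

def two_metric_step(x, g, diag_H, lo, hi, step=1.0):
    """One two-metric projection step on a box (simplified sketch).

    The gradient g is scaled by a diagonal Hessian approximation (one
    norm), while the projection remains the cheap Euclidean clip onto
    the box (the other norm) -- the decoupling described in the abstract.
    """
    return np.clip(x - step * g / diag_H, lo, hi)
```

With an exact diagonal Hessian and a separable quadratic, a single step lands on the projected unconstrained minimizer, illustrating the Newton-like behavior the two-metric choice is after.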
Linear convergence of iterative soft-thresholding
 J. Fourier Anal. Appl
"... ABSTRACT. In this article a unified approach to iterative softthresholding algorithms for the solution of linear operator equations in infinite dimensional Hilbert spaces is presented. We formulate the algorithm in the framework of generalized gradient methods and present a new convergence analysis ..."
Abstract

Cited by 33 (9 self)
 Add to MetaCart
In this article a unified approach to iterative soft-thresholding algorithms for the solution of linear operator equations in infinite-dimensional Hilbert spaces is presented. We formulate the algorithm in the framework of generalized gradient methods and present a new convergence analysis. As the main result we show that the algorithm converges with a linear rate as soon as the underlying operator satisfies the so-called finite basis injectivity property or the minimizer possesses a so-called strict sparsity pattern. Moreover, it is shown that the constants can be calculated explicitly in special cases (i.e., for compact operators). Furthermore, the techniques can also be used to establish linear convergence for related methods such as the iterative thresholding algorithm for joint sparsity and the accelerated gradient projection method.
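In the finite-dimensional case the iteration being analyzed reduces to the familiar scheme below; the operator `A`, the penalty `lam`, and the step 1/L are our illustrative choices:

```python
import numpy as np

def soft(z, t):
    """Componentwise soft-thresholding: sign(z) * max(|z| - t, 0)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def ista(A, b, lam, iters=500):
    """Iterative soft-thresholding for min 0.5*||Ax - b||^2 + lam*||x||_1."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        x = soft(x - A.T @ (A @ x - b) / L, lam / L)
    return x
```

The linear-rate results in the paper concern exactly when this simple loop stops being slow: a finite basis injectivity property of the operator, or strict sparsity of the minimizer.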
A new active set algorithm for box constrained optimization
 SIAM Journal on Optimization
, 2006
"... Abstract. An active set algorithm (ASA) for box constrained optimization is developed. The algorithm consists of a nonmonotone gradient projection step, an unconstrained optimization step, and a set of rules for branching between the two steps. Global convergence to a stationary point is established ..."
Abstract

Cited by 26 (6 self)
 Add to MetaCart
An active set algorithm (ASA) for box-constrained optimization is developed. The algorithm consists of a nonmonotone gradient projection step, an unconstrained optimization step, and a set of rules for branching between the two steps. Global convergence to a stationary point is established. For a nondegenerate stationary point, the algorithm eventually reduces to unconstrained optimization without restarts. Similarly, for a degenerate stationary point where the strong second-order sufficient optimality condition holds, the algorithm eventually reduces to unconstrained optimization without restarts. A specific implementation of the ASA is given which exploits the recently developed cyclic Barzilai–Borwein (CBB) algorithm for the gradient projection step and the recently developed conjugate gradient algorithm CG_DESCENT for unconstrained optimization. Numerical experiments are presented using box-constrained problems in the CUTEr and MINPACK-2 test problem libraries. Key words: nonmonotone gradient projection, box-constrained optimization, active set algorithm.
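The branching rules hinge on guessing which bounds are active; a toy version of that identification step (a hypothetical helper, not the paper's actual rule) could read:

```python
import numpy as np

def guess_active(x, g, lo, hi, alpha=1.0, tol=1e-10):
    """Guess the active bounds from one gradient projection step (sketch)."""
    xp = np.clip(x - alpha * g, lo, hi)       # projected gradient point
    at_lo = np.abs(xp - lo) < tol
    at_hi = np.abs(xp - hi) < tol
    return np.where(at_lo | at_hi)[0]         # indices held at a bound
```

Once the guessed set settles, the remaining free variables define the unconstrained subproblem handed to a method like CG_DESCENT.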
Theory and implementation of numerical methods based on Runge-Kutta integration for solving optimal control problems
, 1996
"... ..."
An implementable proximal point algorithmic framework for nuclear norm minimization
, 2010
"... The nuclear norm minimization problem is to find a matrix with the minimum nuclear norm subject to linear and second order cone constraints. Such a problem often arises from the convex relaxation of a rank minimization problem with noisy data, and arises in many fields of engineering and science. In ..."
Abstract

Cited by 22 (3 self)
 Add to MetaCart
The nuclear norm minimization problem is to find a matrix with the minimum nuclear norm subject to linear and second-order cone constraints. Such a problem often arises from the convex relaxation of a rank minimization problem with noisy data, and arises in many fields of engineering and science. In this paper, we study inexact proximal point algorithms in the primal, dual, and primal-dual forms for solving the nuclear norm minimization problem with linear equality and second-order cone constraints. We design efficient implementations of these algorithms and present comprehensive convergence results. In particular, we investigate the performance of our proposed algorithms in which the inner subproblems are approximately solved by the gradient projection method or the accelerated proximal gradient method. Our numerical results for solving randomly generated matrix completion problems and real matrix completion problems show that our algorithms perform favorably in comparison with several recently proposed state-of-the-art algorithms. Interestingly, our proposed algorithms are connected with other algorithms that have been studied in the literature. Key words: nuclear norm minimization, proximal point method, rank minimization, gradient projection method, accelerated proximal gradient method.
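Stripped of the matrix structure, the outer loop studied here is the proximal point iteration x_{k+1} = prox_{c f}(x_k). A scalar sketch, using |x| as a stand-in objective whose prox is soft-thresholding (our toy example, not the paper's setting):

```python
def proximal_point(prox, x0, c=1.0, iters=50):
    """Generic exact proximal point iteration x_{k+1} = prox_{c f}(x_k)."""
    x = x0
    for _ in range(iters):
        x = prox(x, c)
    return x

# The prox of c*|x| is scalar soft-thresholding.
def soft1d(x, c):
    if abs(x) <= c:
        return 0.0
    return (abs(x) - c) * (1.0 if x > 0 else -1.0)
```

The inexact variants in the paper replace the exact `prox` call with an approximate inner solve (by gradient projection or accelerated proximal gradient), which is what makes the scheme implementable for nuclear norm problems.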
Convergence rates in forward-backward splitting
, 1989
"... Forwardbackward splitting methods provide a range of approaches to solving largescale optimization problems and variational inequalities in which structure conducive to decomposition can be utilized. Apart from special cases where the forward step is absent and a version of the proximal point alg ..."
Abstract

Cited by 21 (3 self)
 Add to MetaCart
Forward-backward splitting methods provide a range of approaches to solving large-scale optimization problems and variational inequalities in which structure conducive to decomposition can be utilized. Apart from special cases where the forward step is absent and a version of the proximal point algorithm comes out, efforts at evaluating the convergence potential of such methods have so far relied on Lipschitz properties and strong monotonicity, or inverse strong monotonicity, of the mapping involved in the forward step, the perspective mainly being that of projection algorithms. Here convergence is analyzed by a technique that allows properties of the mapping in the backward step to be brought in as well. For the first time in such a general setting, global and local contraction rates are derived, moreover in a form making it possible to determine the optimal step size relative to certain constants associated with the given problem. Insights are thereby gained into the effects of shifting strong monotonicity between the forward and backward mappings when a splitting is selected.
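The splitting itself is simple to state in code: an explicit ("forward") gradient step on the smooth part, followed by an implicit ("backward") proximal step on the rest. A minimal sketch under our own illustrative signatures:

```python
import numpy as np

def forward_backward(grad_f, prox_g, x0, step, iters=200):
    """Forward-backward splitting for min f(x) + g(x) (sketch).

    grad_f is the forward (explicit) mapping; prox_g(z, t) is the
    backward (resolvent) step on g with parameter t.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = prox_g(x - step * grad_f(x), step)
    return x
```

When `grad_f` is absent this collapses to the proximal point algorithm, the special case the abstract mentions; the paper's contraction rates govern how the choice of `step` trades off against the monotonicity constants of the two mappings.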
On the linear convergence of descent methods for convex essentially smooth minimization
 SIAM J. Control Optim
, 1992
"... Dedicated to those courageous people who, on June 4, 1989, sacrificed their lives in ..."
Abstract

Cited by 20 (7 self)
 Add to MetaCart
Dedicated to those courageous people who, on June 4, 1989, sacrificed their lives in
On Iterative Algorithms for Linear Least Squares Problems With Bound Constraints
, 1995
"... Three new iterative methods for the solution of the linear least squares problem with bound constraints are presented and their performance analyzed. The first is a modification of a method proposed by Lotstedt, while the two others are characterized by a technique allowing for fast active set chang ..."
Abstract

Cited by 15 (2 self)
 Add to MetaCart
Three new iterative methods for the solution of the linear least squares problem with bound constraints are presented and their performance analyzed. The first is a modification of a method proposed by Lötstedt, while the other two are characterized by a technique allowing for fast active set changes, resulting in noticeable improvements in the speed at which constraints active at the solution are identified. The numerical efficiency of these algorithms is studied experimentally, with particular emphasis on the dependence on the choice of starting point and the use of preconditioning for ill-conditioned problems.
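A plain projected gradient baseline for this problem class can be sketched as follows (our sketch; the paper's three methods improve on exactly this kind of iteration with fast active set changes and preconditioning):

```python
import numpy as np

def bounded_lsq(A, b, lo, hi, iters=500):
    """Projected gradient for min 0.5*||Ax - b||^2 subject to lo <= x <= hi."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2    # 1/L with L = ||A||_2^2
    x = np.clip(np.zeros(A.shape[1]), lo, hi)
    for _ in range(iters):
        x = np.clip(x - step * A.T @ (A @ x - b), lo, hi)
    return x
```

The weakness the paper targets is visible here: the active set can change by only a little per iteration, so identifying the constraints active at the solution can take many steps for ill-conditioned `A`.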