Distributed Optimization and Statistical Learning via the Alternating Direction Method of Multipliers
, 2010
"... ..."
(Show Context)
On the Douglas–Rachford splitting method and the proximal point algorithm for maximal monotone operators
, 1992
"... ..."
A Modified Forward–Backward Splitting Method For Maximal Monotone Mappings
 SIAM J. Control Optim.
, 1998
"... We consider the forwardbackward splitting method for finding a zero of the sum of two maximal monotone mappings. This method is known to converge when the inverse of the forward mapping is strongly monotone. We propose a modification to this method, in the spirit of the extragradient method for mon ..."
Abstract

Cited by 94 (0 self)
We consider the forward–backward splitting method for finding a zero of the sum of two maximal monotone mappings. This method is known to converge when the inverse of the forward mapping is strongly monotone. We propose a modification to this method, in the spirit of the extragradient method for monotone variational inequalities, under which the method converges assuming only that the forward mapping is monotone and Lipschitz continuous on some closed convex subset of its domain. The modification entails an additional forward step and a projection step at each iteration. Applications of the modified method to decomposition in convex programming and monotone variational inequalities are discussed.
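The extra forward step and prox (projection/resolvent) step described above can be sketched in a few lines. This is a minimal NumPy illustration on a toy monotone inclusion, not the authors' implementation: the forward mapping is a skew-plus-small-PSD linear operator (monotone and Lipschitz but not cocoercive), the backward mapping is the ℓ1 subdifferential, and the step size and iteration count are illustrative choices.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t*||.||_1, i.e. the resolvent of t * (l1 subdifferential).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def tseng_fbf(F, prox, x0, step, iters=500):
    """Forward-backward-forward iteration for 0 in F(x) + B(x), with
    prox = resolvent of step*B.  F need only be monotone and Lipschitz;
    the extra forward (correction) step removes the usual cocoercivity
    requirement of plain forward-backward splitting."""
    x = x0
    for _ in range(iters):
        y = prox(x - step * F(x))       # forward step, then backward (resolvent) step
        x = y + step * (F(x) - F(y))    # additional forward step
    return x

# Toy inclusion: F(x) = M x with M = skew + small PSD part, B = l1 subdifferential.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
M = (A - A.T) + 0.1 * np.eye(5)         # monotone, Lipschitz, not cocoercive
F = lambda x: M @ x
step = 0.9 / np.linalg.norm(M, 2)       # step below 1/L, L = Lipschitz constant of F
x = tseng_fbf(F, lambda v: soft_threshold(v, step), np.ones(5), step)

# x solves the inclusion iff it is a fixed point of the prox-gradient map.
residual = np.linalg.norm(x - soft_threshold(x - step * F(x), step))
```

Here the unique zero is x = 0 (the forward mapping is strongly monotone), so the fixed-point residual tends to zero.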
FIXED-POINT CONTINUATION FOR ℓ1-MINIMIZATION: METHODOLOGY AND CONVERGENCE
"... We present a framework for solving largescale ℓ1regularized convex minimization problem: min �x�1 + µf(x). Our approach is based on two powerful algorithmic ideas: operatorsplitting and continuation. Operatorsplitting results in a fixedpoint algorithm for any given scalar µ; continuation refers ..."
Abstract

Cited by 67 (9 self)
We present a framework for solving the large-scale ℓ1-regularized convex minimization problem: min ‖x‖₁ + µf(x). Our approach is based on two powerful algorithmic ideas: operator splitting and continuation. Operator splitting results in a fixed-point algorithm for any given scalar µ; continuation refers to approximately following the path traced by the optimal value of x as µ increases. In this paper, we study the structure of optimal solution sets, prove finite convergence for important quantities, and establish q-linear convergence rates for the fixed-point algorithm applied to problems with f(x) convex, but not necessarily strictly convex. The continuation framework, motivated by our convergence results, is demonstrated to facilitate the construction of practical algorithms.
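A minimal sketch of the fixed-point iteration with continuation, assuming the smooth term is f(x) = ½‖Ax − b‖²; the continuation schedule (`mu_factors`), step-size rule, and problem data below are illustrative choices, not the paper's tested algorithm.

```python
import numpy as np

def shrink(v, t):
    # Soft-thresholding: proximal operator of t * ||.||_1.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def fpc(A, b, mu_target, mu_factors=(0.1, 0.5, 1.0), iters=300):
    """Fixed-point continuation sketch for  min ||x||_1 + mu * 0.5*||Ax - b||^2.
    Operator splitting gives the fixed-point map x -> shrink(x - tau*mu*grad f(x), tau);
    continuation warm-starts the iteration along an increasing sequence of mu values."""
    Lf = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of grad f
    x = np.zeros(A.shape[1])
    for frac in mu_factors:                 # continuation path toward mu_target
        mu = frac * mu_target
        tau = 1.0 / (mu * Lf)               # effective gradient step tau*mu = 1/Lf
        for _ in range(iters):
            grad = A.T @ (A @ x - b)
            x = shrink(x - tau * mu * grad, tau)
    return x

rng = np.random.default_rng(1)
A = rng.standard_normal((40, 100))
x_true = np.zeros(100)
x_true[:5] = 3.0                            # sparse ground truth
b = A @ x_true
x = fpc(A, b, mu_target=100.0)
obj = np.sum(np.abs(x)) + 100.0 * 0.5 * np.sum((A @ x - b) ** 2)
obj_at_zero = 100.0 * 0.5 * np.sum(b ** 2)  # objective value at x = 0
```

The warm-started stages mimic following the solution path as µ grows; the final objective is far below the trivial value at x = 0.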
A primal–dual splitting method for convex optimization involving Lipschitzian, proximable and linear composite terms
, 2013
"... We propose a new firstorder splitting algorithm for solving jointly the primal and dual formulations of largescale convex minimization problems involving the sum of a smooth function with Lipschitzian gradient, a nonsmooth proximable function, and linear composite functions. This is a full splitti ..."
Abstract

Cited by 60 (10 self)
We propose a new first-order splitting algorithm for solving jointly the primal and dual formulations of large-scale convex minimization problems involving the sum of a smooth function with Lipschitzian gradient, a nonsmooth proximable function, and linear composite functions. This is a full splitting approach, in the sense that the gradient and the linear operators involved are applied explicitly without any inversion, while the nonsmooth functions are processed individually via their proximity operators. This work brings together and notably extends several classical splitting schemes, like the forward–backward and Douglas–Rachford methods, as well as the recent primal–dual method of Chambolle and Pock designed for problems with linear composite terms.
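The full-splitting structure (explicit gradient on the smooth term, proximity operators on the nonsmooth terms, L and Lᵀ only applied, never inverted) can be sketched as follows. This is a hedged toy instance, 1-D total-variation denoising with a nonnegativity constraint, with a standard primal-dual step-size condition; it is not the paper's code, and the data and parameters are illustrative.

```python
import numpy as np

def primal_dual(grad_f, prox_g, prox_hconj, L, x0, y0, tau, sigma, iters=2000):
    """Primal-dual splitting for min_x f(x) + g(x) + h(Lx): gradient step on f,
    prox on g, prox of the conjugate h* on the dual variable; the linear map L
    is only multiplied, never inverted."""
    x, y = x0, y0
    for _ in range(iters):
        x_new = prox_g(x - tau * (grad_f(x) + L.T @ y))
        y = prox_hconj(y + sigma * (L @ (2 * x_new - x)))
        x = x_new
    return x, y

# Toy problem: f(x) = 0.5*||x - b||^2, g = indicator of x >= 0,
# h = lam*||.||_1, L = first-difference operator (1-D TV denoising).
n, lam = 50, 2.0
b = np.concatenate([np.full(25, 1.0), np.full(25, 4.0)]) \
    + 0.1 * np.random.default_rng(2).standard_normal(n)
L = (np.eye(n, k=1) - np.eye(n))[:-1]        # (n-1) x n difference matrix
beta = 1.0                                   # Lipschitz constant of grad f
sigma = 1.0
tau = 1.0 / (beta / 2 + sigma * np.linalg.norm(L, 2) ** 2)  # step-size condition
x, _ = primal_dual(
    grad_f=lambda v: v - b,
    prox_g=lambda v: np.maximum(v, 0.0),          # projection onto x >= 0
    prox_hconj=lambda v: np.clip(v, -lam, lam),   # h* = indicator of ||.||_inf <= lam
    L=L, x0=np.zeros(n), y0=np.zeros(n - 1), tau=tau, sigma=sigma,
)
tv_x = np.sum(np.abs(L @ x))                 # total variation of the estimate
tv_b = np.sum(np.abs(L @ b))                 # total variation of the noisy input
```

The primal iterate stays nonnegative by construction, and the TV penalty smooths the noise while keeping the jump.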
A MONOTONE + SKEW SPLITTING MODEL FOR COMPOSITE MONOTONE INCLUSIONS IN DUALITY
, 2011
"... The principle underlying this paper is the basic observation that the problem of simultaneously solving a large class of composite monotone inclusions and their duals can be reduced to that of finding a zero of the sum of a maximally monotone operator and a linear skewadjoint operator. An algorith ..."
Abstract

Cited by 40 (0 self)
The principle underlying this paper is the basic observation that the problem of simultaneously solving a large class of composite monotone inclusions and their duals can be reduced to that of finding a zero of the sum of a maximally monotone operator and a linear skew-adjoint operator. An algorithmic framework is developed for solving this generic problem in a Hilbert space setting. New primal–dual splitting algorithms are derived from this framework for inclusions involving composite monotone operators, and convergence results are established. These algorithms draw their simplicity and efficacy from the fact that they operate in a fully decomposed fashion, in the sense that the monotone operators and the linear transformations involved are activated separately at each iteration. Comparisons with existing methods are made and applications to composite variational problems are demonstrated.
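The monotone + skew reduction can be illustrated numerically. Below is a sketch, under toy choices of the operators (A the gradient of ½‖x − b‖², B the ℓ1 subdifferential, L a random matrix), of the product-space pair M = A × B⁻¹ (maximally monotone) and S(x, y) = (Lᵀy, −Lx) (linear skew-adjoint), with Tseng-style forward-backward-forward iterations applied to M + S; every operator and parameter here is illustrative, not the paper's general framework.

```python
import numpy as np

rng = np.random.default_rng(3)
n, m, lam = 6, 4, 0.1
L = rng.standard_normal((m, n))
b = rng.standard_normal(n)

# Product-space operators for 0 in A(x) + L^T B(Lx):
# S(x, y) = (L^T y, -L x) is the linear skew-adjoint coupling.
S = lambda x, y: (L.T @ y, -L @ x)

def resolvent_M(x, y, g):
    # J_{gA}(x) = (x + g*b)/(1 + g) for A = Id - b, and J_{gB^{-1}}(y) is the
    # projection onto [-lam, lam] (B^{-1} is the subdifferential of the
    # conjugate of lam*||.||_1).
    return (x + g * b) / (1 + g), np.clip(y, -lam, lam)

# Skew-adjointness check: <S(x, y), (x, y)> = 0 for every (x, y).
xt, yt = rng.standard_normal(n), rng.standard_normal(m)
Sx, Sy = S(xt, yt)
skew_inner = Sx @ xt + Sy @ yt

# Forward-backward-forward iterations on M + S: S is Lipschitz but not
# cocoercive, so plain forward-backward splitting would not apply.
g = 0.9 / np.linalg.norm(L, 2)          # step below 1/||S||, and ||S|| = ||L||
x, y = np.zeros(n), np.zeros(m)
for _ in range(10000):
    fx, fy = S(x, y)
    px, py = resolvent_M(x - g * fx, y - g * fy, g)
    qx, qy = S(px, py)
    x, y = px + g * (fx - qx), py + g * (fy - qy)

# Fixed-point residual: zero exactly at a primal-dual solution pair.
fx, fy = S(x, y)
px, py = resolvent_M(x - g * fx, y - g * fy, g)
residual = np.hypot(np.linalg.norm(x - px), np.linalg.norm(y - py))
```

Note how the resolvent of M decomposes coordinate-wise over the primal and dual blocks, while S is only ever multiplied: this is the "fully decomposed" behavior the abstract describes.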
Online Alternating Direction Method
 In ICML
, 2012
"... Online optimization has emerged as powerful tool in large scale optimization. In this paper, we introduce efficient online algorithms based on the alternating directions method (ADM). We introduce a new proof technique for ADM in the batch setting, which yields the O(1/T) convergence rate of ADM and ..."
Abstract

Cited by 39 (9 self)
Online optimization has emerged as a powerful tool in large-scale optimization. In this paper, we introduce efficient online algorithms based on the alternating direction method (ADM). We introduce a new proof technique for ADM in the batch setting, which yields the O(1/T) convergence rate of ADM and forms the basis of regret analysis in the online setting. We consider two scenarios in the online setting, based on whether or not the solution needs to lie in the feasible set. In both settings, we establish regret bounds for both the objective function and constraint violation, for general as well as strongly convex functions. Preliminary results are presented to illustrate the performance of the proposed algorithms.
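For reference, the batch ADM scheme whose O(1/T) ergodic rate the abstract mentions looks as follows on a lasso instance; the problem, penalty parameter rho, and iteration count are illustrative choices for a sketch, not the authors' online algorithm itself.

```python
import numpy as np

def admm_lasso(A, b, lam, rho=1.0, iters=200):
    """Batch alternating direction method for
    min 0.5*||Ax - b||^2 + lam*||z||_1  s.t.  x = z  (scaled dual form).
    The running average z_avg is the ergodic iterate to which O(1/T)
    convergence guarantees typically apply."""
    n = A.shape[1]
    AtA, Atb = A.T @ A, A.T @ b
    x, z, u = np.zeros(n), np.zeros(n), np.zeros(n)
    z_avg = np.zeros(n)
    Q = np.linalg.inv(AtA + rho * np.eye(n))   # cached solve for the x-update
    for t in range(iters):
        x = Q @ (Atb + rho * (z - u))          # quadratic subproblem
        z = np.sign(x + u) * np.maximum(np.abs(x + u) - lam / rho, 0.0)  # l1 prox
        u = u + x - z                          # scaled dual update
        z_avg += (z - z_avg) / (t + 1)         # running (ergodic) average
    return z, z_avg

rng = np.random.default_rng(5)
A = rng.standard_normal((30, 10))
b = A @ np.ones(10)                            # consistent toy data
z, z_avg = admm_lasso(A, b, lam=0.1)
obj = 0.5 * np.sum((A @ z - b) ** 2) + 0.1 * np.sum(np.abs(z))
obj0 = 0.5 * np.sum(b ** 2)                    # objective at z = 0
```

The online variants analyzed in the paper replace the fixed batch objective with a per-round loss inside the same alternating update pattern.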
Hankel matrix rank minimization with applications to system identification and realization
, 2011
"... In this paper, we introduce a flexible optimization framework for nuclear norm minimization of matrices with linear structure, including Hankel, Toeplitz and moment structures, and catalog applications from diverse fields under this framework. We discuss various firstorder methods for solving the ..."
Abstract

Cited by 39 (6 self)
In this paper, we introduce a flexible optimization framework for nuclear norm minimization of matrices with linear structure, including Hankel, Toeplitz and moment structures, and catalog applications from diverse fields under this framework. We discuss various first-order methods for solving the resulting optimization problem, including alternating direction methods, the proximal point algorithm and gradient projection methods. We perform computational experiments to compare these methods on the system identification and system realization problems. For the system identification problem, the gradient projection method (accelerated by Nesterov's extrapolation techniques) usually outperforms the other first-order methods in terms of CPU time on both real and simulated data, while for the system realization problem, the alternating direction method, as applied to a certain primal reformulation, usually outperforms the other first-order methods in terms of CPU time.
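The basic building block shared by first-order methods for nuclear norm minimization is singular value thresholding, the proximal operator of t·‖·‖*. Below is a short sketch applying it to a toy Hankel matrix; the signal, matrix size, and threshold are illustrative, not the paper's benchmark problems.

```python
import numpy as np

def svt(Z, t):
    """Singular value thresholding: prox of t*||.||_* , i.e. the minimizer of
    0.5*||X - Z||_F^2 + t*||X||_* . Soft-thresholds the singular values."""
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    return U @ np.diag(np.maximum(s - t, 0.0)) @ Vt

def hankel(h, rows):
    # Hankel matrix with constant anti-diagonals built from the signal h.
    cols = len(h) - rows + 1
    return np.array([h[i:i + cols] for i in range(rows)])

# A noisy sinusoid has a Hankel matrix that is approximately rank 2;
# thresholding removes the small noise singular values.
rng = np.random.default_rng(4)
h = np.sin(0.3 * np.arange(20)) + 0.05 * rng.standard_normal(20)
Z = hankel(h, 8)
X = svt(Z, 1.0)
rank_before = np.linalg.matrix_rank(Z, tol=1e-6)
rank_after = np.linalg.matrix_rank(X, tol=1e-6)
```

Each of the methods compared in the abstract (alternating direction, proximal point, gradient projection) invokes this prox, or a projection derived from it, once per iteration.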