Results 1-10 of 292
Proximal Splitting Methods in Signal Processing
Abstract

Cited by 264 (32 self)
The proximity operator of a convex function is a natural extension of the notion of a projection operator onto a convex set. This tool, which plays a central role in the analysis and the numerical solution of convex optimization problems, has recently been introduced in the arena of inverse problems and, especially, in signal processing, where it has become increasingly important. In this paper, we review the basic properties of proximity operators which are relevant to signal processing and present optimization methods based on these operators. These proximal splitting methods are shown to capture and extend several well-known algorithms in a unifying framework. Applications of proximal methods in signal recovery and synthesis are discussed.
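As a concrete illustration of the connection the abstract describes (a sketch of our own, not taken from the paper; the function names are ours): when f is the indicator of a convex set, prox_f is exactly the projection onto that set, and for f = λ‖·‖₁ the proximity operator is the soft-thresholding map.

```python
import numpy as np

def prox_l1(v, lam):
    """Proximity operator of f(x) = lam * ||x||_1: soft-thresholding."""
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

def proj_box(v, lo, hi):
    """Prox of the indicator of the box [lo, hi]^n, i.e. the projection onto it."""
    return np.clip(v, lo, hi)

v = np.array([3.0, -0.5, 1.2])
print(prox_l1(v, 1.0))            # shrinks each entry toward 0 by 1.0
print(proj_box(v, -1.0, 1.0))     # clips each entry into [-1, 1]
```

Both maps are single-valued and firmly nonexpansive, which is what makes them usable as building blocks in the splitting schemes the paper surveys.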
A MONOTONE + SKEW SPLITTING MODEL FOR COMPOSITE MONOTONE INCLUSIONS IN DUALITY
, 2011
Abstract

Cited by 40 (0 self)
The principle underlying this paper is the basic observation that the problem of simultaneously solving a large class of composite monotone inclusions and their duals can be reduced to that of finding a zero of the sum of a maximally monotone operator and a linear skew-adjoint operator. An algorithmic framework is developed for solving this generic problem in a Hilbert space setting. New primal-dual splitting algorithms are derived from this framework for inclusions involving composite monotone operators, and convergence results are established. These algorithms draw their simplicity and efficacy from the fact that they operate in a fully decomposed fashion, in the sense that the monotone operators and the linear transformations involved are activated separately at each iteration. Comparisons with existing methods are made and applications to composite variational problems are demonstrated.
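A minimal finite-dimensional check of the skew structure the abstract refers to (our own illustration, not from the paper): for any matrix L, the block operator S(x, y) = (Lᵀy, −Lx) arising in primal-dual pairings is linear and skew-adjoint, so ⟨Sz, z⟩ = 0 for every z, which makes S maximally monotone.

```python
import numpy as np

rng = np.random.default_rng(0)
L = rng.standard_normal((3, 4))          # an arbitrary linear operator

# S(x, y) = (L^T y, -L x), assembled as one block matrix.
S = np.block([[np.zeros((4, 4)), L.T],
              [-L, np.zeros((3, 3))]])

z = rng.standard_normal(7)
print(np.allclose(S.T, -S))   # skew-adjoint: S* = -S  (prints True)
print(z @ S @ z)              # <Sz, z> = 0 up to roundoff
```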
A parallel inertial proximal optimization method
 Pac. J. Optim
, 2012
Abstract

Cited by 36 (14 self)
The Douglas-Rachford algorithm is a popular iterative method for finding a zero of a sum of two maximally monotone operators defined on a Hilbert space. In this paper, we propose an extension of this algorithm including inertia parameters and develop parallel versions to deal with the case of a sum of an arbitrary number of maximally monotone operators. Based on this algorithm, parallel proximal algorithms are proposed to minimize over a linear subspace of a Hilbert space the sum of a finite number of proper, lower semicontinuous convex functions composed with linear operators. It is shown that particular cases of these methods are the simultaneous direction method of multipliers proposed by Setzer et al., the parallel proximal algorithm developed by Combettes and Pesquet, and a parallelized version of an algorithm proposed by Attouch and Soueycatt.
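To make the Douglas-Rachford splitting concrete (a toy instance of our own, not from the paper): applied to min ½‖x − b‖² + λ‖x‖₁, both proximity operators have closed forms, and the iterates converge to the soft-thresholded solution soft(b, λ).

```python
import numpy as np

def soft(v, t):
    """Soft-thresholding: prox of t * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def douglas_rachford(b, lam, gamma=1.0, iters=200):
    """Minimize 0.5*||x - b||^2 + lam*||x||_1 by Douglas-Rachford splitting."""
    z = np.zeros_like(b)
    for _ in range(iters):
        x = (z + gamma * b) / (1.0 + gamma)   # prox of gamma * 0.5*||. - b||^2
        y = soft(2.0 * x - z, gamma * lam)    # prox of gamma * lam*||.||_1 at the reflection
        z = z + y - x                         # governing-sequence update
    return x

b = np.array([2.0, 0.3, -1.5])
print(douglas_rachford(b, 0.5))   # ≈ soft(b, 0.5) = [1.5, 0.0, -1.0]
```

The two proximity operators are activated separately in each iteration, which is the "fully decomposed" behavior the parallel extensions in the paper build on.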
Nonconvex notions of regularity and convergence of fundamental algorithms for feasibility problems
 SIAM Journal on Optimization
, 2013
Abstract

Cited by 27 (4 self)
We consider projection algorithms for solving (nonconvex) feasibility problems in Euclidean spaces. Of special interest are the method of alternating projections (AP) and the Douglas-Rachford algorithm (DR). In the case of convex feasibility, firm nonexpansiveness of projection mappings is a global property that yields global convergence of AP and, for consistent problems, DR. A notion of local subfirm nonexpansiveness with respect to the intersection is introduced for consistent feasibility problems. This, together with a coercivity condition that relates to the regularity of the collection of sets at points in the intersection, yields local linear convergence of AP for a wide class of nonconvex problems, and even local linear convergence of nonconvex instances of the DR algorithm.
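A two-set convex instance (our own minimal example, not from the paper) shows the AP iteration the authors analyze: alternate exact projections onto a hyperplane and the nonnegative orthant; the iterates converge linearly to a point in the intersection.

```python
import numpy as np

def proj_hyperplane(v, a, c):
    """Project v onto the hyperplane {x : <a, x> = c}."""
    return v - (a @ v - c) / (a @ a) * a

def proj_orthant(v):
    """Project v onto the nonnegative orthant."""
    return np.maximum(v, 0.0)

a, c = np.array([1.0, 1.0]), 1.0
x = np.array([3.0, -2.0])
for _ in range(100):                       # one AP cycle = one projection onto each set
    x = proj_orthant(proj_hyperplane(x, a, c))
print(x)                                   # lies in both sets; here the limit is [1, 0]
```

For this pair of convex sets the convergence is global; the paper's contribution is identifying local conditions under which the same linear rate survives without convexity.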
On variable density compressive sampling
 Signal Processing Letters, IEEE
SMOOTHING AND FIRST ORDER METHODS: A UNIFIED FRAMEWORK
, 2012
Abstract

Cited by 21 (2 self)
We propose a unifying framework that combines smoothing approximation with fast first order algorithms for solving nonsmooth convex minimization problems. We prove that, independently of the structure of the convex nonsmooth function involved and of the given fast first order iterative scheme, it is always possible to improve the complexity rate and reach an O(1/ε) efficiency estimate by solving an adequately smoothed approximation counterpart. Our approach relies on the combination of the notion of smoothable functions that we introduce with a natural extension of the Moreau infimal convolution technique, along with its connection to the smoothing mechanism via asymptotic functions. This allows for clarification and unification of several issues on the design, analysis, and potential applications of smoothing methods when combined with fast first order algorithms.
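The smoothing mechanism via infimal convolution can be illustrated on f = |·| (our own standard example, not from the paper): its Moreau envelope with parameter μ is the Huber function, which is differentiable with a (1/μ)-Lipschitz gradient and deviates from |x| by at most μ/2 uniformly.

```python
import numpy as np

def huber(x, mu):
    """Moreau envelope of |.|: min_y |y| + (1/(2*mu)) * (y - x)**2."""
    return np.where(np.abs(x) <= mu, x**2 / (2.0 * mu), np.abs(x) - mu / 2.0)

mu = 0.1
grid = np.linspace(-2.0, 2.0, 1001)
gap = np.abs(np.abs(grid) - huber(grid, mu))
print(gap.max())   # uniform smoothing error: at most mu/2 = 0.05, attained at |x| = mu
```

Shrinking μ tightens the approximation but worsens the gradient Lipschitz constant 1/μ; balancing the two is exactly the trade-off the paper's complexity analysis optimizes.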
iPiano: Inertial Proximal Algorithm for Nonconvex Optimization
, 2014
Abstract

Cited by 21 (6 self)
In this paper we study an algorithm for solving a minimization problem composed of a differentiable (possibly nonconvex) and a convex (possibly nondifferentiable) function. The algorithm iPiano combines forward-backward splitting with an inertial force. It can be seen as a nonsmooth split version of the heavy-ball method of Polyak. A rigorous analysis of the algorithm for the proposed class of problems yields global convergence of the function values and the arguments. This makes the algorithm robust for usage on nonconvex problems. The convergence result is obtained based on the Kurdyka-Łojasiewicz inequality. This is a very weak restriction, which was used to prove convergence for several other gradient methods. First, an abstract convergence theorem for a generic algorithm is proved, and then iPiano is shown to satisfy the requirements of this theorem. Furthermore, a convergence rate is established for the general problem class. We demonstrate iPiano on computer vision problems: image denoising with learned priors and diffusion-based image compression.
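A sketch of the inertial forward-backward step of iPiano type (our own minimal instance, not from the paper: f(x) = ½‖x − b‖² so L = 1, g = λ‖·‖₁ so the prox is soft-thresholding, and α, β chosen to respect the usual step-size restriction α < 2(1 − β)/L):

```python
import numpy as np

def soft(v, t):
    """Soft-thresholding: prox of t * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ipiano(b, lam, alpha=0.8, beta=0.4, iters=300):
    """Inertial forward-backward iteration:
    x+ = prox_{alpha*g}(x - alpha*grad_f(x) + beta*(x - x_prev))."""
    x = x_prev = np.zeros_like(b)
    for _ in range(iters):
        grad = x - b                                   # gradient of 0.5*||x - b||^2
        x, x_prev = soft(x - alpha * grad + beta * (x - x_prev), alpha * lam), x
    return x

b = np.array([2.0, 0.3, -1.5])
print(ipiano(b, 0.5))   # ≈ soft(b, 0.5) = [1.5, 0.0, -1.0]
```

On this convex instance the fixed point is the global minimizer; the paper's contribution is proving convergence of this scheme when f is merely differentiable and possibly nonconvex.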