Results 1 - 2 of 2
Practical Aspects of the Moreau-Yosida Regularization I: Theoretical Properties, 1994
"... When computing the infimal convolution of a convex function f with the squared norm, one obtains the socalled MoreauYosida regularization of f . Among other things, this function has a Lipschitzian gradient. We investigate some more of its properties, relevant for optimization. Our main result co ..."
Abstract

Cited by 49 (2 self)
 Add to MetaCart
When computing the infimal convolution of a convex function f with the squared norm, one obtains the so-called Moreau-Yosida regularization of f. Among other things, this function has a Lipschitzian gradient. We investigate some more of its properties relevant for optimization. Our main result concerns second-order differentiability and is as follows. Under assumptions that are quite reasonable in optimization, the Moreau-Yosida regularization is twice differentiable if and only if f is twice differentiable as well. In the course of our development, we give some results of general interest in convex analysis. In particular, we establish a primal-dual relationship between the remainder terms in the first-order development of a convex function and its conjugate.
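The regularization described in this abstract can be illustrated numerically. The following is a minimal sketch (function names are illustrative, not from the paper) for f(x) = |x|, whose Moreau-Yosida envelope is the Huber function and whose proximal map is soft-thresholding; it shows the Lipschitzian-gradient property mentioned above.

```python
import numpy as np

def prox_abs(x, lam):
    # Proximal map of f(x) = |x|: soft-thresholding.
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def moreau_envelope_abs(x, lam):
    # Moreau-Yosida envelope: min_y |y| + (1/(2*lam)) * (y - x)^2,
    # the minimum being attained at y = prox_abs(x, lam).
    y = prox_abs(x, lam)
    return np.abs(y) + (y - x) ** 2 / (2.0 * lam)

def envelope_gradient(x, lam):
    # The envelope's gradient is (x - prox(x)) / lam, which is
    # Lipschitz continuous with constant 1/lam even though |x|
    # itself is not differentiable at 0.
    return (x - prox_abs(x, lam)) / lam

lam = 1.0
# For |x| <= lam the envelope is quadratic, x^2 / (2*lam);
# for |x| > lam it is linear, |x| - lam/2: the Huber function.
print(moreau_envelope_abs(0.5, lam))  # 0.125
print(moreau_envelope_abs(3.0, lam))  # 2.5
print(envelope_gradient(3.0, lam))    # 1.0 (slope saturates at 1)
```

Note that the envelope agrees with f in its minimizers while smoothing away the kink at the origin, which is what makes it useful as an optimization surrogate.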
A Hybrid Approximate Extragradient-Proximal Point Algorithm Using the Enlargement of a Maximal Monotone Operator, 1999
"... We propose a modification of the classical extragradient and proximal point algorithms for finding a zero of a maximal monotone operator in a Hilbert space. At each iteration of the method, an approximate extragradienttype step is performed using information obtained from an approximate solution of ..."
Abstract

Cited by 23 (14 self)
 Add to MetaCart
We propose a modification of the classical extragradient and proximal point algorithms for finding a zero of a maximal monotone operator in a Hilbert space. At each iteration of the method, an approximate extragradient-type step is performed using information obtained from an approximate solution of a proximal point subproblem. The algorithm is of a hybrid type, as it combines steps of the extragradient and proximal methods. Furthermore, the algorithm uses elements in the enlargement (proposed by Burachik, Iusem and Svaiter [2]) of the operator defining the problem. One of the important features of our approach is that it allows significant relaxation of the tolerance requirements imposed on the solution of proximal point subproblems. This yields a more practical proximal-algorithm-based framework. Weak global convergence and a local linear rate of convergence are established under suitable assumptions. It is further demonstrated that the modified forward-backward splitting algorithm of Tseng [35]...
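The two-phase structure described in this abstract can be sketched as follows. This is an illustrative simplification, not the paper's exact method: it substitutes an exact resolvent for the approximate subproblem solve and omits the relaxed tolerance conditions and the operator enlargement; the operator and function names are assumptions for the example.

```python
import numpy as np

def hybrid_extragradient_proximal(T, prox_step, x0, c=1.0, iters=200):
    # Each iteration has two phases:
    #   1) proximal subproblem: find y with 0 in T(y) + (1/c)(y - x)
    #      (here solved exactly; the paper allows approximate solutions)
    #   2) extragradient-type update: x <- x - c * T(y)
    x = x0
    for _ in range(iters):
        y = prox_step(x, c)   # resolvent of c*T evaluated at x
        x = x - c * T(y)      # extragradient step using T at the trial point
    return x

# Toy maximal monotone operator T(x) = A x; monotonicity only requires
# the symmetric part of A to be positive semidefinite, so A may be
# non-symmetric (a case where plain gradient-style methods can fail).
A = np.array([[2.0, 1.0],
              [-1.0, 2.0]])
T = lambda x: A @ x
I = np.eye(2)
# Exact resolvent (I + c*A)^{-1} x of the linear operator.
prox_step = lambda x, c: np.linalg.solve(I + c * A, x)

x_star = hybrid_extragradient_proximal(T, prox_step,
                                       np.array([5.0, -3.0]), c=0.5)
print(np.linalg.norm(T(x_star)))  # near 0: x_star approximates a zero of T
```

With an exact resolvent the two phases collapse to the classical proximal point iteration; the practical gain highlighted in the abstract comes precisely from tolerating inexact subproblem solutions while keeping convergence.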