Results 1–10 of 11
On Projection Algorithms for Solving Convex Feasibility Problems
, 1996
Cited by 142 (24 self)
Due to their extraordinary utility and broad applicability in many areas of classical mathematics and modern physical sciences (most notably, computerized tomography), algorithms for solving convex feasibility problems continue to receive great attention. To unify, generalize, and review some of these algorithms, a very broad and flexible framework is investigated. Several crucial new concepts which allow a systematic discussion of questions on behaviour in general Hilbert spaces and on the quality of convergence are brought out. Numerous examples are given. 1991 Mathematics Subject Classification. Primary 47H09, 49M45, 65-02, 65J05, 90C25; Secondary 26B25, 41A65, 46C99, 46N10, 47N10, 52A05, 52A41, 65F10, 65K05, 90C90, 92C55. Key words and phrases. Angle between two subspaces, averaged mapping, Cimmino's method, computerized tomography, convex feasibility problem, convex function, convex inequalities, convex programming, convex set, Fejér monotone sequence, firmly nonexpansive mapping, H...
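The projection idea this framework unifies can be sketched in a few lines: to find a point in the intersection of convex sets, project onto the sets cyclically (the classical method of alternating projections). A minimal NumPy illustration with a halfspace and a Euclidean ball; the specific sets and starting point are illustrative, not from the paper:

```python
import numpy as np

def project_halfspace(x, a, b):
    """Project x onto the halfspace {y : a @ y <= b}."""
    viol = a @ x - b
    return x if viol <= 0 else x - (viol / (a @ a)) * a

def project_ball(x, c, r):
    """Project x onto the Euclidean ball with center c and radius r."""
    d = x - c
    n = np.linalg.norm(d)
    return x if n <= r else c + (r / n) * d

def pocs(x0, a, b, c, r, iters=200):
    """Cyclic (alternating) projections onto the two sets."""
    x = x0
    for _ in range(iters):
        x = project_ball(project_halfspace(x, a, b), c, r)
    return x

a, b = np.array([1.0, 1.0]), 1.0          # halfspace x1 + x2 <= 1
c, r = np.zeros(2), 2.0                   # ball of radius 2 at the origin
x = pocs(np.array([5.0, 5.0]), a, b, c, r)
```

For consistent (nonempty-intersection) problems the iterates converge to a point lying in every set, which is the convex feasibility problem in its simplest form.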
Bundle-Based Relaxation Methods For Multicommodity Capacitated Fixed Charge Network Design
, 1999
Cited by 44 (25 self)
To efficiently derive bounds for large-scale instances of the capacitated fixed-charge network design problem, Lagrangian relaxations appear promising. This paper presents the results of comprehensive experiments aimed at calibrating and comparing bundle and subgradient methods applied to the optimization of Lagrangian duals arising from two Lagrangian relaxations. This study substantiates the fact that bundle methods appear superior to subgradient approaches because they converge faster and are more robust relative to different relaxations, problem characteristics, and selection of the initial parameter values. It also demonstrates that effective lower bounds may be computed efficiently for large-scale instances of the capacitated fixed-charge network design problem. Indeed, in a fraction of the time required by a standard simplex approach to solve the linear programming relaxation, the methods we present attain very high quality solutions.
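For context, the subgradient baseline that bundle methods are compared against is plain projected ascent on the Lagrangian dual. A toy sketch: minimize x1 + 2*x2 over the box [0,1]^2 subject to the relaxed constraint x1 + x2 >= 1; the dual is maximized (value 1) by any multiplier in [1, 2]. The problem data are illustrative, not a network design instance:

```python
import numpy as np

c = np.array([1.0, 2.0])                 # objective coefficients
A = np.array([[1.0, 1.0]])               # relaxed constraint A @ x >= b
b = np.array([1.0])

mu = np.zeros(1)                         # Lagrange multiplier, mu >= 0
best = -np.inf                           # best dual (lower) bound found
for k in range(1, 501):
    red = c - A.T @ mu                   # reduced costs of the Lagrangian
    x = (red < 0).astype(float)          # inner minimization over [0,1]^2
    theta = red @ x + b @ mu             # dual function value at mu
    best = max(best, theta)
    g = b - A @ x                        # subgradient of the dual
    mu = np.maximum(0.0, mu + (1.0 / k) * g)  # projected ascent step
```

Bundle methods replace the single-subgradient step with a model built from all collected subgradients plus a stabilizing term, which is the source of the robustness discussed above.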
An Adaptive Level Set Method for Nondifferentiable Constrained Image Recovery
 IEEE TRANS. IMAGE PROCESSING
, 2002
Cited by 19 (4 self)
The formulation of a wide variety of image recovery problems leads to the minimization of a convex objective over a convex set representing the constraints derived from a priori knowledge and consistency with the observed signals. In recent years, nondifferentiable objectives have become popular due in part to their ability to capture certain features such as sharp edges. They also arise naturally in minimax inconsistent set theoretic recovery problems. At the same time, the issue of developing reliable numerical algorithms to solve such convex programs in the context of image recovery applications has received little attention. In this paper, we address this issue and propose an adaptive level set method for nondifferentiable constrained image recovery. The asymptotic properties of the method are analyzed and its implementation is discussed. Numerical experiments illustrate applications to total variation and minimax set theoretic image restoration and denoising problems.
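A one-dimensional analogue of such a nondifferentiable recovery problem — a least-squares data fit plus a total-variation term, minimized over a box constraint by a projected subgradient iteration — can be sketched as follows. The signal, weight, and step sizes are illustrative, and the paper's adaptive level set method is considerably more sophisticated than this plain iteration:

```python
import numpy as np

y = np.array([0.1, 0.1, 0.9, 0.9, 0.9, 0.1])   # observed 1-D signal
lam = 0.05                                      # total-variation weight

u = y.copy()
for k in range(1, 1001):
    d = np.sign(np.diff(u))                     # subgradient of sum |u[i+1] - u[i]|
    tv_sub = np.concatenate(([0.0], d)) - np.concatenate((d, [0.0]))
    g = (u - y) + lam * tv_sub                  # subgradient of the full objective
    u = np.clip(u - (0.5 / k) * g, 0.0, 1.0)    # projection onto the box [0, 1]
```

The objective 0.5*||u - y||^2 + lam*TV(u) is convex but nondifferentiable wherever a difference u[i+1] - u[i] vanishes, which is exactly the regime the paper's method is designed for.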
Convergence of a Simple Subgradient Level Method
 Math. Programming
, 1998
Cited by 6 (0 self)
We study the subgradient projection method for convex optimization with Brännlund's level control for estimating the optimal value. We establish global convergence in objective values without additional assumptions employed in the literature. Key words. Nondifferentiable optimization, subgradient optimization. 1 Introduction. We consider a method for the minimization problem f* = inf_S f under the following assumptions: S is a nonempty closed convex set in R^n; f : R^n -> R is a convex function; for each x in S we can compute f(x) and a subgradient g_f(x) in ∂f(x) of f at x; and for each x in R^n we can find P_S x = argmin_{y in S} |x - y|, its orthogonal projection on S, where |·| is the Euclidean norm. The optimal set Argmin_S f may be empty. Given the kth iterate x_k in S and a target level f_lev^k that estimates f*, we may use H_k = { x : f(x_k) + <g_k, x - x_k> <= f_lev^k } with g_k = g_f(x_k) (1.1) to approximate t...
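Under the assumptions above, one iteration steps from x_k toward the level set suggested by the target f_lev^k and projects back onto S. A minimal sketch with a fixed target level (the paper's contribution is the adaptive level-control rule, which is not reproduced here), on an illustrative ℓ1 problem over a halfspace:

```python
import numpy as np

def subgradient_level(f, subgrad, proj_S, x0, f_lev, iters=500):
    """Polyak-type step toward a fixed target level f_lev,
    followed by exact projection onto S (a simplified sketch)."""
    x = proj_S(np.asarray(x0, dtype=float))
    best = f(x)
    for _ in range(iters):
        g = subgrad(x)
        gap = f(x) - f_lev
        if gap <= 0:                     # target level reached
            break
        t = gap / (g @ g)                # step length from the level model
        x = proj_S(x - t * g)
        best = min(best, f(x))
    return x, best

# Illustrative problem: minimize |x1| + |x2| over S = {x : x1 + x2 >= 1};
# the optimal value is 1, which we use as the target level.
a, b = np.ones(2), 1.0
proj = lambda x: x if a @ x >= b else x + ((b - a @ x) / (a @ a)) * a
f = lambda x: np.abs(x).sum()
sg = lambda x: np.sign(x)
x, best = subgradient_level(f, sg, proj, [3.0, 0.0], f_lev=1.0)
```

The halfspace H_k in (1.1) is precisely the set this step targets: x_{k+1} is the projection of x_k onto H_k, then onto S.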
Fast and Low Complexity Blind Equalization via Subgradient Projections
, 2005
Cited by 1 (0 self)
We propose a novel blind equalization method based on subgradient search over a convex cost surface. This is an alternative to existing iterative blind equalization approaches such as the Constant Modulus Algorithm (CMA), which often suffer from convergence problems caused by their nonconvex cost functions. The proposed method is an iterative algorithm, called the SubGradient-based Blind Algorithm (SGBA), for both real and complex constellations, with a very simple update rule. It is based on the minimization of the l∞-norm of the equalizer output under a linear constraint on the equalizer coefficients, using subgradient iterations. The algorithm has a nice convergence behavior attributed to the convex l∞ cost surface as well as the step size selection rules associated with the subgradient search. We illustrate the performance of the algorithm using examples with both complex and real constellations, where we show that the proposed algorithm's convergence is less sensitive to initial point selection, and that fast convergence can be achieved with a judicious selection of step sizes. Furthermore, the amount of data required for training the equalizer is significantly lower than in most existing schemes.
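The core iteration — a subgradient step on the l∞ objective followed by projection onto the linear constraint — can be sketched generically as follows. The random data matrix stands in for the actual equalizer convolution structure, and the constraint vector and step-size rule are illustrative, not those of the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 4))         # stand-in for equalizer output samples
c = np.array([1.0, 0.0, 0.0, 0.0])       # linear constraint: c @ w = 1

f = lambda w: np.max(np.abs(A @ w))      # convex l-infinity cost

def subgrad(w):
    y = A @ w
    j = np.argmax(np.abs(y))             # sample attaining the maximum
    return np.sign(y[j]) * A[j]          # a subgradient of f at w

def proj(w):                             # projection onto {w : c @ w = 1}
    return w + ((1.0 - c @ w) / (c @ c)) * c

w = proj(rng.standard_normal(4))
best = f(w)
for k in range(1, 2001):
    w = proj(w - (0.1 / k) * subgrad(w))
    best = min(best, f(w))
```

Because the cost surface is convex, any accumulation point of such an iteration (with suitable steps) is a constrained minimizer, unlike the local minima that plague nonconvex CMA-style costs.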
An Infeasible-Point Subgradient Method Using Adaptive Approximate Projections
Cited by 1 (1 self)
Abstract. We propose a new subgradient method for the minimization of nonsmooth convex functions over a convex set. To speed up computations we use adaptive approximate projections, which only require moving within a certain distance of the exact projections (a distance that decreases in the course of the algorithm). In particular, the iterates in our method can be infeasible throughout the whole procedure. Nevertheless, we provide conditions which ensure convergence to an optimal feasible point under suitable assumptions. One convergence result deals with step size sequences that are fixed a priori. Two other results handle dynamic Polyak-type step sizes depending on a lower or upper estimate of the optimal objective function value, respectively. Additionally, we briefly sketch two applications: optimization with convex chance constraints, and finding the minimum ℓ1-norm solution to an underdetermined linear system, an important problem in Compressed Sensing.
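The adaptive-accuracy idea can be mimicked in a toy setting: run a Polyak-step subgradient method, but let each projection be off by ε_k with ε_k → 0, so the iterates may be slightly infeasible throughout. All problem data below are illustrative and the inexactness is simulated by deliberately perturbing an exact projection:

```python
import numpy as np

def proj_ball(x):                         # exact projection onto the unit ball
    n = np.linalg.norm(x)
    return x if n <= 1.0 else x / n

def approx_proj(x, eps):
    """Any point within eps of the exact projection is acceptable;
    here we perturb the exact projection to mimic that inexactness."""
    return proj_ball(x) + eps * np.array([0.0, 1.0])

target = np.array([2.0, 0.0])
f = lambda x: np.linalg.norm(x - target)  # minimized over the ball at (1, 0), f* = 1

x = np.zeros(2)
for k in range(1, 201):
    g = (x - target) / f(x)               # subgradient of the distance function
    t = (f(x) - 1.0) / (g @ g)            # Polyak step using the known f* = 1
    x = approx_proj(x - t * g, 0.1 / k**2)  # accuracy tightens: eps_k -> 0
```

Because the projection errors are summable, the slightly infeasible iterates still drift to the optimal feasible point, which is the phenomenon the paper's convergence results formalize.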
Dimitri Bertsekas (dimitrib@mit.edu)
 Stochastic Optimization: Algorithms and Applications
, 2000
We consider a class of subgradient methods for minimizing a convex function that consists of the sum of a large number of component functions. This type of minimization arises in a dual context from Lagrangian relaxation of the coupling constraints of large scale separable problems. The idea is to perform the subgradient iteration incrementally, by sequentially taking steps along the subgradients of the component functions, with intermediate adjustment of the variables after processing each component function. This incremental approach has been very successful in solving large differentiable least squares problems, such as those arising in the training of neural networks, and it has resulted in a much better practical rate of convergence than the steepest descent method. In this paper, we present convergence results and estimates of the convergence rate of a number of variants of incremental subgradient methods, including some that use randomization. The convergence rate estimates are c...
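The incremental idea — one step per component function rather than per full sum — can be sketched on a toy sum of absolute values, whose minimizer is the median of the data points. The data and step-size rule are illustrative:

```python
import numpy as np

def incremental_subgradient(a, x0=0.0, epochs=200):
    """Minimize sum_i |x - a_i| by taking one subgradient step per
    component, cycling through the components with a diminishing step."""
    x = x0
    for ep in range(1, epochs + 1):
        t = 1.0 / ep                      # diminishing step size
        for ai in a:
            g = np.sign(x - ai)           # subgradient of |x - a_i|
            x -= t * g                    # adjust after each component
    return x

a = [1.0, 2.0, 10.0]
x = incremental_subgradient(a)            # sum |x - a_i| is minimized at the median, 2
```

A full (non-incremental) subgradient step would instead use the summed subgradient sum_i sign(x - a_i) once per sweep; the incremental variant typically makes faster early progress, which is the practical advantage discussed above.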
Subgradient Optimization in Nonsmooth Optimization (including the Soviet Revolution)
 Documenta Math.
Continuous Min-Max Programs with Semi-Infinite Constraints
, 2011
Abstract. In this paper, we propose an algorithm for solving a kind of nonlinear program in which the objective is the maximal function of a family of continuous functions and the feasible domain is explicitly made up of infinitely many constraints. Our algorithm combines entropic regularization and the cutting plane method (of Remez type) to deal with the nondifferentiability of the maximal function and the infinitely many constraints, respectively. A finite relaxed version, which terminates within a finite number of iterations to give an approximate solution, is proposed to handle the computational issues, including the blow-up problem in the entropic regularization and the global optimization subproblems in the cutting plane method. To justify the efficiency of the inexact algorithm, we also analyze the theoretical error bound and conduct numerical experiments.
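The entropic regularization referred to above replaces max_i v_i with the smooth surrogate (1/p) * log(sum_i exp(p * v_i)), which overestimates the true maximum by at most log(m)/p for m terms; large p sharpens the approximation but causes exp overflow (the blow-up problem), handled below by the standard max-shift. A minimal numeric illustration with made-up values:

```python
import numpy as np

def entropic_max(vals, p):
    """Smooth surrogate of max(vals): (1/p) * log(sum exp(p * v_i)).
    Overestimates the true max by at most log(m)/p."""
    v = np.asarray(vals, dtype=float)
    m = v.max()                           # shift to avoid overflow in exp
    return m + np.log(np.exp(p * (v - m)).sum()) / p

vals = [0.3, 1.0, -2.0]
approx = entropic_max(vals, p=50.0)       # close to max(vals) = 1.0 from above
```

As p grows the surrogate converges to the max uniformly, which is the error bound the finite relaxed version of the algorithm exploits.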