Results 1 – 7 of 7
On the Effectiveness of Projection Methods for Convex Feasibility Problems with Linear Inequality Constraints
Cited by 33 (17 self)
The effectiveness of projection methods for solving systems of linear inequalities is investigated. It is shown that they often have a computational advantage over alternatives that have been proposed for solving the same problem, and that this makes them successful in many real-world applications. This is supported by experimental evidence provided in this paper on problems of various sizes (up to tens of thousands of unknowns satisfying up to hundreds of thousands of constraints) and by a discussion of the demonstrated efficacy of projection methods in numerous scientific publications and commercial patents (dealing with problems that can have over a billion unknowns and a similar number of constraints).
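As a concrete illustration (not code from the paper), the following minimal sketch implements the classical cyclic projection method for a system of linear inequalities Ax ≤ b: each step projects the current iterate onto one violated half-space. The function names and the toy three-constraint system are my own.

```python
import numpy as np

def project_halfspace(x, a, b):
    """Project x onto the half-space {y : a.y <= b}."""
    viol = a @ x - b
    if viol <= 0:
        return x  # already satisfies this constraint
    return x - (viol / (a @ a)) * a

def cyclic_projections(A, b, x0, sweeps=200):
    """Sweep cyclically over the rows of A x <= b, projecting
    onto each violated half-space in turn."""
    x = x0.astype(float)
    for _ in range(sweeps):
        for a_i, b_i in zip(A, b):
            x = project_halfspace(x, a_i, b_i)
    return x

# Toy system: x >= 0, y >= 0, x + y <= 1, written as A x <= b.
A = np.array([[-1.0, 0.0], [0.0, -1.0], [1.0, 1.0]])
b = np.array([0.0, 0.0, 1.0])
x = cyclic_projections(A, b, np.array([3.0, -2.0]))
```

Each projection is a rank-one update costing O(n), which is the source of the computational advantage the abstract describes at very large scale.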
Projection Methods: Swiss Army Knives for Solving Feasibility and Best Approximation Problems with Halfspaces
, 2013
Perturbation-resilient block-iterative projection methods with application to image reconstruction from projections
, 2009
Cited by 19 (8 self)
A block-iterative projection algorithm is proposed for solving the consistent convex feasibility problem in a finite-dimensional Euclidean space that is resilient to bounded and summable perturbations (in the sense that convergence to a feasible point is retained even if such perturbations are introduced in each iterative step of the algorithm). This resilience can be used to steer the iterative process towards a feasible point that is superior in the sense of having a small value of some functional on the points of the Euclidean space. The potential usefulness of this is illustrated in image reconstruction from projections, using both total variation and negative entropy as the functional.
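The perturbation-resilience described above is the basis of the superiorization idea. The sketch below is my own simplification (not the paper's block-iterative algorithm): it interleaves half-space projections with bounded, summable perturbation steps that nudge the iterate toward a smaller value of a simple functional, here φ(x) = ½‖x‖², standing in for the total-variation and entropy functionals the abstract mentions.

```python
import numpy as np

def project_halfspace(x, a, b):
    viol = a @ x - b
    return x if viol <= 0 else x - (viol / (a @ a)) * a

def superiorized_projections(A, b, x0, sweeps=300, beta=1.0, gamma=0.9):
    """Feasibility-seeking projections interleaved with bounded,
    summable perturbations: the step sizes beta * gamma^k have a
    finite sum, so convergence to a feasible point is retained."""
    x = x0.astype(float)
    step = beta
    for _ in range(sweeps):
        g = x  # gradient of phi(x) = 0.5 * ||x||^2, the toy functional
        n = np.linalg.norm(g)
        if n > 0:
            x = x - step * g / n  # perturbation toward smaller phi
        step *= gamma
        for a_i, b_i in zip(A, b):  # one sweep of projections
            x = project_halfspace(x, a_i, b_i)
    return x

# Toy system: x >= 0, y >= 0, x + y <= 1.
A = np.array([[-1.0, 0.0], [0.0, -1.0], [1.0, 1.0]])
b = np.array([0.0, 0.0, 1.0])
x = superiorized_projections(A, b, np.array([3.0, -2.0]))
```

Because the feasible set here contains the minimizer of φ (the origin), the perturbed iteration settles at a feasible point with a much smaller functional value than plain projections would find.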
Projection Methods: Swiss Army Knives for Solving Feasibility and Best Approximation Problems with Halfspaces (Contemporary Mathematics)
We model a problem motivated by road design as a feasibility problem. Projections onto the constraint sets are obtained, and projection methods for solving the feasibility problem are studied. We present results of numerical experiments which demonstrate the efficacy of projection methods even for challenging nonconvex problems.
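To see why projection methods remain usable on nonconvex sets, consider alternating projections between a line and a circle (a nonconvex set). This toy example is illustrative only and is not taken from the paper.

```python
import numpy as np

def proj_circle(x, c, r):
    """Project onto the circle {y : ||y - c|| = r}, a nonconvex set."""
    d = x - c
    n = np.linalg.norm(d)
    if n == 0.0:
        return c + np.array([r, 0.0])  # center: any circle point is nearest
    return c + (r / n) * d

def proj_line(x, p, u):
    """Project onto the line {p + t * u}, with u a unit vector."""
    return p + ((x - p) @ u) * u

# Find a point on both the unit circle and the horizontal line y = 0.5.
c, r = np.array([0.0, 0.0]), 1.0
p, u = np.array([0.0, 0.5]), np.array([1.0, 0.0])
x = np.array([2.0, 2.0])
for _ in range(100):
    x = proj_circle(proj_line(x, p, u), c, r)
```

Despite the circle's nonconvexity, the iteration converges (locally) to an intersection point, here (√0.75, 0.5), which mirrors the behavior the abstract reports on harder nonconvex problems.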
Anderson Acceleration of the Alternating Projections Method for Computing the Nearest Correlation Matrix
, 2015
In a wide range of applications it is required to compute the nearest correlation matrix in the Frobenius norm to a given symmetric but indefinite matrix. Of the available methods with guaranteed convergence to the unique solution of this problem, the easiest to implement, and perhaps the most widely used, is the alternating projections method. However, the rate of convergence of this method is at best linear, and it can require a large number of iterations to converge to within a given tolerance. We show that Anderson acceleration, a technique for accelerating the convergence of fixed-point iterations, can be applied to the alternating projections method and that in practice it brings a significant reduction in both the number of iterations and the computation time. We also show that Anderson acceleration remains effective, and indeed can provide even greater improvements, when it is applied to the variants of the nearest correlation matrix problem in which specified elements are fixed or a lower bound is imposed on the smallest eigenvalue. Alternating projections is a general method for finding a point in the intersection of several sets, and ours appears to be the first demonstration that this class of methods can benefit from Anderson acceleration.
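A minimal sketch of plain alternating projections for this problem, assuming only NumPy: project onto the positive semidefinite cone (clip negative eigenvalues) and onto the unit-diagonal set. Note this bare version only finds a point in the intersection and illustrates the slow linear convergence the abstract refers to; the published algorithm adds a Dykstra-type correction to reach the *nearest* correlation matrix, and Anderson acceleration is not shown.

```python
import numpy as np

def proj_psd(X):
    """Project a symmetric matrix onto the PSD cone
    by clipping negative eigenvalues."""
    w, V = np.linalg.eigh((X + X.T) / 2.0)
    return (V * np.maximum(w, 0.0)) @ V.T

def proj_unit_diag(X):
    """Project onto the affine set of matrices with unit diagonal."""
    Y = X.copy()
    np.fill_diagonal(Y, 1.0)
    return Y

def alternating_projections(G, iters=1000):
    """Plain alternating projections between the two sets; note the
    large iteration count needed, since convergence is only linear."""
    X = G.copy()
    for _ in range(iters):
        X = proj_psd(proj_unit_diag(X))
    return X

# A unit-diagonal but indefinite matrix (smallest eigenvalue 1 - sqrt(2)).
G = np.array([[1.0, 1.0, 0.0],
              [1.0, 1.0, 1.0],
              [0.0, 1.0, 1.0]])
X = alternating_projections(G)
```

The dominant cost per iteration is one eigendecomposition, which is why reducing the iteration count via acceleration pays off directly in computation time.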
Distance Majorization and Its Applications
The problem of minimizing a continuously differentiable convex function over an intersection of closed convex sets is ubiquitous in applied mathematics. It is particularly interesting when it is easy to project onto each separate set, but nontrivial to project onto their intersection. Algorithms based on Newton's method, such as the interior point method, are viable for small- to medium-scale problems. However, modern applications in statistics, engineering, and machine learning pose problems with potentially tens of thousands of parameters or more. We revisit this convex programming problem and propose an algorithm that scales well with dimensionality. Our proposal is an instance of a sequential unconstrained minimization technique and revolves around three ideas: the majorization-minimization (MM) principle, the classical penalty method for constrained optimization, and quasi-Newton acceleration of fixed-point algorithms. The performance of our distance majorization algorithms is illustrated in several applications. Keywords: constrained optimization · majorization-minimization (MM) · sequential unconstrained minimization · projection
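A hedged sketch of the penalty/MM idea for the special case f(x) = ½‖x − z‖² with half-space constraints (the function names and the toy problem are my own, and the paper's quasi-Newton acceleration is omitted): the squared distance to each set C_i is majorized by ‖x − P_i(x_k)‖², which makes each MM step a closed-form average, and the penalty ρ is increased between outer rounds so the penalized minimizers approach the constrained solution.

```python
import numpy as np

def proj_halfspace(x, a, b):
    viol = a @ x - b
    return x if viol <= 0 else x - (viol / (a @ a)) * a

def distance_majorization(z, A, b, rho=1.0, outer=30, inner=50):
    """Minimize 0.5*||x - z||^2 subject to A x <= b via the penalty
    objective 0.5*||x - z||^2 + (rho/2) * sum_i dist(x, C_i)^2.
    Each MM step majorizes dist(x, C_i)^2 by ||x - P_i(x_k)||^2,
    giving the closed-form update below; rho is then doubled."""
    x = z.astype(float).copy()
    m = len(b)
    for _ in range(outer):
        for _ in range(inner):
            # MM update: minimizer of the quadratic surrogate.
            P = np.array([proj_halfspace(x, A[i], b[i]) for i in range(m)])
            x = (z + rho * P.sum(axis=0)) / (1.0 + rho * m)
        rho *= 2.0
    return x

# Project z = (2, 2) onto the triangle x >= 0, y >= 0, x + y <= 1;
# the exact answer is (0.5, 0.5).
A = np.array([[-1.0, 0.0], [0.0, -1.0], [1.0, 1.0]])
b = np.array([0.0, 0.0, 1.0])
x = distance_majorization(np.array([2.0, 2.0]), A, b)
```

Every step needs only the individual projections P_i, never a projection onto the intersection, which is exactly the regime the abstract targets.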