Results 1–10 of 22
Projection and proximal point methods: convergence results and counterexamples
, 2003
Abstract
Cited by 49 (19 self)
Recently, Hundal has constructed a hyperplane H, a cone K, and a starting point y0 in ℓ2 such that the sequence of alternating projections ((P_K P_H)^n y0)_{n ∈ N} converges weakly to some point in H ∩ K, but not in norm. We show how this construction results in a counterexample to norm convergence for iterates of averaged projections; hence, we give an affirmative answer to a question raised by Reich two decades ago. Furthermore, new counterexamples to norm convergence for iterates of firmly nonexpansive maps (à la Genel and Lindenstrauss) and for the proximal point algorithm (à la Güler) are provided. We also present a counterexample, along with some weak and norm convergence results, for the new framework of string-averaging projection methods introduced by Censor, Elfving, and Herman. Extensions to Banach spaces and the situation for the Hilbert ball are discussed as well.
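In finite dimensions, by contrast, the alternating-projection iteration (P_K P_H)^n y0 does converge in norm. A minimal Python sketch, assuming a hypothetical hyperplane in R^3 and the nonnegative orthant standing in for the cone K:

```python
import numpy as np

def proj_hyperplane(x, a, b):
    # Orthogonal projection onto H = {x : <a, x> = b}
    return x - (a @ x - b) / (a @ a) * a

def proj_cone(x):
    # Projection onto the nonnegative orthant K = R^n_+
    return np.maximum(x, 0.0)

# Hypothetical data: hyperplane <a, x> = b and a starting point y0
a = np.array([1.0, -1.0, 2.0])
b = 1.0
x = np.array([5.0, -3.0, 4.0])  # y0

for _ in range(200):
    x = proj_cone(proj_hyperplane(x, a, b))  # one step of P_K P_H

# x now approximates a point of H ∩ K
```

Hundal's point is precisely that this intuition fails in infinite-dimensional ℓ2, where the iterates can converge only weakly.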
On the Effectiveness of Projection Methods for Convex Feasibility Problems with Linear Inequality Constraints
Abstract
Cited by 33 (17 self)
The effectiveness of projection methods for solving systems of linear inequalities is investigated. It is shown that they often have a computational advantage over alternatives that have been proposed for solving the same problem and that this makes them successful in many real-world applications. This is supported by experimental evidence provided in this paper on problems of various sizes (up to tens of thousands of unknowns satisfying up to hundreds of thousands of constraints) and by a discussion of the demonstrated efficacy of projection methods in numerous scientific publications and commercial patents (dealing with problems that can have over a billion unknowns and a similar number of constraints).
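The elementary step behind such methods is an orthogonal projection onto a single half-space. A minimal sketch of a cyclic relaxation scheme in the spirit of Agmon–Motzkin–Schoenberg, on a small hypothetical system A x <= b:

```python
import numpy as np

def proj_halfspace(x, a, b):
    # Project onto {x : <a, x> <= b}; no-op if x is already feasible
    r = a @ x - b
    return x if r <= 0 else x - r / (a @ a) * a

# Hypothetical feasible system A x <= b
A = np.array([[1.0, 1.0], [-1.0, 0.0], [0.0, -1.0]])
b = np.array([2.0, 0.0, 0.0])

x = np.array([5.0, 5.0])
for _ in range(100):              # cyclic sweeps over the inequality rows
    for a_i, b_i in zip(A, b):
        x = proj_halfspace(x, a_i, b_i)
```

Each sweep touches one row at a time, which is what makes such methods attractive for the very large, sparse systems the paper discusses.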
Component-averaged row projections: A robust, block-parallel scheme for sparse linear systems
 SIAM J. on Scientific Computing
Abstract
Cited by 33 (7 self)
Abstract. A new method for the parallel solution of large sparse linear systems is introduced. It proceeds by dividing the equations into blocks and operating in block-parallel iterative mode; i.e., all the blocks are processed in parallel, and the partial results are “merged” to form the next iterate. The new scheme performs Kaczmarz row projections within the blocks and merges the results by certain component-averaging operations—hence it is called component-averaged row projections, or CARP. The system matrix can be general, nonsymmetric, and ill-conditioned, and the division into blocks is unrestricted. For partial differential equations (PDEs), if the blocks are domain-based, then only variables at the boundaries between domains are averaged, thereby minimizing data transfer between processors. CARP is very robust; its application to test cases of linear systems derived from PDEs shows that it converges in difficult cases where state-of-the-art methods fail. It is also very memory efficient and exhibits an almost linear speedup ratio, with efficiency greater than unity in some cases. A formal proof of convergence is presented: It is shown that the component-averaging operations are equivalent to row projections in a certain superspace, so the convergence properties of CARP are identical to those of Kaczmarz’s algorithm in the superspace. CARP and its convergence proof also apply to the consistent convex feasibility problem.
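A toy sketch of the block scheme, assuming a small consistent system split into two hypothetical row blocks. For simplicity it averages all components after each block's Kaczmarz sweep, whereas CARP proper averages only the components shared between blocks:

```python
import numpy as np

def kaczmarz_sweep(x, A, b):
    # One Kaczmarz sweep: project onto each row's hyperplane
    # {x : <a_i, x> = b_i} in turn
    x = x.copy()
    for a_i, b_i in zip(A, b):
        x -= (a_i @ x - b_i) / (a_i @ a_i) * a_i
    return x

# Hypothetical consistent system, rows divided into two blocks
A = np.array([[2.0, 1.0], [1.0, 3.0], [1.0, -1.0], [4.0, 1.0]])
x_true = np.array([1.0, 2.0])
b = A @ x_true

x = np.zeros(2)
blocks = [slice(0, 2), slice(2, 4)]
for _ in range(500):
    # process blocks "in parallel", then merge by componentwise averaging
    partials = [kaczmarz_sweep(x, A[blk], b[blk]) for blk in blocks]
    x = np.mean(partials, axis=0)
```

The averaging step is what the paper reinterprets as a row projection in a superspace, which transfers Kaczmarz's convergence theory to the merged iteration.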
Projection methods: Swiss army knives for solving feasibility and best approximation problems with halfspaces
, 2013
Perturbation-resilient block-iterative projection methods with application to image reconstruction from projections
, 2009
Abstract
Cited by 19 (8 self)
A block-iterative projection algorithm for solving the consistent convex feasibility problem in a finite-dimensional Euclidean space that is resilient to bounded and summable perturbations (in the sense that convergence to a feasible point is retained even if such perturbations are introduced in each iterative step of the algorithm) is proposed. This resilience can be used to steer the iterative process towards a feasible point that is superior in the sense of having a small value of some given functional on the points of the Euclidean space. The potential usefulness of this is illustrated in image reconstruction from projections, using both total variation and negative entropy as the functional.
On the string averaging method for sparse common fixed points problems
 Int. Trans. Oper. Res
, 2009
Abstract
Cited by 18 (14 self)
We study the common fixed points problem for the class of directed operators. This class is important because many commonly used nonlinear operators in convex optimization belong to it. We propose a definition of sparseness of a family of operators and investigate a string-averaging algorithmic scheme that favorably handles the common fixed points problem when the family of operators is sparse. The convex feasibility problem is treated as a special case and a new subgradient projections algorithmic scheme is obtained.
Perturbation resilience and superiorization of iterative algorithms
 Inverse Problems
, 2010
Abstract
Cited by 17 (13 self)
Abstract. Iterative algorithms aimed at solving some problems are discussed. For certain problems, such as finding a common point in the intersection of a finite number of convex sets, there often exist iterative algorithms that impose very little demand on computer resources. For other problems, such as finding that point in the intersection at which the value of a given function is optimal, algorithms tend to need more computer memory and longer execution time. A methodology is presented whose aim is to produce automatically for an iterative algorithm of the first kind a “superiorized version” of it that retains its computational efficiency but nevertheless goes a long way towards solving an optimization problem. This is possible to do if the original algorithm is “perturbation resilient,” which is shown to be the case for various projection algorithms for solving the consistent convex feasibility problem. The superiorized versions of such algorithms use perturbations that steer the process in the direction of a superior feasible point, which is not necessarily optimal, with respect to the given function. After presenting these intuitive ideas in a precise mathematical form, they are illustrated in image reconstruction from projections for two different projection algorithms superiorized for the function whose value is the total variation of the image.
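The superiorization idea can be sketched as follows: interleave summable perturbations pointing down the gradient of the given function with steps of the perturbation-resilient basic algorithm. A hypothetical example, using sequential half-space projections as the basic algorithm and phi(x) = ||x||^2 as the function to superiorize:

```python
import numpy as np

def proj_halfspace(x, a, b):
    # Project onto {x : <a, x> <= b}; no-op if feasible
    r = a @ x - b
    return x if r <= 0 else x - r / (a @ a) * a

def basic_step(x, A, b):
    # One sweep of the underlying feasibility-seeking algorithm
    for a_i, b_i in zip(A, b):
        x = proj_halfspace(x, a_i, b_i)
    return x

# Hypothetical constraints x1 >= 1, x2 >= 1, written as -x <= -1
A = np.array([[-1.0, 0.0], [0.0, -1.0]])
b = np.array([-1.0, -1.0])

phi = lambda x: x @ x            # function to superiorize
x = np.array([10.0, 7.0])
beta = 1.0
for k in range(200):
    g = 2 * x                    # gradient of phi
    v = -g / (np.linalg.norm(g) + 1e-12)  # nonascending direction
    x = basic_step(x + beta * v, A, b)    # perturb, then feasibility step
    beta *= 0.9                  # summable perturbation sizes
```

The summability of the beta_k is what lets perturbation resilience guarantee that feasibility is still reached, while the perturbations steer toward a feasible point with a reduced (not necessarily optimal) value of phi.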
On string-averaging for sparse problems and on the split common fixed point problem, Contemporary Mathematics 513
, 2010
Abstract
Cited by 13 (10 self)
We review the common fixed point problem for the class of directed operators. This class is important because many commonly used nonlinear operators in convex optimization belong to it. We present our recent definition of sparseness of a family of operators and discuss a string-averaging algorithmic scheme that favorably handles the common fixed points problem when the family of operators is sparse. We also review some recent results on the multiple operators split common fixed point problem, which requires finding a common fixed point of a family of operators in one space whose image under a linear transformation is a common fixed point of another family of operators in the image space.
Convergence and perturbation resilience of dynamic string-averaging projection methods
 Computational Optimization and Applications
, 2013
Abstract
Cited by 8 (5 self)
We consider the convex feasibility problem (CFP) in Hilbert space and concentrate on the study of string-averaging projection (SAP) methods for the CFP, analyzing their convergence and their perturbation resilience. In the past, SAP methods were formulated with a single predetermined set of strings and a single predetermined set of weights. Here we extend the scope of the family of SAP methods to allow iteration-index-dependent variable strings and weights and term such methods dynamic string-averaging projection (DSAP) methods. The bounded perturbation resilience of DSAP methods is relevant and important for their possible use in the framework of the recently developed superiorization heuristic methodology for constrained minimization problems.
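A minimal static SAP sketch (a single fixed set of strings and weights, over hypothetical half-space constraints): each string applies its projections sequentially from the current iterate, and the string endpoints are merged by a convex combination:

```python
import numpy as np

def proj_halfspace(x, a, b):
    # Project onto C_i = {x : <a_i, x> <= b_i}; no-op if feasible
    r = a @ x - b
    return x if r <= 0 else x - r / (a @ a) * a

# Hypothetical half-space constraints
A = np.array([[1.0, 1.0], [-1.0, 0.0], [0.0, -1.0], [1.0, -2.0]])
b = np.array([2.0, 0.0, 0.0, 1.0])

strings = [[0, 1], [2, 3]]        # index strings over the constraints
weights = [0.5, 0.5]              # convex weights, one per string

x = np.array([6.0, -4.0])
for _ in range(1000):
    ends = []
    for s in strings:             # strings can run in parallel
        y = x
        for i in s:               # sequential projections along a string
            y = proj_halfspace(y, A[i], b[i])
        ends.append(y)
    x = sum(w * e for w, e in zip(weights, ends))  # merge endpoints
```

The DSAP extension of the paper lets both the strings and the weights vary with the iteration index instead of being fixed as above.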
Sparse string-averaging and split common fixed points
, 2008
Abstract
Cited by 5 (1 self)
We review the common fixed point problem for the class of directed operators. This class is important because many commonly used nonlinear operators in convex optimization belong to it. We present our recent definition of sparseness of a family of operators and discuss a string-averaging algorithmic scheme that favorably handles the common fixed points problem when the family of operators is sparse. We also review some recent results on the multiple operators split common fixed point problem, which requires finding a common fixed point of a family of operators in one space whose image under a linear transformation is a common fixed point of another family of operators in the image space.