Results 1–10 of 115
Deterministic edge-preserving regularization in computed imaging
IEEE Trans. Image Processing, 1997
Cited by 312 (27 self)
Abstract—Many image processing problems are ill posed and must be regularized. Usually, a roughness penalty is imposed on the solution. The difficulty is to avoid the smoothing of edges, which are very important attributes of the image. In this paper, we first give conditions for the design of such an edge-preserving regularization. Under these conditions, we show that it is possible to introduce an auxiliary variable whose role is twofold. First, it marks the discontinuities and ensures their preservation from smoothing. Second, it makes the criterion half-quadratic. The optimization is then easier. We propose a deterministic strategy, based on alternate minimizations on the image and the auxiliary variable. This leads to the definition of an original reconstruction algorithm, called ARTUR. Some theoretical properties of ARTUR are discussed. Experimental results illustrate the behavior of the algorithm. These results are shown in the field of tomography, but this method can be applied in a large number of applications in image processing.
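The alternating scheme this abstract describes can be illustrated on a 1-D denoising problem: update the auxiliary edge weights from the current image, then solve the resulting quadratic problem in the image. This is a minimal half-quadratic sketch, not the authors' ARTUR algorithm; the potential φ(t) = √(1 + t²) and the parameters `lam` and `delta` are illustrative assumptions.

```python
import numpy as np

def half_quadratic_denoise(y, lam=0.03, delta=0.1, n_iter=30):
    """Edge-preserving 1-D denoising by alternate minimization.

    Approximately minimizes ||x - y||^2 + lam * sum phi(diff(x)/delta)
    with phi(t) = sqrt(1 + t^2).  The auxiliary variable b is close to 1
    in smooth regions and small at discontinuities, so edges are spared.
    """
    n = len(y)
    D = np.diff(np.eye(n), axis=0)            # finite-difference operator
    x = y.copy()
    for _ in range(n_iter):
        d = (D @ x) / delta
        b = 1.0 / np.sqrt(1.0 + d ** 2)       # auxiliary variable: small at edges
        # half-quadratic step: with b fixed, x solves a weighted linear system
        A = np.eye(n) + (lam / delta ** 2) * (D.T @ (b[:, None] * D))
        x = np.linalg.solve(A, y)
    return x

# usage: a noisy step signal is smoothed while the step itself survives
rng = np.random.default_rng(0)
clean = np.concatenate([np.zeros(20), np.ones(20)])
y = clean + 0.05 * rng.standard_normal(40)
x = half_quadratic_denoise(y)
```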
Space-Alternating Generalized Expectation-Maximization Algorithm
IEEE Trans. Signal Processing, 1994
Cited by 194 (28 self)
The expectation-maximization (EM) method can facilitate maximizing likelihood functions that arise in statistical estimation problems. In the classical EM paradigm, one iteratively maximizes the conditional log-likelihood of a single unobservable complete data space, rather than maximizing the intractable likelihood function for the measured or incomplete data. EM algorithms update all parameters simultaneously, which has two drawbacks: 1) slow convergence, and 2) difficult maximization steps due to coupling when smoothness penalties are used. This paper describes the space-alternating generalized EM (SAGE) method, which updates the parameters sequentially by alternating between several small hidden-data spaces defined by the algorithm designer. We prove that the sequence of estimates monotonically increases the penalized-likelihood objective, we derive asymptotic convergence rates, and we provide sufficient conditions for monotone convergence in norm. Two signal processing applicatio...
A unified approach to statistical tomography using coordinate descent optimization
IEEE Trans. Image Processing, 1996
Cited by 140 (27 self)
Abstract—Over the past ten years there has been considerable interest in statistically optimal reconstruction of image cross-sections from tomographic data. In particular, a variety of such algorithms have been proposed for maximum a posteriori (MAP) reconstruction from emission tomographic data. While MAP estimation requires the solution of an optimization problem, most existing reconstruction algorithms take an indirect approach based on the expectation-maximization (EM) algorithm. In this paper we propose a new approach to statistically optimal image reconstruction based on direct optimization of the MAP criterion. The key to this direct optimization approach is greedy pixel-wise computations known as iterative coordinate descent (ICD). We show that the ICD iterations require approximately the same amount of computation per iteration as EM-based approaches, but the new method converges much more rapidly (in our experiments, typically in 5 iterations). Other advantages of the ICD method are that it is easily applied to MAP estimation of transmission tomograms, and typical convex constraints, such as positivity, are simply incorporated.
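The greedy pixel-wise update can be sketched for a quadratic penalized least-squares objective with a positivity constraint. This is a generic coordinate-descent illustration under an assumed system matrix `A` and ridge penalty `beta`, not the paper's full MAP formulation for tomographic data.

```python
import numpy as np

def icd(A, y, beta=0.1, n_sweeps=30):
    """Iterative coordinate descent for  ||y - A x||^2 + beta ||x||^2,  x >= 0.

    Each pixel x_j is minimized exactly while the others are held fixed;
    the residual r = y - A x is maintained incrementally, so one sweep
    costs about the same as one matrix-vector product pair.
    """
    m, n = A.shape
    x = np.zeros(n)
    r = y.copy()                          # residual for the initial x = 0
    col_norm2 = (A ** 2).sum(axis=0)
    for _ in range(n_sweeps):
        for j in range(n):
            aj = A[:, j]
            # exact 1-D minimizer, clipped to enforce positivity
            xj = max(0.0, (aj @ r + col_norm2[j] * x[j]) / (col_norm2[j] + beta))
            r -= aj * (xj - x[j])
            x[j] = xj
    return x

# usage on a small synthetic system
rng = np.random.default_rng(1)
A = np.abs(rng.standard_normal((8, 4)))
x_true = np.array([1.0, 0.0, 2.0, 0.5])
y = A @ x_true
x_hat = icd(A, y)
```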
Penalized Maximum-Likelihood Image Reconstruction Using Space-Alternating Generalized EM Algorithms
IEEE Trans. Image Processing, 1995
Cited by 102 (32 self)
Most expectation-maximization (EM) type algorithms for penalized maximum-likelihood image reconstruction converge slowly, particularly when one incorporates additive background effects such as scatter, random coincidences, dark current, or cosmic radiation. In addition, regularizing smoothness penalties (or priors) introduce parameter coupling, rendering intractable the M-steps of most EM-type algorithms. This paper presents space-alternating generalized EM (SAGE) algorithms for image reconstruction, which update the parameters sequentially using a sequence of small "hidden" data spaces, rather than simultaneously using one large complete-data space. The sequential update decouples the M-step, so the maximization can typically be performed analytically. We introduce new hidden-data spaces that are less informative than the conventional complete-data space for Poisson data and that yield significant improvements in convergence rate. This acceleration is due to statistical considerations, not numerical overrelaxation methods, so monotonic increases in the objective function are guaranteed. We provide a general global convergence proof for SAGE methods with nonnegativity constraints.
Monotonic Algorithms for Transmission Tomography
IEEE Trans. Medical Imaging, 1999
Cited by 92 (39 self)
Abstract—We present a framework for designing fast and monotonic algorithms for transmission tomography penalized-likelihood image reconstruction. The new algorithms are based on paraboloidal surrogate functions for the log-likelihood. Due to the form of the log-likelihood function, it is possible to find low-curvature surrogate functions that guarantee monotonicity. Unlike previous methods, the proposed surrogate functions lead to monotonic algorithms even for the nonconvex log-likelihood that arises due to background events such as scatter and random coincidences. The gradient and the curvature of the likelihood terms are evaluated only once per iteration. Since the problem is simplified at each iteration, the CPU time is less than that of current algorithms which directly minimize the objective, yet the convergence rate is comparable. The simplicity, monotonicity, and speed of the new algorithms are quite attractive. The convergence rates of the algorithms are demonstrated using real and simulated PET transmission scans.
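The surrogate principle behind monotonicity can be shown on a scalar toy objective: if c bounds the curvature of f everywhere, then minimizing the parabola f(x_k) + f'(x_k)(x − x_k) + (c/2)(x − x_k)² at each step can never increase f. This is a generic majorize-minimize sketch with an assumed test function, not the paper's transmission log-likelihood or its optimized curvatures.

```python
import math

def surrogate_minimize(f, fprime, c, x0, n_iter=50):
    """Minimize f via a fixed-curvature quadratic (paraboloidal) surrogate.

    Requires f''(x) <= c for all x; then x - f'(x)/c is the exact minimizer
    of the surrogate parabola at x, and f decreases monotonically.
    """
    x = x0
    values = [f(x)]
    for _ in range(n_iter):
        x = x - fprime(x) / c      # minimizer of the surrogate at the current x
        values.append(f(x))
    return x, values

# usage: a smooth convex toy function whose curvature is bounded by c = 1
f = lambda x: math.sqrt(1.0 + (x - 2.0) ** 2)
fp = lambda x: (x - 2.0) / math.sqrt(1.0 + (x - 2.0) ** 2)
x_min, vals = surrogate_minimize(f, fp, c=1.0, x0=-3.0)
```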
Conjugate-Gradient Preconditioning Methods for Shift-Variant PET Image Reconstruction
IEEE Trans. Image Processing, 2002
Cited by 76 (31 self)
Gradient-based iterative methods often converge slowly for tomographic image reconstruction and image restoration problems, but can be accelerated by suitable preconditioners. Diagonal preconditioners offer some improvement in convergence rate, but do not incorporate the structure of the Hessian matrices in imaging problems. Circulant preconditioners can provide remarkable acceleration for inverse problems that are approximately shift-invariant, i.e. for those with approximately block-Toeplitz or block-circulant Hessians. However, in applications with nonuniform noise variance, such as arises from Poisson statistics in emission tomography and in quantum-limited optical imaging, the Hessian of the weighted least-squares objective function is quite shift-variant, and circulant preconditioners perform poorly. Additional shift-variance is caused by edge-preserving regularization methods based on nonquadratic penalty functions. This paper describes new preconditioners that approximate more accurately the Hessian matrices of shift-variant imaging problems. Compared to diagonal or circulant preconditioning, the new preconditioners lead to significantly faster convergence rates for the unconstrained conjugate-gradient (CG) iteration. We also propose a new efficient method for the line-search step required by CG methods. Applications to positron emission tomography (PET) illustrate the method.
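The circulant-preconditioning idea can be sketched for a 1-D shift-invariant problem: a circulant matrix is diagonalized by the DFT, so applying its inverse inside conjugate gradients costs only FFTs. A minimal sketch, assuming a symmetric positive-definite tridiagonal Toeplitz system; the circulant approximation, problem size, and tolerance are illustrative, not the paper's shift-variant preconditioners.

```python
import numpy as np

def pcg(A, b, M_solve, n_iter=100, tol=1e-10):
    """Preconditioned conjugate gradients; M_solve(r) applies M^{-1} to r."""
    x = np.zeros_like(b)
    r = b.copy()
    z = M_solve(r)
    p = z.copy()
    rz = r @ z
    for _ in range(n_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x = x + alpha * p
        r = r - alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = M_solve(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x

def circulant_solver(first_col):
    """Return r -> C^{-1} r for the circulant C defined by its first column."""
    eig = np.fft.fft(first_col)               # DFT diagonalizes circulants
    return lambda r: np.real(np.fft.ifft(np.fft.fft(r) / eig))

# usage: tridiagonal Toeplitz system, circulant preconditioner from its stencil
n = 64
A = 2.5 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
c = np.zeros(n); c[0] = 2.5; c[1] = -1.0; c[-1] = -1.0
b = np.ones(n)
x = pcg(A, b, circulant_solver(c))
```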
Grouped-Coordinate Ascent Algorithms for Penalized-Likelihood Transmission Image Reconstruction
IEEE Trans. Medical Imaging, 1996
Cited by 68 (27 self)
This paper presents a new class of algorithms for penalized-likelihood reconstruction of attenuation maps from low-count transmission scans. We derive the algorithms by applying to the transmission log-likelihood a version of the convexity technique developed by De Pierro for emission tomography. The new class includes the single-coordinate ascent (SCA) algorithm and Lange's convex algorithm for transmission tomography as special cases. The new grouped-coordinate ascent (GCA) algorithms in the class overcome several limitations associated with previous algorithms. (1) Fewer exponentiations are required than in the transmission ML-EM algorithm or in the SCA algorithm. (2) The algorithms intrinsically accommodate nonnegativity constraints, unlike many gradient-based methods. (3) The algorithms are easily parallelizable, unlike the SCA algorithm and perhaps line-search algorithms. We show that the GCA algorithms converge faster than the SCA algorithm, even on conventional workstations. An ex...
Exploring estimator bias-variance tradeoffs using the uniform CR bound
IEEE Trans. Signal Processing, 1996
Cited by 59 (19 self)
We introduce a plane, which we call the delta-sigma plane, that is indexed by the norm of the estimator bias gradient and the variance of the estimator. The norm of the bias gradient is related to the maximum variation in the estimator bias function over a neighborhood of parameter space. Using a uniform Cramér-Rao (CR) bound on estimator variance, a delta-sigma tradeoff curve is specified which defines an "unachievable region" of the delta-sigma plane for a specified statistical model. In order to place an estimator on this plane for comparison to the delta-sigma tradeoff curve, the estimator variance, bias gradient, and bias gradient norm must be evaluated. We present a simple and accurate method for experimentally determining the bias gradient norm based on applying a bootstrap estimator to a sample mean constructed from the gradient of the log-likelihood. We demonstrate the methods developed in this paper for linear Gaussian and nonlinear Poisson inverse problems.
Hybrid Poisson/Polynomial Objective Functions for Tomographic Image Reconstruction from Transmission Scans
IEEE Trans. Image Processing, 1995
Cited by 57 (25 self)
This paper describes rapidly converging algorithms for computing attenuation maps from Poisson transmission measurements using penalized-likelihood objective functions. We demonstrate that an under-relaxed cyclic coordinate-ascent algorithm converges faster than the convex algorithm of Lange [1], which in turn converges faster than the expectation-maximization (EM) algorithm for transmission tomography [1]. To further reduce computation, one could replace the log-likelihood objective with a quadratic approximation. However, we show with simulations and analysis that the quadratic objective function leads to biased estimates for low-count measurements. Therefore we introduce hybrid Poisson/polynomial objective functions that use the exact Poisson log-likelihood for detector measurements with low counts, but use computationally efficient quadratic or cubic approximations for the high-count detector measurements. We demonstrate that the hybrid objective functions reduce computation time w...
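The hybrid idea can be sketched per detector: keep the exact Poisson negative log-likelihood term for low-count rays, and switch to a quadratic Taylor expansion about the match point l̂ = log(b/y) for high-count rays, where the exact and quadratic terms agree in value and slope. The blank-scan value `b`, the count threshold, and the use of a quadratic rather than cubic approximation are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

def hybrid_objective(l, y, b=1e4, count_threshold=100.0):
    """Per-detector negative log-likelihood terms for y_i ~ Poisson(b*exp(-l_i)).

    Exact term:     h(l) = b*exp(-l) + y*l - y*log(b)   (constants dropped)
    Quadratic term: Taylor expansion of h about lhat = log(b/y), where
                    h(lhat) = y - y*log(y), h'(lhat) = 0, h''(lhat) = y.
    Low-count detectors get the exact term; high-count ones the cheap quadratic.
    """
    l = np.asarray(l, dtype=float)
    y = np.asarray(y, dtype=float)
    exact = b * np.exp(-l) + y * l - y * np.log(b)
    y_safe = np.maximum(y, 1.0)                   # guard the log for y = 0
    lhat = np.log(b / y_safe)
    quad = (y - y * np.log(y_safe)) + 0.5 * y * (l - lhat) ** 2
    return np.where(y < count_threshold, exact, quad)
```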
A Theoretical Study of the Contrast Recovery and Variance of MAP Reconstructions From PET Data
IEEE Trans. Medical Imaging, 1999
Cited by 44 (8 self)
We examine the spatial resolution and variance properties of PET images reconstructed using maximum a posteriori (MAP) or penalized-likelihood methods. Resolution is characterized by the contrast recovery coefficient (CRC) of the local impulse response. Simplified approximate expressions are derived for the local impulse response CRCs and variances for each voxel. Using these results we propose a practical scheme for selecting spatially variant smoothing parameters to optimize lesion detectability through maximization of the local CRC-to-noise ratio in the reconstructed image.

I. INTRODUCTION
PET image reconstruction algorithms based on maximum likelihood (ML) or maximum a posteriori (MAP) principles can produce improved spatial resolution and noise properties in comparison to conventional filtered backprojection (FBP) methods. It is often important to be able to quantify this improvement in terms of the resolution (or bias) and variance of the resulting images. These measures can be...