Results 1–10 of 15
Monotonic Algorithms for Transmission Tomography
 IEEE Tr. Med. Im
, 1999
Abstract

Cited by 74 (30 self)
We present a framework for designing fast and monotonic algorithms for transmission tomography penalized-likelihood image reconstruction. The new algorithms are based on paraboloidal surrogate functions for the log-likelihood. Due to the form of the log-likelihood function, it is possible to find low-curvature surrogate functions that guarantee monotonicity. Unlike previous methods, the proposed surrogate functions lead to monotonic algorithms even for the nonconvex log-likelihood that arises due to background events such as scatter and random coincidences. The gradient and the curvature of the likelihood terms are evaluated only once per iteration. Since the problem is simplified at each iteration, the CPU time is less than that of current algorithms which directly minimize the objective, yet the convergence rate is comparable. The simplicity, monotonicity and speed of the new algorithms are quite attractive. The convergence rates of the algorithms are demonstrated using real and simulated PET transmission scans.
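The surrogate idea in this abstract can be illustrated with a minimal one-dimensional sketch. This is an assumed, simplified model — a single Poisson transmission ray with no background, negative log-likelihood h(l) = b·exp(−l) + y·l — and the conservative curvature bound c = b stands in for the paper's optimal curvatures; it is not the paper's exact algorithm:

```python
import numpy as np

# Minimal 1-D majorize-minimize sketch (assumed simplified model):
# negative log-likelihood for one transmission ray with blank counts b,
# measured counts y, and no background:
#   h(l) = b*exp(-l) + y*l,  l >= 0,  with h''(l) = b*exp(-l) <= b.
b, y = 100.0, 30.0
h = lambda l: b * np.exp(-l) + y * l
dh = lambda l: -b * np.exp(-l) + y

c = b  # curvature bound: the parabola q(l) = h(ln) + dh(ln)(l-ln) + c/2*(l-ln)^2
       # majorizes h on l >= 0, so each step cannot increase h (monotonicity)
l, vals = 0.0, []
for _ in range(60):
    vals.append(h(l))
    l = max(l - dh(l) / c, 0.0)  # minimize the surrogate, keep l nonnegative

# l converges to the true minimizer log(b/y)
```

Because the surrogate lies above h everywhere on the feasible set, every iterate decreases the objective even though only the gradient and a fixed curvature are evaluated per step — the property the abstract emphasizes.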
Conjugate-Gradient Preconditioning Methods for Shift-Variant PET Image Reconstruction
 IEEE Tr. Im. Proc
, 2002
Abstract

Cited by 51 (21 self)
Gradient-based iterative methods often converge slowly for tomographic image reconstruction and image restoration problems, but can be accelerated by suitable preconditioners. Diagonal preconditioners offer some improvement in convergence rate, but do not incorporate the structure of the Hessian matrices in imaging problems. Circulant preconditioners can provide remarkable acceleration for inverse problems that are approximately shift-invariant, i.e. for those with approximately block-Toeplitz or block-circulant Hessians. However, in applications with nonuniform noise variance, such as arises from Poisson statistics in emission tomography and in quantum-limited optical imaging, the Hessian of the weighted least-squares objective function is quite shift-variant, and circulant preconditioners perform poorly. Additional shift-variance is caused by edge-preserving regularization methods based on nonquadratic penalty functions. This paper describes new preconditioners that approximate more accurately the Hessian matrices of shift-variant imaging problems. Compared to diagonal or circulant preconditioning, the new preconditioners lead to significantly faster convergence rates for the unconstrained conjugate-gradient (CG) iteration. We also propose a new efficient method for the line-search step required by CG methods. Applications to positron emission tomography (PET) illustrate the method.
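Circulant preconditioning can be sketched in the idealized shift-invariant case, where the system matrix itself is circulant so the FFT-based preconditioner is exact and CG converges in one step. Real PET Hessians are only approximately circulant — that gap is precisely what the paper's shift-variant preconditioners address — so treat this as an illustrative limiting case:

```python
import numpy as np

# Idealized sketch: A is circulant (regularized 1-D Laplacian), so the
# FFT-based circulant preconditioner inverts it exactly and preconditioned
# CG converges in a single iteration.
n = 64
first_col = np.zeros(n)
first_col[[0, 1, -1]] = [2.6, -1.0, -1.0]   # diag 2.6, off-diag -1: SPD
eig = np.fft.fft(first_col).real            # eigenvalues of the circulant A

A_mul = lambda x: np.fft.ifft(np.fft.fft(x) * eig).real   # A @ x via FFT
M_inv = lambda r: np.fft.ifft(np.fft.fft(r) / eig).real   # preconditioner solve

rng = np.random.default_rng(0)
b = rng.standard_normal(n)
x = np.zeros(n)
r = b - A_mul(x)
z = M_inv(r)
p = z.copy()
iters = 0
while np.linalg.norm(r) > 1e-10 and iters < 50:
    Ap = A_mul(p)
    alpha = (r @ z) / (p @ Ap)
    x += alpha * p
    r_new = r - alpha * Ap
    z_new = M_inv(r_new)
    beta = (r_new @ z_new) / (r @ z)
    p = z_new + beta * p
    r, z = r_new, z_new
    iters += 1
```

When shift-variant weights enter the Hessian (as with Poisson noise), this same `M_inv` is no longer exact and convergence degrades — motivating the more accurate approximations the abstract describes.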
Edge-preserving tomographic reconstruction with nonlocal regularization
 In Proc. IEEE Intl. Conf. on Image Processing
, 2002
Abstract

Cited by 28 (7 self)
Tomographic image reconstruction using statistical methods can provide more accurate system modeling, statistical models, and physical constraints than the conventional filtered backprojection (FBP) method. Because of the ill-posedness of the reconstruction problem, a roughness penalty is often imposed on the solution to control noise. To avoid smoothing of edges, which are important image attributes, various edge-preserving regularization methods have been proposed. Most of these schemes rely on information from local neighborhoods to determine the presence of edges. In this paper, we propose a cost function that incorporates nonlocal boundary information into the regularization method. We use an alternating minimization algorithm with deterministic annealing to minimize the proposed cost function, jointly estimating region boundaries and object pixel values. We apply variational techniques implemented using level-set methods to update the boundary estimates; then, using the most recent boundary estimate, we minimize a space-variant quadratic cost function to update the image estimate. For the PET transmission reconstruction application, we compare the bias-variance tradeoff of this method with that of a “conventional” penalized-likelihood algorithm with local Huber roughness penalty.
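The space-variant quadratic penalty step can be sketched in isolation. Here the boundary location and the 1-D signal are hypothetical stand-ins; in the actual method the boundary comes from the level-set update, not from a fixed index:

```python
import numpy as np

# Sketch: given a (hypothetical) boundary estimate, build a space-variant
# quadratic roughness penalty whose weight vanishes across the boundary,
# so the edge there is not smoothed.
n = 16
boundary_at = 8                 # assumed boundary index from the boundary step
w = np.ones(n - 1)
w[boundary_at - 1] = 0.0        # zero weight on the difference across the boundary

def penalty(x):
    d = np.diff(x)
    return 0.5 * np.sum(w * d ** 2)

x_aligned = np.concatenate([np.zeros(8), np.ones(8)])   # edge at the boundary
x_shifted = np.concatenate([np.zeros(6), np.ones(10)])  # edge away from it
```

An edge coinciding with the estimated boundary incurs no penalty, while the same edge elsewhere is penalized — which is how the quadratic image-update step preserves the jointly estimated boundaries.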
Accelerated Monotonic Algorithms for Transmission Tomography
 in Proc. IEEE Intl. Conf. on Image Processing
, 1998
Abstract

Cited by 9 (5 self)
We present a framework for designing fast and monotonic algorithms for transmission tomography penalized-likelihood image reconstruction. The new algorithms are based on paraboloidal surrogate functions for the log-likelihood. Due to the form of the log-likelihood function, it is possible to find low-curvature surrogate functions that guarantee monotonicity. Unlike previous methods, the proposed surrogate functions lead to monotonic algorithms even for the nonconvex log-likelihood that arises due to background events such as scatter and random coincidences. The gradient and the curvature of the likelihood terms are evaluated only once per iteration. Since the problem is simplified, the CPU time per iteration is less than that of current algorithms which directly minimize the objective, yet the convergence rate is comparable. The simplicity, monotonicity and speed of the new algorithms are quite attractive. The convergence rates of the algorithms are demonstrated using real PET transmiss...
Image recovery using partitioned-separable paraboloidal surrogate coordinate ascent algorithms
 IEEE Trans. Image Process
, 2002
Abstract

Cited by 9 (3 self)
Iterative coordinate ascent algorithms have been shown to be useful for image recovery, but are poorly suited to parallel computing due to their sequential nature. This paper presents a new fast-converging, parallelizable algorithm for image recovery that can be applied to a very broad class of objective functions. This method is based on paraboloidal surrogate functions and a concavity technique. The paraboloidal surrogates simplify the optimization problem. The idea of the concavity technique is to partition pixels into subsets that can be updated in parallel to reduce the computation time. For fast convergence, pixels within each subset are updated sequentially using a coordinate ascent algorithm. The proposed algorithm is guaranteed to monotonically increase the objective function and intrinsically accommodates nonnegativity constraints. A global convergence proof is summarized. Simulation results show that the proposed algorithm requires less elapsed time for convergence than iterative coordinate ascent algorithms. With four parallel processors, the proposed algorithm yields a speedup factor of 3.77 relative to single-processor coordinate ascent algorithms for a 3-D confocal image restoration problem.
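The partitioned-update idea can be illustrated with a toy quadratic problem in descent form (not the paper's surrogate construction). For a tridiagonal SPD quadratic, even-indexed coordinates do not couple with each other, nor do odd ones, so each subset's exact coordinate updates can run simultaneously ("in parallel"), while the two subsets alternate sequentially:

```python
import numpy as np

# Toy red-black partitioned coordinate update for f(x) = 0.5 x'Ax - b'x.
# Within each subset (even or odd indices) the tridiagonal A has no
# cross-terms, so the exact 1-D coordinate minimizers can be applied
# simultaneously without breaking monotonicity.
n = 20
A = 2.0 * np.eye(n) - 0.5 * np.eye(n, k=1) - 0.5 * np.eye(n, k=-1)
b = np.ones(n)
f = lambda x: 0.5 * x @ A @ x - b @ x
d = np.diag(A)

x = np.zeros(n)
vals = [f(x)]
subsets = (np.arange(0, n, 2), np.arange(1, n, 2))
for _ in range(100):
    for idx in subsets:
        r = b - A @ x                 # current residual (negative gradient)
        x[idx] += r[idx] / d[idx]     # simultaneous exact updates on the subset
        vals.append(f(x))
```

Each subset update is an exact minimization over decoupled coordinates, so the objective never increases — the monotonicity property the abstract guarantees for its (ascent-form) algorithm.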
Statistical Image Reconstruction Algorithms Using Paraboloidal Surrogates for PET Transmission Scans
, 1999
Abstract

Cited by 7 (0 self)
Positron Emission Tomography (PET) is a diagnostic imaging tool that provides images of radioactive substances injected into the body to trace biological functions. The radioactive substance emits a positron which annihilates with an electron to produce two 511 keV photons traveling in approximately opposite directions, which are detected in coincidence by a pair of detectors. Many photons are absorbed or scattered, reducing the number of detected emission events. Attenuation correction is crucial for quantitatively accurate PET reconstructions. PET transmission scans are performed to estimate attenuation parameters, which are in turn used to correct the emission scans for attenuation effects. The noise in estimating the attenuation parameters propagates to the emission images, affecting their quality and quantitative correctness. Thus, attenuation image reconstruction is extremely important in PET. Conventional methods of attenuation correction are suboptimal and ignore the Poisson nature of the data. We propose to use penalized likelihood image reconstruction techniques for transmission scans. Current algorithms for transmission tomography have two important problems: (1) they are not guaranteed to converge, and (2) if they converge, the convergence is slow. We develop new fast and monotonic optimization algorithms for penalized likelihood image reconstruction based on a novel paraboloidal surrogates principle. We present results showing the speed of the new optimization algorithms as compared to previous ones. We apply the algorithms to PET data obtained from an anthropomorphic thorax phantom and real patient data. A transmission scan per...
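The transmission measurement model behind this abstract can be sketched as follows. The notation and the specific counts are assumptions for illustration: counts along ray i follow y_i ~ Poisson(b_i·exp(−l_i) + r_i), where l_i is the line integral of attenuation, b_i the blank-scan counts, and r_i the background (scatter/randoms). The conventional estimate below simply takes a log-ratio, ignoring both the background and the Poisson noise — the suboptimality the abstract criticizes:

```python
import numpy as np

# Illustrative transmission model (symbols and values assumed):
#   y_i ~ Poisson(b_i * exp(-l_i) + r_i)
rng = np.random.default_rng(1)
blank = np.full(5, 1e4)                        # blank-scan counts b_i
l_true = np.array([0.5, 1.0, 2.0, 3.0, 0.1])   # true line integrals l_i
backg = np.full(5, 20.0)                       # background counts r_i
y = rng.poisson(blank * np.exp(-l_true) + backg)

# Conventional (suboptimal) estimate: log-ratio ignoring background and noise.
l_hat = -np.log(y / blank)
```

For high-count, low-attenuation rays the log-ratio is serviceable, but as counts drop its noise and background bias grow, which is why the thesis pursues penalized-likelihood estimates of the attenuation map instead.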
Performance Comparison of Smoothing and Gamma Priors for Transmission Tomography
, 1999
Abstract

Cited by 2 (2 self)
We introduced [1] a Bayesian method for transmission tomography that used a novel pointwise prior in the form of a mixture of gamma distributions. Here, we compare the performance of our pointwise prior with that of a smoothing prior in the context of a weak tumor detection task for emission PET. The reprojected attenuation maps from each method are used for attenuation correction of 2D PET emission data. Our focus is on the problem of low counts in the transmission scan.
Fast Monotonic Algorithms for Transmission Tomography
 IEEE Tr. Med. Im
, 1998
Abstract

Cited by 2 (2 self)
We present a framework for designing fast and monotonic algorithms for transmission tomography penalized-likelihood image reconstruction. The new algorithms are based on paraboloidal surrogate functions for the log-likelihood. Due to the form of the log-likelihood function, it is possible to find low-curvature surrogate functions that guarantee monotonicity. Unlike previous methods, the proposed surrogate functions lead to monotonic algorithms even for the nonconvex log-likelihood that arises due to background events such as scatter and random coincidences. The gradient and the curvature of the likelihood terms are evaluated only once per iteration. Since the problem is simplified at each iteration, the CPU time is less than that of current algorithms which directly minimize the objective, yet the convergence rate is comparable. The simplicity, monotonicity and speed of the new algorithms are quite attractive. The convergence rates of the algorithms are demonstrated using real and simu...
Convergent algorithms for statistical image reconstruction in emission tomography
 University of Michigan
, 2004
Abstract

Cited by 2 (1 self)
is dedicated to my parents and my wife. ii ACKNOWLEDGEMENTS I would like to express my deepest gratitude to my advisor Professor Jeff Fessler for his guidance, support, and kindness. It was my greatest luck that I had a chance to work with him. I would also like to thank Professors Alfred Hero, Robert Koeppe, David Neuhoff, and Thomas Nichols in my committee for their valuable input on my dissertation. I extend my gratitude to my past and current colleagues including Idris Elbakri, Matt Jacobson, Jeongtae Kim, Sangwoo Lee, Nan Sotthivirat, Somesh Srivastava, Web Stayman, Anastasia Yendiki, Rongping Zeng, and Yingying Zhang. I owe my thanks to many other friends including Hyoseok Lee. Finally, I would like to thank my parents for their love and encouragement. And I
Statistical methods for transmission image reconstruction with nonlocal edge-preserving regularization
 Univ. of Michigan
, 2000