Results 1–10 of 285
Penalized Maximum-Likelihood Image Reconstruction using Space-Alternating Generalized EM Algorithms
IEEE Tr. Im. Proc., 1995
Cited by 102 (32 self)
Most expectation-maximization (EM) type algorithms for penalized maximum-likelihood image reconstruction converge slowly, particularly when one incorporates additive background effects such as scatter, random coincidences, dark current, or cosmic radiation. In addition, regularizing smoothness penalties (or priors) introduce parameter coupling, rendering intractable the M-steps of most EM-type algorithms. This paper presents space-alternating generalized EM (SAGE) algorithms for image reconstruction, which update the parameters sequentially using a sequence of small "hidden" data spaces, rather than simultaneously using one large complete-data space. The sequential update decouples the M-step, so the maximization can typically be performed analytically. We introduce new hidden-data spaces that are less informative than the conventional complete-data space for Poisson data and that yield significant improvements in convergence rate. This acceleration is due to statistical considerations, not numerical overrelaxation methods, so monotonic increases in the objective function are guaranteed. We provide a general global convergence proof for SAGE methods with nonnegativity constraints.
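The slow "conventional" baseline this abstract refers to is the multiplicative ML-EM update for Poisson data with a known additive background. A minimal NumPy sketch of that unpenalized baseline (the toy system matrix, background, and counts below are invented illustration values, not from the paper):

```python
import numpy as np

# Toy Poisson emission model: y ~ Poisson(A @ lam + r), where r is a known
# additive background (scatter, randoms, dark current). Values illustrative.
A = np.array([[1.0, 0.5, 0.2],
              [0.3, 0.8, 1.0]])        # system matrix a_ij
r = np.array([0.1, 0.2])               # known additive background
y = np.array([5.0, 7.0])               # measured counts

def loglik(lam):
    ybar = A @ lam + r
    return float(np.sum(y * np.log(ybar) - ybar))

lam = np.ones(A.shape[1])              # nonnegative initial image
sens = A.sum(axis=0)                   # sensitivities: sum_i a_ij
history = []
for _ in range(100):
    history.append(loglik(lam))
    ybar = A @ lam + r
    lam *= (A.T @ (y / ybar)) / sens   # ML-EM step: monotone, stays nonnegative
```

The multiplicative step guarantees a monotone log-likelihood and nonnegativity, which are exactly the properties SAGE preserves while converging faster via less informative hidden-data spaces.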
Accelerating popular tomographic reconstruction algorithms on commodity PC graphics hardware
IEEE Trans. on Nuclear Science, 2005
Cited by 55 (12 self)
Abstract—The task of reconstructing an object from its projections via tomographic methods is a time-consuming process due to the vast complexity of the data. For this reason, manufacturers of equipment for medical computed tomography (CT) rely mostly on application-specific integrated circuits (ASICs) to obtain the fast reconstruction times required in clinical settings. Although modern CPUs have gained sufficient power in recent years to be competitive for two-dimensional (2D) reconstruction, this is not the case for three-dimensional (3D) reconstructions, especially not when iterative algorithms must be applied. The recent evolution of commodity PC computer graphics boards (GPUs) has the potential to change this picture in a very dramatic way. In this paper we show how the new floating-point GPUs can be exploited to perform both analytical and iterative reconstruction from X-ray and functional imaging data. For this purpose, we decompose three popular 3D reconstruction algorithms (Feldkamp filtered backprojection, the simultaneous algebraic reconstruction technique, and expectation maximization) into a common set of base modules, which can all be executed on the GPU and their output linked internally. Visualization of the reconstructed object is easily achieved since the object already resides in the graphics hardware, allowing one to run a visualization module at any time to view the reconstruction results. Our implementation allows speedups of over an order of magnitude with respect to CPU implementations, at comparable image quality. Index Terms—Graphics hardware, image reconstruction, tomography.
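Of the three algorithms the paper decomposes, SART is the simplest to sketch in terms of the forward-project and backproject modules the authors map to the GPU. A tiny NumPy version on a hypothetical 3-ray, 3-voxel system (all numbers invented for illustration, not from the paper):

```python
import numpy as np

# Hypothetical 3-ray, 3-voxel system and noiseless projections.
A = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])
x_true = np.array([1.0, 2.0, 3.0])
p = A @ x_true                      # simulated projection data

x = np.zeros(3)
row = A.sum(axis=1)                 # per-ray normalization
col = A.sum(axis=0)                 # per-voxel normalization
for _ in range(200):
    resid = (p - A @ x) / row       # forward-project module + normalized error
    x += (A.T @ resid) / col        # backproject module applies the correction
```

Each iteration is one forward projection and one backprojection, which is why the per-module GPU decomposition described in the paper covers it naturally.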
Statistical image reconstruction for polyenergetic X-ray computed tomography
IEEE Transactions on Medical Imaging, 2002
Cited by 54 (12 self)
Abstract—This paper describes a statistical image reconstruction method for X-ray computed tomography (CT) that is based on a physical model that accounts for the polyenergetic X-ray source spectrum and the measurement nonlinearities caused by energy-dependent attenuation. We assume that the object consists of a given number of non-overlapping materials, such as soft tissue and bone. The attenuation coefficient of each voxel is the product of its unknown density and a known energy-dependent mass attenuation coefficient. We formulate a penalized-likelihood function for this polyenergetic model and develop an ordered-subsets iterative algorithm for estimating the unknown densities in each voxel. The algorithm monotonically decreases the cost function at each iteration when one subset is used. Applying this method to simulated X-ray CT measurements of objects containing both bone and soft tissue yields images with significantly reduced beam-hardening artifacts. Index Terms—Beam hardening, penalized likelihood, statistical reconstruction, X-ray CT.
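The polyenergetic forward model described above can be illustrated for a single ray through two materials. The discretized spectrum and mass-attenuation numbers below are made up for demonstration; the sketch only shows the structure of the model and the beam-hardening effect it captures:

```python
import numpy as np

# Illustrative discretized source spectrum and mass-attenuation tables.
I_spec = np.array([0.2, 0.4, 0.3, 0.1])      # normalized spectrum over 4 energy bins
m = np.array([[0.27, 0.21, 0.18, 0.17],      # "soft tissue" m(E), cm^2/g (made up)
              [1.30, 0.55, 0.35, 0.27]])     # "bone" m(E), cm^2/g (made up)

rho = np.array([1.0, 1.9])                   # unknown densities, g/cm^3
a   = np.array([2.0, 0.5])                   # intersection lengths along the ray, cm

# Energy-dependent line integral: s(E) = sum_j a_j * rho_j * m_j(E)
s = (a * rho) @ m
ybar = np.sum(I_spec * np.exp(-s))           # expected normalized measurement

# Beam hardening: the effective attenuation -log(ybar) is strictly smaller than
# the spectrum-averaged line integral, since low-energy photons are absorbed first.
eff = -np.log(ybar)
avg = np.sum(I_spec * s)
```

A monoenergetic reconstruction that ignores this nonlinearity mistakes the gap between `eff` and `avg` for structure, which is the artifact the paper's penalized-likelihood model removes.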
A Statistical Multiscale Framework for Poisson Inverse Problems
2000
Cited by 53 (4 self)
This paper describes a statistical modeling and analysis method for linear inverse problems involving Poisson data based on a novel multiscale framework. The framework itself is founded upon a multiscale analysis associated with recursive partitioning of the underlying intensity, a corresponding multiscale factorization of the likelihood (induced by this analysis), and a choice of prior probability distribution made to match this factorization by modeling the "splits" in the underlying partition. The class of priors used here has the interesting feature that the "noninformative" member yields the traditional maximum likelihood solution; other choices are made to reflect prior belief as to the smoothness of the unknown intensity. Adopting the expectation-maximization (EM) algorithm for use in computing the MAP estimate corresponding to our model, we find that our model permits remarkably simple, closed-form expressions for the EM update equations. The behavior of our EM algorithm ...
The Ordered Subsets Mirror Descent Optimization Method with Applications to Tomography
SIAM J. Optim., 2001
Cited by 47 (5 self)
Abstract. We describe an optimization problem arising in reconstructing 3D medical images from Positron Emission Tomography (PET). A mathematical model of the problem, based on the Maximum Likelihood principle, is posed as a problem of minimizing a convex function of several million variables over the standard simplex. To solve a problem of these characteristics, we develop and implement a new algorithm, Ordered Subsets Mirror Descent, and demonstrate, theoretically and computationally, that it is well suited for solving the PET reconstruction problem. Key words: positron emission tomography, maximum likelihood, image reconstruction, convex optimization, mirror descent.
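Over the standard simplex, the mirror-descent step with the entropy mirror map reduces to a multiplicative (exponentiated-gradient) update, which is what makes the method attractive at this scale. A toy sketch on a small invented convex objective (not the PET likelihood itself):

```python
import numpy as np

def mirror_descent_simplex(grad, x0, steps, t):
    # Entropic mirror descent: the exponentiated-gradient update followed by
    # renormalization keeps every iterate strictly inside the simplex.
    x = x0.copy()
    for _ in range(steps):
        x = x * np.exp(-t * grad(x))
        x /= x.sum()
    return x

# Toy convex objective over the simplex: f(x) = 0.5*||x - c||^2, c on the simplex,
# so the minimizer is c itself. Numbers are illustrative.
c = np.array([0.2, 0.5, 0.3])
f = lambda x: 0.5 * float(np.sum((x - c) ** 2))
g = lambda x: x - c

x0 = np.ones(3) / 3
x = mirror_descent_simplex(g, x0, 200, 0.5)
```

The ordered-subsets variant in the paper applies the same update using gradients of one data subset at a time; this sketch uses the full gradient for clarity.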
A Theoretical Study of the Contrast Recovery and Variance of MAP Reconstructions From PET Data
IEEE Trans. Med. Imag., 1999
Cited by 44 (8 self)
We examine the spatial resolution and variance properties of PET images reconstructed using maximum a posteriori (MAP) or penalized-likelihood methods. Resolution is characterized by the contrast recovery coefficient (CRC) of the local impulse response. Simplified approximate expressions are derived for the local impulse response CRCs and variances for each voxel. Using these results we propose a practical scheme for selecting spatially variant smoothing parameters to optimize lesion detectability through maximization of the local CRC-to-noise ratio in the reconstructed image.

I. INTRODUCTION
PET image reconstruction algorithms based on maximum likelihood (ML) or maximum a posteriori (MAP) principles can produce improved spatial resolution and noise properties in comparison to conventional filtered backprojection (FBP) methods. It is often important to be able to quantify this improvement in terms of the resolution (or bias) and variance of the resulting images. These measures can be ...
A penalized-likelihood image reconstruction method for emission tomography, compared to post-smoothed maximum-likelihood with matched spatial resolution
IEEE Trans. Med. Imaging
Cited by 39 (14 self)
Abstract—Regularization is desirable for image reconstruction in emission tomography. A powerful regularization method is the penalized-likelihood (PL) reconstruction algorithm (or equivalently, maximum a posteriori reconstruction), where the sum of the likelihood and a noise-suppressing penalty term (or Bayesian prior) is optimized. Usually, this approach yields position-dependent resolution and bias. However, for some applications in emission tomography, a shift-invariant point spread function would be advantageous. Recently, a new method has been proposed, in which the penalty term is tuned in every pixel to impose a uniform local impulse response. In this paper, an alternative way to tune the penalty term is presented. We performed positron emission tomography and single photon emission computed tomography simulations to compare the performance of the new method to that of the post-smoothed maximum-likelihood (ML) approach, using the impulse response of the former method as the post-smoothing filter for the latter. For this experiment, the noise properties of the PL algorithm were not superior to those of post-smoothed ML reconstruction. Index Terms—Bayesian reconstruction, PET, regularization, SPECT, tomography.
Statistical approaches in quantitative positron emission tomography
 Statistics and Computing
Cited by 38 (3 self)
Positron emission tomography is a medical imaging modality for producing 3D images of the spatial distribution of biochemical tracers within the human body. The images are reconstructed from data formed through detection of radiation resulting from the emission of positrons from radioisotopes tagged onto the tracer of interest. These measurements are approximate line integrals from which the image can be reconstructed using analytical inversion formulae. However these direct methods do not allow accurate modeling either of the detector system or of the inherent statistical fluctuations in the data. Here we review recent progress in developing statistical approaches to image estimation that can overcome these limitations. We describe the various components of the physical model and review different formulations of the inverse problem. The wide range of numerical procedures for solving these problems is then reviewed. Finally, we describe recent work aimed at quantifying the quality of the resulting images, both in terms of classical measures of estimator bias and variance, and also using measures that are of more direct clinical relevance.
Convergent incremental optimization transfer algorithms: Application to tomography
 IEEE Trans. Med. Imag., Submitted
Cited by 37 (15 self)
Abstract—No convergent ordered subsets (OS) type image reconstruction algorithms for transmission tomography have been proposed to date. In contrast, in emission tomography, there are two known families of convergent OS algorithms: methods that use relaxation parameters (Ahn and Fessler, 2003), and methods based on the incremental expectation-maximization (EM) approach (Hsiao et al., 2002). This paper generalizes the incremental EM approach by introducing a general framework that we call “incremental optimization transfer.” Like incremental EM methods, the proposed algorithms accelerate convergence speeds and ensure global convergence (to a stationary point) under mild regularity conditions without requiring inconvenient relaxation parameters. The general optimization transfer framework enables the use of a very broad family of non-EM surrogate functions. In particular, this paper provides the first convergent OS-type algorithm for transmission tomography. The general approach is applicable to both monoenergetic and polyenergetic transmission scans as well as to other image reconstruction problems. We propose a particular incremental optimization transfer method for (non-concave) penalized-likelihood (PL) transmission image reconstruction by using separable paraboloidal surrogates (SPS). Results show that the new “transmission incremental optimization transfer (TRIOT)” algorithm is faster than nonincremental ordinary SPS and even OS-SPS yet is convergent.
Globally convergent image reconstruction for emission tomography using relaxed ordered subsets algorithms
IEEE Trans. Med. Imag., 2003
Cited by 34 (16 self)
We present two types of globally convergent relaxed ordered subsets (OS) algorithms for penalized-likelihood image reconstruction in emission tomography: modified block sequential regularized expectation-maximization (BSREM) and relaxed OS separable paraboloidal surrogates (OS-SPS). The global convergence proof of the existing BSREM (De Pierro and Yamagishi, 2001) required a few a posteriori assumptions. By modifying the scaling functions of BSREM, we are able to prove the convergence of the modified BSREM under realistic assumptions. Our modification also makes step-size selection more convenient. In addition, we introduce relaxation into the OS-SPS algorithm (Erdoğan and Fessler, 1999) that otherwise would converge to a limit cycle. We prove the global convergence of diagonally scaled incremental gradient methods of which the relaxed OS-SPS is a special case; main results of the proofs are from (Nedić and Bertsekas, 2001) and (Correa and Lemaréchal, 1993). Simulation results showed that both new algorithms achieve global convergence yet retain the fast initial convergence speed of conventional unrelaxed ordered subsets algorithms.
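The role of the relaxation parameters can be seen on a scalar toy problem: with a fixed step, cycling through the subsets settles into a limit cycle offset from the minimizer, while a diminishing relaxation sequence drives the incremental iteration to the exact solution. All numbers below are invented for illustration:

```python
import numpy as np

# Toy "ordered subsets" objective: f(x) = sum_m 0.5*(a_m*x - b_m)^2,
# with each term treated as one subset. Values are illustrative.
a = np.array([1.0, 2.0, 3.0])
b = np.array([2.0, 2.0, 3.0])
x_star = (a @ b) / (a @ a)            # exact minimizer of the full objective

def run(relaxation):
    x = 0.0
    for n in range(1, 501):           # 500 cycles through the subsets
        lam = relaxation(n)
        for m in range(len(a)):       # one incremental gradient step per subset
            x -= lam * a[m] * (a[m] * x - b[m])
    return x

x_fixed = run(lambda n: 0.1)          # constant step: stalls at a limit cycle
x_relax = run(lambda n: 0.1 / n)      # diminishing relaxation: converges to x_star
```

This is the generic incremental-gradient mechanism the convergence proofs formalize; the paper's diagonally scaled algorithms replace the scalar step with image-dependent scaling functions.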