## Convergent incremental optimization transfer algorithms: Application to tomography

### Download Links

- [www.eecs.umich.edu]
- [www-rcf.usc.edu]
- [web.eecs.umich.edu]
- DBLP

### Other Repositories/Bibliography

Venue: IEEE Trans. Med. Imag. (submitted)

Citations: 24 (10 self)

### BibTeX

```bibtex
@ARTICLE{Ahn_convergentincremental,
  author  = {Sangtae Ahn and Jeffrey A. Fessler and Doron Blatt and Alfred O. Hero},
  title   = {Convergent incremental optimization transfer algorithms: Application to tomography},
  journal = {IEEE Trans. Med. Imag., Submitted},
  year    = {}
}
```

### Abstract

No convergent ordered-subsets (OS) type image reconstruction algorithm for transmission tomography has been proposed to date. In contrast, in emission tomography there are two known families of convergent OS algorithms: methods that use relaxation parameters (Ahn and Fessler, 2003), and methods based on the incremental expectation-maximization (EM) approach (Hsiao et al., 2002). This paper generalizes the incremental EM approach by introducing a general framework that we call “incremental optimization transfer.” Like incremental EM methods, the proposed algorithms accelerate convergence and ensure global convergence (to a stationary point) under mild regularity conditions, without requiring inconvenient relaxation parameters. The general optimization transfer framework enables the use of a very broad family of non-EM surrogate functions. In particular, this paper provides the first convergent OS-type algorithm for transmission tomography. The general approach is applicable to both monoenergetic and polyenergetic transmission scans, as well as to other image reconstruction problems. We propose a particular incremental optimization transfer method for (nonconcave) penalized-likelihood (PL) transmission image reconstruction using separable paraboloidal surrogates (SPS). Results show that the new “transmission incremental optimization transfer (TRIOT)” algorithm is faster than nonincremental ordinary SPS and even OS-SPS, yet is convergent.
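To make the incremental optimization transfer idea concrete, here is a minimal 1-D sketch under illustrative assumptions: quadratic subobjectives Φ_m(x) = −(x − a_m)²/2 and a conservative surrogate curvature c ≥ 1. The function and variable names are hypothetical, not the paper's notation; the real algorithm operates on images with SPS surrogates, while this toy only shows the loop structure (one surrogate expansion point per subset, updated incrementally):

```python
# Minimal 1-D sketch of incremental optimization transfer (illustrative only).
# Assumptions: quadratic subobjectives Phi_m(x) = -(x - a_m)^2 / 2 and a
# conservative quadratic surrogate with curvature c >= 1 built at each
# subset's own expansion point states[m].

def incremental_opt_transfer(a, c=2.0, x0=0.0, n_iters=50):
    """Each subobjective m keeps its own surrogate expansion point states[m].
    Every sub-iteration maximizes the sum of the M quadratic surrogates
    phi_m(x; states[m]) in closed form, then refreshes only states[m]."""
    M = len(a)
    states = [x0] * M                # one expansion point per data subset
    x = x0
    for _ in range(n_iters):
        for m in range(M):
            # argmax_x sum_m [Phi_m'(s_m)(x - s_m) - (c/2)(x - s_m)^2],
            # with Phi_m'(s_m) = a_m - s_m, is a simple average:
            x = sum(s + (am - s) / c for am, s in zip(a, states)) / M
            states[m] = x            # incremental update of the m-th state
    return x

# The maximizer of sum_m Phi_m(x) is the mean of a; the iterates approach it.
x_hat = incremental_opt_transfer([1.0, 3.0])
```

With the exact curvature (c = 1) the surrogates coincide with the subobjectives and one sweep solves this toy problem; larger c slows the iterates but preserves the monotone increase of the augmented objective, which is the property the paper's convergence proof builds on.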

### Citations

9033 | Maximum likelihood from incomplete data via the EM algorithm - Dempster, Laird, et al. - 1977

Citation context: ...nt algorithms. In this paper we generalize the incremental EM algorithms by introducing an approach called “incremental optimization transfer”; this is akin to the generalization of the EM algorithms [30] by the optimization transfer principles [4]. In fact, the broad family of “incremental optimization transfer algorithms” includes the ordinary optimization transfer algorithms (e.g., EM), also referr...

1168 | Linear and Nonlinear Programming - Luenberger - 1984

866 | Nonlinear Programming: Theory and Algorithms (2nd Edition) - Bazaraa, Sherali, et al. - 1993

805 | A view of the EM algorithm that justifies incremental, sparse, and other variants - Neal, Hinton - 1998

Citation context: ...the complete data, respectively. Defining the augmented objective function as in (5) and then alternating between updating x and one of the x̄m’s as in Table I leads to the incremental EM algorithms [3], [28]. The COSEM algorithm [2], [27], a special case of the incremental EM for emission tomography, can be readily derived. APPENDIX B: GLOBAL CONVERGENCE PROOF. In this appendix we prove the convergen...

803 | Nonlinear Programming (Athena Scientific) - Bertsekas

303 | Maximum Likelihood Reconstruction for Emission Tomography - Shepp, Vardi - 1982

Citation context: ...the applicability of our convergence proofs to the emission case. The convergence proofs in Appendix B do not apply to classical ML-EM and COSEM for the emission case in their original forms in [32], [49] and [2], [27] respectively, since the EM surrogates used in those algorithms blow up to (negative) infinity on the boundary of the nonnegativity constraint set and therefore they violate the aforement...

194 | Accelerated image reconstruction using ordered subsets of projection data - Hudson, Larkin - 1994

Citation context: ...subobjective functions: Φm(x) = Σ_{i∈Sm} hi([Ax]i) − β R(x), m = 1, ..., M, where {Sm}_{m=1}^{M} is a partition of {1, ..., N}. We use the usual subsets corresponding to downsampled projection angles [6]. Consider the following separable quadratic surrogate φm for the subobjective function Φm: φm(x; x̄) = Φm(x̄) + ∇Φm(x̄)′(x − x̄) − (1/2)(x − x̄)′ C̆m(x̄)(x − x̄) (13), with C̆m(x) = diag_j{c̆mj(x)}...

153 | EM reconstruction algorithms for emission and transmission tomography (J Comput Assist Tomogr) - Lange, Carson - 1984

Citation context: ...al surrogates (SPS) [5]. Such quadratic surrogates simplify the maximization. In contrast, the standard EM surrogates for transmission tomography do not have a closed-form maximizer in the “M-step” [32]. The proposed “transmission incremental optimization transfer (TRIOT)” algorithm is convergent yet converges faster than ordinary SPS [5]; it can be further accelerated by the enhancement method in [...

146 | Algebraic reconstruction techniques (ART) for three-dimensional electron microscopy and x-ray photography (J. Theoret. Biol.) - Gordon, Bender, et al. - 1970

Citation context: ...o perform the update iteration incrementally by sequentially (or sometimes randomly [15], [16]) using a subset of the data. Row-action methods [18] including algebraic reconstruction techniques (ART) [19], [20] can also be viewed as OS-type algorithms in which each subset corresponds to a single measurement. The OS algorithms apply successfully to problems where an objective function of interest is a ...

137 | Space-Alternating Generalized Expectation-Maximization Algorithm - Fessler, Hero - 1994

Citation context: ...ogate: φm(x; x̄) = E[log f(Zm; x) | Ym = ym; x̄], (23) (footnote: a random vector Z with probability distribution f(z; x) is called an admissible complete-data vector for f(y; x) if f(y, z; x) = f(y|z) f(z; x) [37], [38]; a special case is that Y is a deterministic function of Z) which also satisfies the minorization conditions in (3), where Y = (Y1, ..., YM) and Z = (Z1, ..., ZM) are some decomposition...

128 | A local update strategy for iterative reconstruction from projections - Sauer, Bouman - 1993

Citation context: ...In addition, it is easily implemented for system models that use factored system matrices [34], [35], whereas pixel-grouped coordinate ascent based methods require column access of the system matrix [36]–[39]. Section II describes the incremental optimization transfer algorithms in a general framework and discusses their convergence properties. Section III develops incremental optimization transfer a...

126 | Optimization transfer using surrogate objective functions (with discussion) - Lange, Hunter, et al. - 2000

Citation context: ...quiring inconvenient relaxation parameters. The general optimization transfer framework allows the use of a very broad family of non-EM surrogate functions, enabling the development of new algorithms [4]. In particular, this paper provides the first convergent OS-type algorithm for transmission tomography. The general approach is applicable to both monoenergetic and polyenergetic transmission scans a...

88 | Penalized maximum-likelihood image reconstruction using space-alternating generalized EM algorithms - Fessler, Hero - 1995

Citation context: ...rithm are stationary points of the problem [47, p. 228] or that limits are stationary points [48, p. 312], irrespective of starting points. We adopt the former convention here. (called “ML-EM-3” in [38]) and COSEM algorithms. Moreover, the modified EM surrogate is known to accelerate convergence rates [38]. See [52, Appendix F] for an asymptotic local convergence rate analysis and an illustrative on...

78 | Monotonic algorithms for transmission tomography - Erdoğan, Fessler - 1999

Citation context: ...tal EM algorithms have been applied to emission tomography [2], [13], [27], [28] (footnote: one of these conditions is the (strict) concavity of the objective function, which excludes the nonconcave transmission tomography problem [24]). Recently, Blatt et al. proposed a convergent incremental gradient method, called incremental aggregated gradient (IAG), that does not require relaxation parameters [29]. The...

74 | A tutorial on MM algorithms - Hunter, Lange - 2004

Citation context: ...otes the n-ary Cartesian product over the set X, such that (i) it is easier to maximize with respect to the first argument than Φm, and (ii) it satisfies the following “minorization” conditions [24], [41]: φm(x; x) = Φm(x) for all x ∈ X, and φm(x; x̄) ≤ Φm(x) for all x, x̄ ∈ X (footnote: such functions are said to be additive-separable in [14], and partially separable [40] when each Φm(x) is a function of fewer compone...
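As a brief aside on why these minorization conditions matter: in the nonincremental case (M = 1), they yield the standard one-line monotonicity argument for optimization transfer. A sketch in the snippet's notation, writing $x^{k+1}$ for the maximizer of $\phi(\,\cdot\,; x^{k})$ over $\mathcal{X}$:

```latex
\Phi(x^{k+1}) \;\ge\; \phi(x^{k+1};\, x^{k}) \;\ge\; \phi(x^{k};\, x^{k}) \;=\; \Phi(x^{k})
```

The first inequality is the minorization bound $\phi(x; \bar{x}) \le \Phi(x)$, and the second holds because $x^{k+1}$ maximizes $\phi(\,\cdot\,; x^{k})$; hence each iteration monotonically increases $\Phi$.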

71 | Incremental subgradient methods for nondifferentiable optimization - Nedić, Bertsekas - 2001

Citation context: ...dient type algorithms are also found in convex programming [14]–[17]. The ordered subsets (or incremental) idea is to perform the update iteration incrementally by sequentially (or sometimes randomly [15], [16]) using a subset of the data. Row-action methods [18] including algebraic reconstruction techniques (ART) [19], [20] can also be viewed as OS-type algorithms in which each subset corresponds to ...

59 | Ordered subsets algorithms for transmission tomography (Phys. Med. Biol. 44(11)) - Erdoğan, Fessler - 1999

Citation context: ...roblems. We propose a particular incremental optimization transfer method for (nonconcave) penalized-likelihood (PL) transmission image reconstruction by using separable paraboloidal surrogates (SPS) [5], which yield closed-form maximization steps. We found it is very effective to achieve fast convergence rates by starting with an OS algorithm with a large number of subsets and switching to the new “t...

49 | Convergence rate of incremental subgradient algorithms (Stochastic Optimization: Algorithms and Applications) - Nedić, Bertsekas - 2000

Citation context: ...type algorithms are also found in convex programming [14]–[17]. The ordered subsets (or incremental) idea is to perform the update iteration incrementally by sequentially (or sometimes randomly [15], [16]) using a subset of the data. Row-action methods [18] including algebraic reconstruction techniques (ART) [19], [20] can also be viewed as OS-type algorithms in which each subset corresponds to a sing...

46 | Algebraic reconstruction techniques can be made computationally efficient - Herman, Meyer - 1993

Citation context: ...orm the update iteration incrementally by sequentially (or sometimes randomly [15], [16]) using a subset of the data. Row-action methods [18] including algebraic reconstruction techniques (ART) [19], [20] can also be viewed as OS-type algorithms in which each subset corresponds to a single measurement. The OS algorithms apply successfully to problems where an objective function of interest is a sum of...

45 | Row-action methods for huge and sparse systems and their applications (SIAM Review 23) - Censor - 1981

Citation context: ...[14]–[17]. The ordered subsets (or incremental) idea is to perform the update iteration incrementally by sequentially (or sometimes randomly [15], [16]) using a subset of the data. Row-action methods [18] including algebraic reconstruction techniques (ART) [19], [20] can also be viewed as OS-type algorithms in which each subset corresponds to a single measurement. The OS algorithms apply successfully ...

44 | Component averaging: an efficient iterative parallel algorithm for large and sparse unstructured problems - Censor, Gordon, et al. - 2001

38 | A new class of incremental gradient methods for least squares problems - Bertsekas - 1997

38 | High-resolution 3D Bayesian image reconstruction using the microPET small-animal scanner - Qi, Leahy, et al. - 1998

Citation context: ...(see Section III for details). It is parallelizable, and the nonnegativity constraint is naturally enforced. In addition, it is easily implemented for system models that use factored system matrices [34], [35], whereas pixel-grouped coordinate ascent based methods require column access of the system matrix [36]–[39]. Section II describes the incremental optimization transfer algorithms in a general fr...

33 | Statistical Image Reconstruction for Polyenergetic X-Ray Computed Tomography - Elbakri, Fessler - 2002

Citation context: ...eneral method which can be applied to a variety of problems where an objective function is a sum of functions as in (1) and the OS approach applies: for example, polyenergetic transmission tomography [44], confocal microscopy [45], and emission tomography [46]. For incremental optimization transfer algorithms one must store M vectors {x̄m}_{m=1}^{M}, so one needs more memory compared to ordinary OS algo...

31 | Convergence of approximate and incremental subgradient methods for convex optimization - Kiwiel

Citation context: ...ts expectation maximization (OS-EM) provides an order-of-magnitude acceleration over its non-OS counterpart, EM [6]. The incremental gradient type algorithms are also found in convex programming [14]–[17]. The ordered subsets (or incremental) idea is to perform the update iteration incrementally by sequentially (or sometimes randomly [15], [16]) using a subset of the data. Row-action methods [18] incl...

29 | Accelerating the EMML algorithm and related iterative algorithms by rescaled block-iterative methods - Byrne - 1998

Citation context: ...nce in the sense that less computation is required to achieve nearly the same level of objective increase as with non-OS methods. However, ordinary (unrelaxed) OS algorithms such as OS-EM [6], RBI-EM [8], and OS-SPS (or OSTR in a context of transmission tomography) [5] generally do not converge to an optimal solution but rather approach a suboptimal limit cycle that consists of as many points as ther...

28 | A row-action alternative to the EM algorithm for maximizing likelihoods in emission tomography - Browne, De Pierro - 1996

Citation context: ...e relaxation parameters, methods based on the incremental EM approach, and incremental aggregated gradient (IAG) methods. Relaxation parameters are used widely to render OS algorithms convergent [1], [7], [9]–[12], [14]–[16], [21]–[23]. Suitably relaxed algorithms can be shown to converge to an optimal solution under certain regularity conditions [1]. However, since relaxation parameters should be ...

27 | Statistical approaches in quantitative positron emission tomography - Leahy, Qi - 2000

Citation context: ...Section III for details). It is parallelizable, and the nonnegativity constraint is naturally enforced. In addition, it is easily implemented for system models that use factored system matrices [34], [35], whereas pixel-grouped coordinate ascent based methods require column access of the system matrix [36]–[39]. Section II describes the incremental optimization transfer algorithms in a general framewor...

26 | Globally Convergent Image Reconstruction for Emission Tomography Using Relaxed Ordered Subsets Algorithms - Ahn, Fessler

Citation context: ...lgorithms for transmission tomography have been proposed to date. In contrast, in emission tomography, there are two known families of convergent OS algorithms: methods that use relaxation parameters [1], and methods based on the incremental expectation maximization (EM) approach [2]. This paper generalizes the incremental EM approach [3] by introducing a general framework that we call “incremental o...

26 | Strong underrelaxation in Kaczmarz’s method for inconsistent systems - Censor, Eggermont, et al. - 1983

Citation context: ...ethods based on the incremental EM approach, and incremental aggregated gradient (IAG) methods. Relaxation parameters are used widely to render OS algorithms convergent [1], [7], [9]–[12], [14]–[16], [21]–[23]. Suitably relaxed algorithms can be shown to converge to an optimal solution under certain regularity conditions [1]. However, since relaxation parameters should be scheduled to converge to ze...

26 | Mathematical methods in image reconstruction - Natterer, Wubbeling - 2001

Citation context: ...in those algorithms blow up to (negative) infinity on the boundary of the nonnegativity constraint set and therefore they violate the aforementioned sufficient conditions. The readers are referred to [50] and [51] for convergence proofs for ML-EM and COSEM respectively for the emission case. However, to avoid the boundary problem one can use a slightly modified EM surrogate in [38, Eq. (20)] for the u...

25 | Fast EM-like methods for maximum a posteriori estimates in emission tomography - De Pierro, Yamagishi - 2001

Citation context: ...on parameters, methods based on the incremental EM approach, and incremental aggregated gradient (IAG) methods. Relaxation parameters are used widely to render OS algorithms convergent [1], [7], [10]–[12], [14], [17]–[19], [24]–[26]. Suitably relaxed algorithms can be shown to converge to an optimal solution under certain regularity conditions [1]. However, since relaxation parameters should be sche...

18 | Decomposition into functions in the minimization problem - Kibardin - 1980

Citation context: ...subsets expectation maximization (OS-EM) provides an order-of-magnitude acceleration over its non-OS counterpart, EM [6]. The incremental gradient type algorithms are also found in convex programming [14]–[17]. The ordered subsets (or incremental) idea is to perform the update iteration incrementally by sequentially (or sometimes randomly [15], [16]) using a subset of the data. Row-action methods [18]...

11 | Convergent block-iterative method for general convex cost functions - Kudo, Nakazawa, et al. - 1999

Citation context: ...axation parameters, methods based on the incremental EM approach, and incremental aggregated gradient (IAG) methods. Relaxation parameters are used widely to render OS algorithms convergent [1], [7], [9]–[12], [14]–[16], [21]–[23]. Suitably relaxed algorithms can be shown to converge to an optimal solution under certain regularity conditions [1]. However, since relaxation parameters should be sched...

11 | A new convergent MAP reconstruction algorithm for emission tomography using ordered subsets and separable surrogates - Hsiao, Rangarajan, et al.

Citation context: ...ithms do not require user-specified relaxation parameters [3]. They are convergent yet faster than ordinary EM algorithms although slower initially than nonconvergent OS-EM type algorithms [2], [26], [27]. Such incremental EM algorithms have been applied to emission tomography (footnote: one of these conditions is the (strict) concavity of the objective function, which excludes the nonconcave transmission tomo...

11 | The Information Geometry of EM Variants for Speech and Image Processing - Gunawardana - 2001

Citation context: ...complete data, respectively. Defining the augmented objective function as in (5) and then alternating between updating x and one of the x̄m’s as in Table I leads to the incremental EM algorithms [3], [28]. The COSEM algorithm [2], [27], a special case of the incremental EM for emission tomography, can be readily derived. APPENDIX B: GLOBAL CONVERGENCE PROOF. In this appendix we prove the convergence of ...

10 | An Accelerated Convergent Ordered Subset Algorithm for Emission Tomography (Phys. Med. Biol.) - Hsiao, Rangarajan, et al. - 2004

Citation context: ...]. The proposed “transmission incremental optimization transfer (TRIOT)” algorithm is convergent yet converges faster than ordinary SPS [5]; it can be further accelerated by the enhancement method in [33] or by initializing through a few iterations of OS-SPS (see Section III for details). It is parallelizable, and the nonnegativity constraint is naturally enforced. In addition, it is easily implemente...

9 | A provably convergent OS-EM like reconstruction algorithm for emission tomography - Hsiao, Rangarajan, et al.

Citation context: ...al EM algorithms do not require user-specified relaxation parameters [3]. They are convergent yet faster than ordinary EM algorithms although slower initially than nonconvergent OS-EM type algorithms [2], [26], [27]. Such incremental EM algorithms have been applied to emission tomography (footnote: one of these conditions is the (strict) concavity of the objective function, which excludes the nonconcave trans...

9 | Image recovery using partitioned-separable paraboloidal surrogate coordinate ascent algorithms - Sotthivirat, Fessler

Citation context: ...applied to a variety of problems where an objective function is a sum of functions as in (1) and the OS approach applies: for example, polyenergetic transmission tomography [44], confocal microscopy [45], and emission tomography [46]. For incremental optimization transfer algorithms one must store M vectors {x̄m}_{m=1}^{M}, so one needs more memory compared to ordinary OS algorithms. This can be a prac...

8 | Rejoinder to discussion of “Optimization transfer using surrogate objective functions” - Hunter, Lange - 2000

Citation context: ...mily of “incremental optimization transfer algorithms” includes the ordinary optimization transfer algorithms (e.g., EM), also referred to as MM (minorize-maximize or majorize-minimize) algorithms in [31], as a special case where the objective function consists of only one subobjective function. In the incremental optimization transfer approach, for each subobjective function, we define an augmented v...

7 | A globally convergent regularized ordered-subset EM algorithm for list-mode reconstruction - Khurd, Hsiao, et al. - 2004

Citation context: ...block iterative or incremental gradient methods, have been very popular in the medical imaging community for tomographic image reconstruction due to their remarkably fast “convergence” rates [1], [5]–[13]. For example, ordered subsets expectation maximization (OS-EM) provides an order-of-magnitude acceleration over its non-OS counterpart, EM [6]. The incremental gradient type algorithms are also found...

7 | Relaxed ordered-subset algorithm for penalized-likelihood image restoration - Sotthivirat, Fessler - 2003

Citation context: ...s based on the incremental EM approach, and incremental aggregated gradient (IAG) methods. Relaxation parameters are used widely to render OS algorithms convergent [1], [7], [9]–[12], [14]–[16], [21]–[23]. Suitably relaxed algorithms can be shown to converge to an optimal solution under certain regularity conditions [1]. However, since relaxation parameters should be scheduled to converge to zero to...

6 | A fast fully 4-D incremental gradient reconstruction algorithm for list mode PET data - Li, Asma, et al.

Citation context: ...ion parameters, methods based on the incremental EM approach, and incremental aggregated gradient (IAG) methods. Relaxation parameters are used widely to render OS algorithms convergent [1], [7], [9]–[12], [14]–[16], [21]–[23]. Suitably relaxed algorithms can be shown to converge to an optimal solution under certain regularity conditions [1]. However, since relaxation parameters should be scheduled ...

5 | Large scale unconstrained optimization (in The State of the Art in Numerical Analysis) - Nocedal - 1997

4 | Convergence of EM variants - Byrne, Gunawardana - 1999

Citation context: ...algorithms do not require user-specified relaxation parameters [3]. They are convergent yet faster than ordinary EM algorithms although slower initially than nonconvergent OS-EM type algorithms [2], [26], [27]. Such incremental EM algorithms have been applied to emission tomography (footnote: one of these conditions is the (strict) concavity of the objective function, which excludes the nonconcave transmissio...

4 | Properties of optimization transfer algorithms on convex feasible sets - Jacobson, Fessler

Citation context: ...addition, it is easily implemented for system models that use factored system matrices [34], [35], whereas pixel-grouped coordinate ascent based methods require column access of the system matrix [36]–[39]. Section II describes the incremental optimization transfer algorithms in a general framework and discusses their convergence properties. Section III develops incremental optimization transfer algori...

4 | Comments on “Efficient training algorithms for HMMs using incremental estimation” - Byrne, Gunawardana - 2000

Citation context: ...nt algorithm for maximizing F with respect to (x; x̄1, ..., x̄M) [42, p. 270]. It monotonically increases the augmented objective function F, but not necessarily the original objective function Φ [43]. The incremental approach (M > 1) usually leads to faster convergence rates than nonincremental methods (M = 1) [3]. The incremental EM algorithms [3], [28] including COSEM [2], [27] are a spec...

3 | An incremental gradient method that converges with a constant step size - Blatt, Hero, et al.

Citation context: ...blem [24]. [2], [13], [27], [28]. Recently, Blatt et al. proposed a convergent incremental gradient method, called incremental aggregated gradient (IAG), that does not require relaxation parameters [29]. The IAG method computes a single subset gradient for each update but aggregates it with the stored subset gradients that were computed in previous iterations. The use of the aggregated gradient to a...

3 | Fast hybrid algorithms for PET image reconstruction - Li, Ahn, et al.

Citation context: ...ems where an objective function is a sum of functions as in (1) and the OS approach applies: for example, polyenergetic transmission tomography [44], confocal microscopy [45], and emission tomography [46]. For incremental optimization transfer algorithms one must store M vectors {x̄m}_{m=1}^{M}, so one needs more memory compared to ordinary OS algorithms. This can be a practical limitation when M is ver...

2 | Block-gradient method for image reconstruction in emission tomography - Kudo, Nakazawa, et al. - 2000