
## Probabilistic Generative Modelling (2003)

Venue: 13th Scandinavian Conference on Image Analysis (SCIA), Gothenburg, Sweden, volume 2749 of Lecture Notes in Computer Science

Citations: 2 (1 self)

### Citations

793 | Statistical Shape Analysis.
- Dryden, Mardia
- 1998
Citation Context: ...emphasis on the large scale variation, using its inverse puts emphasis on small scale variation. 2 Methods In the following two sections we will describe how to use two methods, maximum autocorrelation factors [2] and minimum noise fractions [4], for decomposing the tangent space coordinates of a set of shapes into a low-dimensional subspace. 2 R. Larsen and K. B. Hilger The tangent space coordinates are obtained by a generalized Procrustes alignment [9, 10] followed by a projection of the full Procrustes coordinates into the tangent space to the shape space at the full Procrustes mean (e.g. [11]). Let the tangent space coordinates, x_i = (x_i11, ..., x_i1n, ..., x_id1, ..., x_idn)^T, for shapes i = 1, ..., p with j = 1, ..., n landmarks in d ∈ {2, 3} dimensions be organised in a p × dn data matrix X = [x_1 x_2 ... x_p]^T. Denote the Procrustes (sample) mean shape x̄ and let it be centered on (0,0) in 2D and (0,0,0) in 3D; further, let the origin of the tangent space coordinate system be the mean shape. Then X is doubly centered, i.e. columns as well as rows sum to zero. Additionally, it is assumed that the landmarks are sampled on curves (in 2D) and surfaces (in 3D) that allo...
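The p × dn layout and double-centering described in this excerpt can be sketched in a few lines of numpy. This is our own toy illustration (random stand-in landmarks, invented variable names), not code from the paper: rows are shapes, each row holds all x-coordinates followed by all y-coordinates, and centering is applied both across shapes (column means) and within each shape's coordinate blocks (row-block means).

```python
import numpy as np

# Toy stand-in for Procrustes-aligned tangent-space landmarks:
# p shapes, n landmarks, d = 2 dimensions (names are ours, not the paper's).
rng = np.random.default_rng(0)
p, n, d = 10, 6, 2
landmarks = rng.normal(size=(p, n, d))

# Flatten each shape to (x_1..x_n, y_1..y_n), giving a p x dn data matrix X.
X = landmarks.transpose(0, 2, 1).reshape(p, d * n)

# Subtract the sample mean shape: columns now sum to zero.
X = X - X.mean(axis=0)

# Centre each shape's centroid per coordinate block: rows now sum to zero,
# so X is doubly centered as stated in the excerpt.
for k in range(d):
    block = slice(k * n, (k + 1) * n)
    X[:, block] -= X[:, block].mean(axis=1, keepdims=True)
```

Because the column means are removed first, the per-row block centering leaves the column sums at zero, so both conditions hold simultaneously.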

302 | Procrustes methods in the statistical analysis of shape
- Goodall
- 1991
Citation Context: ...sentation. Bookstein proposed using bending energy and inverse bending energy as metrics in the tangent space [8]. Using the bending energy puts emphasis on the large scale variation, using its inverse puts emphasis on small scale variation. 2 Methods In the following two sections we will describe how to use two methods, maximum autocorrelation factors [2] and minimum noise fractions [4], for decomposing the tangent space coordinates of a set of shapes into a low-dimensional subspace. The tangent space coordinates are obtained by a generalized Procrustes alignment [9, 10] followed by a projection of the full Procrustes coordinates into the tangent space to the shape space at the full Procrustes mean (e.g. [11]). Let the tangent space coordinates, x_i = (x_i11, ..., x_i1n, ..., x_id1, ..., x_idn)^T, for shapes i = 1, ..., p with j = 1, ..., n landmarks in d ∈ {2, 3} dimensions be organised in a p × dn data matrix X = [x_1 x_2 ... x_p]^T. Denote the Procrustes (sample) mean shape x̄ and let it be centered on (0,0) in 2D and (0,0,0) in 3D; further, let the origin of the tangent space coordinate system be the mean shape. Then X is doubly centered, i.e. col...

276 | A transformation for ordering multispectral data in terms of image quality with implications for noise removal
- Green, Berman, et al.
- 1988
Citation Context: ...ent procedures for PCA using non-Euclidean metrics have been proposed. The maximum autocorrelation factor (MAF) transform proposed by Switzer [2] defines maximum spatial autocorrelation as the optimality criterion for extracting linear combinations of multispectral images. Contrary to this, PCA seeks linear combinations that exhibit maximum variance. Because imaged phenomena often exhibit some sort of spatial coherence, spatial autocorrelation is often a better optimality criterion than variance. We have previously adapted the MAF transform for analysis of tangent space shape coordinates [3]. In [4] the noise-adjusted PCA or the minimum noise fraction (MNF) transformations were used for decomposition of multispectral satellite images. The MNF transform is a PCA in a metric space defined by a noise covariance matrix estimated from the data. For image data the noise process covariance is conveniently estimated using spatial filtering. In [5] the MNF transform is applied to texture modelling in active appearance models [6], and in [7] to multivariate images in extracting a discriminatory representation. Bookstein proposed using bending energy and inverse bending energy as metrics in the tan...
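The MNF idea in this excerpt (a PCA in the metric of a noise covariance estimated by spatial filtering) amounts to a generalized eigenproblem. Below is a minimal sketch under our own assumptions, not the paper's implementation: rows of a toy matrix stand in for pixels in scan order, the noise covariance is estimated from lag-1 differences of neighbouring rows, and directions are ordered by decreasing signal-to-noise ratio.

```python
import numpy as np
from scipy.linalg import eigh

# Toy 4-band "image" flattened to rows (illustrative data, names are ours).
rng = np.random.default_rng(1)
X = rng.normal(size=(500, 4)) @ rng.normal(size=(4, 4))
X = X - X.mean(axis=0)

Sigma = np.cov(X, rowvar=False)            # total covariance
D = np.diff(X, axis=0)                     # lag-1 spatial difference process
Sigma_N = 0.5 * np.cov(D, rowvar=False)    # noise covariance estimate

# Generalized eigenproblem Sigma w = kappa Sigma_N w; sort descending so
# the first MNF has the highest SNR.  SNR of component i is kappa_i - 1.
vals, W = eigh(Sigma, Sigma_N)
order = np.argsort(vals)[::-1]
vals, W = vals[order], W[:, order]
snr = vals - 1
mnf = X @ W                                # MNF components
```

With this normalization the eigenvectors are orthonormal in the noise metric, which is what makes the transform a PCA in the noise-defined metric space.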

199 | Morphometric tools for landmark data
- Bookstein
- 1991
Citation Context: ...adjusted PCA or the minimum noise fraction (MNF) transformations were used for decomposition of multispectral satellite images. The MNF transform is a PCA in a metric space defined by a noise covariance matrix estimated from the data. For image data the noise process covariance is conveniently estimated using spatial filtering. In [5] the MNF transform is applied to texture modelling in active appearance models [6], and in [7] to multivariate images in extracting a discriminatory representation. Bookstein proposed using bending energy and inverse bending energy as metrics in the tangent space [8]. Using the bending energy puts emphasis on the large scale variation, using its inverse puts emphasis on small scale variation. 2 Methods In the following two sections we will describe how to use two methods, maximum autocorrelation factors [2] and minimum noise fractions [4], for decomposing the tangent space coordinates of a set of shapes into a low-dimensional subspace. The tangent space coordinates are obtained by a generalized Procrustes alignment [9, 10] followed by a projection of the full Procrustes coordinates into the tangent space to the shape space at ...

190 | Training Models of Shape from Sets of Examples
- Cootes, Taylor, et al.
- 1992
Citation Context: ...Abstract. The contribution of this paper is the adaptation of data-driven methods for decomposition of tangent shape variability proposed in a probabilistic framework. By Bayesian model selection we compare two generative model representations derived by principal components analysis and by maximum autocorrelation factors analysis. 1 Introduction For the analysis and interpretation of multivariate observations a standard method has been the application of principal component analysis (PCA) to extract latent variables. Cootes et al. applied PCA to the analysis of tangent space shape coordinates [1]. For various purposes different procedures for PCA using non-Euclidean metrics have been proposed. The maximum autocorrelation factor (MAF) transform proposed by Switzer [2] defines maximum spatial autocorrelation as the optimality criterion for extracting linear combinations of multispectral images. Contrary to this, PCA seeks linear combinations that exhibit maximum variance. Because imaged phenomena often exhibit some sort of spatial coherence, spatial autocorrelation is often a better optimality criterion than variance. We have previously adapted the MAF transform for analysis of tangent spa...

181 | Separation of a mixture of independent signals using time delayed correlations
- Molgedey, Schuster
- 1994
Citation Context: ...by the set of conjugate eigenvectors of Σ_∆ w.r.t. Σ, W = [w_1, ..., w_m], corresponding to the eigenvalues κ_1 ≤ ··· ≤ κ_m [2]. The resulting new uncorrelated variables are ordered so that the first MAF is the linear combination that exhibits maximum autocorrelation. The autocorrelation of the ith component is 1 − κ_i/2. We assume first and second order stationarity of the data. One problem now arises, namely how we should choose ∆. Switzer suggests that we estimate Σ_∆ for a shift in lag 1. Blind source separation by independent components analysis using the Molgedey-Schuster (MS-ICA) algorithm [12] is equivalent to MAF [3]. The purpose of this algorithm is to separate independent signals from linear mixings. MS-ICA does this by exploiting differences in autocorrelation structure between the independent signals. Kolenda et al. [13] use an iterative procedure for identifying the optimal lags based on the sum of pairwise absolute differences between the autocorrelations of the estimated independent components. In this study we use Switzer's original suggestion. This is based on the assumption that the noise is separated from the interesting latent variables in terms of autocorrelation alrea...
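The MAF recipe in this excerpt (conjugate eigenvectors of the lag-1 difference covariance Σ_∆ with respect to Σ, autocorrelation 1 − κ_i/2) can be sketched directly. The toy data below is our own: smooth sinusoidal signals mixed and corrupted with white noise, so the lag-1 difference process is dominated by the noise and the first MAF recovers a highly autocorrelated direction.

```python
import numpy as np
from scipy.linalg import eigh

# Toy mixed signals sampled along a 1-D index (illustrative, not the paper's data).
rng = np.random.default_rng(2)
t = np.linspace(0, 4 * np.pi, 400)
signal = np.column_stack([np.sin(t), np.cos(t), np.sin(2 * t)])
X = signal @ rng.normal(size=(3, 3)) + 0.1 * rng.normal(size=(400, 3))
X = X - X.mean(axis=0)

Sigma = np.cov(X, rowvar=False)
D = X[1:] - X[:-1]                      # lag-1 difference process (Switzer's choice)
Sigma_d = np.cov(D, rowvar=False)

# Conjugate eigenvectors of Sigma_d w.r.t. Sigma; eigh returns ascending
# kappa, so the most autocorrelated factor comes first.
kappa, W = eigh(Sigma_d, Sigma)
autocorr = 1 - kappa / 2                # autocorrelation of the i-th MAF
maf = X @ W                             # MAF components
```

Note the ordering convention is opposite to PCA: small generalized eigenvalues, not large ones, mark the interesting (smooth) components.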

101 | Automatic choice of dimensionality for PCA
- Minka
Citation Context: ...]. The resulting new variables are ordered so that the first MNF is the linear combination that exhibits maximum SNR. The ith MNF is the linear combination that exhibits the highest SNR subject to it being uncorrelated to the previous MNFs. The SNR of the ith component is κ_i − 1. If the matrices in Equations (3) and (4) are singular, the solution must be found in the affine support of the matrix in the denominator, e.g. by means of a generalized singular value decomposition. 2.3 Evaluation of point distribution models by probabilistic reconstruction Following Minka [14], we use a probabilistic principal components analysis model for the choice of dimensionality. Let a multivariate response X of p dimensions be modelled by a linear combination of a set of basis vectors h_i, i = 1, ..., k, plus noise: X = Σ_{i=1}^{k} h_i b_i + m + N = Hb + m + N (5), where N ∼ N(0, Σ_N) and b has dimension k < p. The vector m defines the mean of X, while H and Σ_N define its variance. For PCA the noise variance is spherical, i.e. Σ_N = vI_p. Furthermore, we assume a spherical Gaussian prior density for b, b ∼ N(0, I_k). For this model the maximum likelihood estimators for the model parameters...
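The probabilistic PCA model X = Hb + m + N with spherical noise Σ_N = vI can be fitted in closed form, which is what makes it usable for dimensionality selection. The sketch below uses the standard maximum-likelihood estimators (sample mean, leading eigenvectors, average of the discarded eigenvalues for v); the toy data and the choice k = 2 are ours, for illustration only.

```python
import numpy as np

# Toy data with 2-dimensional latent structure plus noise (ours, illustrative).
rng = np.random.default_rng(3)
n, p, k = 300, 5, 2
Z = rng.normal(size=(n, k)) @ rng.normal(size=(k, p))
X = Z + 0.2 * rng.normal(size=(n, p))

m = X.mean(axis=0)                                   # ML estimate of the mean
S = np.cov(X - m, rowvar=False)
evals, evecs = np.linalg.eigh(S)
evals, evecs = evals[::-1], evecs[:, ::-1]           # descending eigenvalues

v = evals[k:].mean()                                 # ML spherical noise variance
H = evecs[:, :k] * np.sqrt(np.maximum(evals[:k] - v, 0.0))

# Model covariance C = HH^T + vI and Gaussian log-likelihood of the data.
C = H @ H.T + v * np.eye(p)
sign, logdet = np.linalg.slogdet(C)
ll = -0.5 * n * (p * np.log(2 * np.pi) + logdet
                 + np.trace(np.linalg.solve(C, S * (n - 1) / n)))
```

Evaluating this log-likelihood for each candidate k (e.g. on held-out folds) is the basis for the model-complexity comparison discussed in the surrounding excerpts.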

46 | Min/max autocorrelation factors for multivariate spatial imagery
- Switzer
- 1985
Citation Context: ...esian model selection we compare two generative model representations derived by principal components analysis and by maximum autocorrelation factors analysis. 1 Introduction For the analysis and interpretation of multivariate observations a standard method has been the application of principal component analysis (PCA) to extract latent variables. Cootes et al. applied PCA to the analysis of tangent space shape coordinates [1]. For various purposes different procedures for PCA using non-Euclidean metrics have been proposed. The maximum autocorrelation factor (MAF) transform proposed by Switzer [2] defines maximum spatial autocorrelation as the optimality criterion for extracting linear combinations of multispectral images. Contrary to this, PCA seeks linear combinations that exhibit maximum variance. Because imaged phenomena often exhibit some sort of spatial coherence, spatial autocorrelation is often a better optimality criterion than variance. We have previously adapted the MAF transform for analysis of tangent space shape coordinates [3]. In [4] the noise-adjusted PCA or the minimum noise fraction (MNF) transformations were used for decomposition of multispectral satellite images. Th...

36 | Computer-aided diagnosis in chest radiography: a survey
- Ginneken, Romeny, et al.
- 2001
Citation Context: ...that we propose on a dataset consisting of 2D annotations of the outline of the right and left lung from 115 standard PA chest radiographs. The chest radiographs were randomly selected from a tuberculosis screening program and contained normal as well as abnormal cases. The annotation process was conducted by identification of three anatomical landmarks on each lung outline followed by equidistant distribution of pseudo-landmarks along the three resulting segments of the outline. In Fig. 1(b) the landmarks used for annotation are shown. Each lung field is annotated independently by two observers [15]. In Fig. 2 results of a five-fold CV study of the log-likelihood are shown. The figure shows the average performance of the generative PCA and MAF models and the one-standard-deviation bounds for a given model complexity. For the PCA-based model the LL analysis attains its maximum at 18 dimensions, whereas the MAF has its maximum at 30 dimensions. Truncation of the models is typically obtained by tracking the lower one-standard-deviation bound backward, leading to model complexities of 13 and 22 dimensions for the PCA and the MAF analysis, respectively. Although the MAF basis indicates a higher ...
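The truncation rule described here ("tracking the lower one-standard-deviation bound backward") can be read as a one-standard-error rule: walk back from the dimension with the best mean cross-validated log-likelihood and keep the smallest dimension whose mean still clears the lower bound at the maximum. This is our interpretation, sketched on an invented toy curve; the function name and numbers are not the paper's.

```python
import numpy as np

def one_sd_choice(dims, mean_ll, sd_ll):
    """Smallest dimension (walking back from the best) whose mean CV
    log-likelihood stays above mean_max - sd_max. Hypothetical helper."""
    best = int(np.argmax(mean_ll))
    bound = mean_ll[best] - sd_ll[best]
    choice = best
    for i in range(best, -1, -1):
        if mean_ll[i] >= bound:
            choice = i
        else:
            break
    return dims[choice]

# Toy CV curve peaking at 18 dimensions, constant std of 2 (invented data).
dims = np.arange(5, 35)
mean_ll = -((dims - 18) ** 2) / 10.0
sd_ll = np.full_like(mean_ll, 2.0)
chosen = one_sd_choice(dims, mean_ll, sd_ll)
```

On this toy curve the rule steps back from the peak at 18 to a smaller model, mirroring how the excerpt's 18- and 30-dimensional maxima shrink to 13 and 22.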

26 | Signal detection using ICA: application to chat room topic spotting
- Kolenda, Hansen, et al.
- 2001
Citation Context: ...bits maximum autocorrelation. The autocorrelation of the ith component is 1 − κ_i/2. We assume first and second order stationarity of the data. One problem now arises, namely how we should choose ∆. Switzer suggests that we estimate Σ_∆ for a shift in lag 1. Blind source separation by independent components analysis using the Molgedey-Schuster (MS-ICA) algorithm [12] is equivalent to MAF [3]. The purpose of this algorithm is to separate independent signals from linear mixings. MS-ICA does this by exploiting differences in autocorrelation structure between the independent signals. Kolenda et al. [13] use an iterative procedure for identifying the optimal lags based on the sum of pairwise absolute differences between the autocorrelations of the estimated independent components. In this study we use Switzer's original suggestion. This is based on the assumption that the noise is separated from the interesting latent variables in terms of autocorrelation already in lag 1. For shape analysis, decomposition of the data matrix X using MAF is carried out in Q-mode. In 2D the difference process covariance matrix Σ_∆ is estimated from the lag 1 difference process of landmarks along the contours of the...

4 | Q-MAF shape decomposition
- Larsen, Eiriksson, et al.
- 2001
Citation Context: ...s different procedures for PCA using non-Euclidean metrics have been proposed. The maximum autocorrelation factor (MAF) transform proposed by Switzer [2] defines maximum spatial autocorrelation as the optimality criterion for extracting linear combinations of multispectral images. Contrary to this, PCA seeks linear combinations that exhibit maximum variance. Because imaged phenomena often exhibit some sort of spatial coherence, spatial autocorrelation is often a better optimality criterion than variance. We have previously adapted the MAF transform for analysis of tangent space shape coordinates [3]. In [4] the noise-adjusted PCA or the minimum noise fraction (MNF) transformations were used for decomposition of multispectral satellite images. The MNF transform is a PCA in a metric space defined by a noise covariance matrix estimated from the data. For image data the noise process covariance is conveniently estimated using spatial filtering. In [5] the MNF transform is applied to texture modelling in active appearance models [6], and in [7] to multivariate images in extracting a discriminatory representation. Bookstein proposed using bending energy and inverse bending energy as metrics in...

2 | A noise robust statistical texture model
- Hilger, Stegmann, et al.
- 2002
Citation Context: ...ce. Because imaged phenomena often exhibit some sort of spatial coherence, spatial autocorrelation is often a better optimality criterion than variance. We have previously adapted the MAF transform for analysis of tangent space shape coordinates [3]. In [4] the noise-adjusted PCA or the minimum noise fraction (MNF) transformations were used for decomposition of multispectral satellite images. The MNF transform is a PCA in a metric space defined by a noise covariance matrix estimated from the data. For image data the noise process covariance is conveniently estimated using spatial filtering. In [5] the MNF transform is applied to texture modelling in active appearance models [6], and in [7] to multivariate images in extracting a discriminatory representation. Bookstein proposed using bending energy and inverse bending energy as metrics in the tangent space [8]. Using the bending energy puts emphasis on the large scale variation, using its inverse puts emphasis on small scale variation. 2 Methods In the following two sections we will describe how to use two methods, maximum autocorrelation factors [2] and minimum noise fractions [4], for decomposing the tangent space coordinates of a set...

2 | A scheme for initial exploratory data analysis of multivariate image data
- Hilger, Nielsen, et al.
- 2001
Citation Context: ...ion is often a better optimality criterion than variance. We have previously adapted the MAF transform for analysis of tangent space shape coordinates [3]. In [4] the noise-adjusted PCA or the minimum noise fraction (MNF) transformations were used for decomposition of multispectral satellite images. The MNF transform is a PCA in a metric space defined by a noise covariance matrix estimated from the data. For image data the noise process covariance is conveniently estimated using spatial filtering. In [5] the MNF transform is applied to texture modelling in active appearance models [6], and in [7] to multivariate images in extracting a discriminatory representation. Bookstein proposed using bending energy and inverse bending energy as metrics in the tangent space [8]. Using the bending energy puts emphasis on the large scale variation, using its inverse puts emphasis on small scale variation. 2 Methods In the following two sections we will describe how to use two methods, maximum autocorrelation factors [2] and minimum noise fractions [4], for decomposing the tangent space coordinates of a set of shapes into a low-dimensional subspace. The tangent space coo...