## Nonlinear Shape Statistics via Kernel Spaces (2001)

### Download Links

- [www.cvgpr.uni-mannheim.de]
- [ipa.iwr.uni-heidelberg.de]
- [vision.in.tum.de]
- DBLP

### Other Repositories/Bibliography

Venue: Pattern Recognition, volume 2191 of LNCS

Citations: 11 (2 self)

### BibTeX

```bibtex
@INPROCEEDINGS{Cremers01nonlinearshape,
  author    = {Daniel Cremers and Timo Kohlberger and Christoph Schnörr},
  title     = {Nonlinear Shape Statistics via Kernel Spaces},
  booktitle = {Pattern Recognition, volume 2191 of LNCS},
  year      = {2001},
  pages     = {269--276},
  publisher = {Springer}
}
```

### Abstract

We present a novel approach for representing shape knowledge in terms of example views of 3D objects. Typically, such data sets exhibit a highly nonlinear structure with distinct clusters in the shape vector space, which prevents the usual encoding by linear principal component analysis (PCA). For this reason, we propose a nonlinear Mercer kernel PCA scheme that takes into account both the projection distance and the within-subspace distance in a high-dimensional feature space. A comparison of our approach with supervised mixture models indicates that the statistics of example views of distinct 3D objects can be learned and represented fairly well in a completely unsupervised way.

### Citations

2386 | Support-vector networks
- Cortes, Vapnik
- 1995
Citation Context: ...or an appropriate kernel function k(x, y) defining the scalar product on Y: $k(x, y) = (\Phi(x), \Phi(y))$ (2). With great success, this Mercer kernel approach has been used for the purpose of classification [5]. By contrast, our aim in the present paper is that of constructing a similarity measure by probability density estimation. We therefore propose to approximate the nonlinearly mapped data points $\Phi(x)$...
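The kernel identity in Eq. (2) can be checked concretely for a kernel whose feature map is known in closed form. A minimal sketch (not from the paper; the degree-2 polynomial kernel is chosen purely because its $\Phi$ can be written out explicitly):

```python
import numpy as np

def poly2_kernel(x, y):
    """Degree-2 polynomial Mercer kernel k(x, y) = (x . y)^2."""
    return float(np.dot(x, y)) ** 2

def poly2_features(x):
    """Explicit feature map for the kernel above (2-D input -> 3-D feature space):
    Phi(x) = (x1^2, sqrt(2) x1 x2, x2^2), so (Phi(x), Phi(y)) = (x . y)^2."""
    x1, x2 = x
    return np.array([x1 ** 2, np.sqrt(2.0) * x1 * x2, x2 ** 2])

x = np.array([1.0, 2.0])
y = np.array([3.0, 1.0])
lhs = poly2_kernel(x, y)                                  # (1*3 + 2*1)^2 = 25.0
rhs = float(np.dot(poly2_features(x), poly2_features(y)))  # same value via Phi
print(lhs, rhs)
```

For a Gaussian kernel the feature space is infinite-dimensional, which is exactly why the kernel trick matters: all later quantities are expressed through k alone.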

1422 | A training algorithm for optimal margin classifiers
- Boser, Guyon, et al.
- 1992
Citation Context: ...the number of underlying classes [3], or they involve an intricate model construction [2]. An elegant and promising way to avoid these drawbacks is to employ feature spaces induced by Mercer kernels [1], in order to indirectly model a nonlinear transformation $\Phi(x)$ of the original data from a space X into a potentially infinite-dimensional space Y, aiming at a simpler distribution of the mapped data...

1145 | Nonlinear component analysis as a kernel eigenvalue problem
- Schölkopf, Smola, et al.
- 1998
Citation Context: ...though being fully unsupervised. Our method of density estimation is related to the so-called kernel PCA, which shall therefore be reviewed in the next section. (2. Kernel Principal Component Analysis) In [13] a method to perform nonlinear principal component analysis is proposed. This is done by assuming an appropriate nonlinear transformation $\Phi(x_i)$ of the training data $\{x_i\}_{i=1,\dots,\ell}$ into a space Y a...
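The kernel PCA construction reviewed in the paper reduces to an eigendecomposition of the centered Gram matrix. A minimal sketch, not the authors' code; the Gaussian kernel, bandwidth, and toy data are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(30, 2))          # toy training set {x_i}, l = 30

def gauss_kernel(A, B, sigma=1.0):
    """Gaussian kernel matrix k(a, b) = exp(-||a - b||^2 / (2 sigma^2))."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

l = X.shape[0]
K = gauss_kernel(X, X)
# Center the mapped data in feature space: K_c = (I - 11^t/l) K (I - 11^t/l)
J = np.eye(l) - np.full((l, l), 1.0 / l)
Kc = J @ K @ J

# Eigenvectors of K_c give the expansion coefficients alpha^k of V_k.
# Normalize so that (V_k, V_k) = alpha^kT K_c alpha^k = 1.
lam, alpha = np.linalg.eigh(Kc)
lam, alpha = lam[::-1], alpha[:, ::-1]        # descending eigenvalues
keep = lam > 1e-10                            # drop the numerically zero modes
lam, alpha = lam[keep], alpha[:, keep]
alpha = alpha / np.sqrt(lam)

print(lam[:3])   # leading kernel PCA eigenvalues
```

Everything is computed from kernel evaluations; the (possibly infinite-dimensional) map $\Phi$ never appears explicitly.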

1099 | Active shape models: their training and application
- Cootes, Taylor, et al.
- 1995
Citation Context: ...We focus on encoding views of distinct objects in an unsupervised way. In most of the models of shape variability it is assumed that the training shapes define some linear subspace of the shape space [4]. Though quite powerful in many applications, this assumption only has limited validity if the observed deformations are more complex. It fully breaks down once shapes of different classes are included...

610 | Probabilistic Visual Learning for Object Representation
- Moghaddam, Pentland
- 1997
Citation Context: ...It corresponds to the distance of a mapped point $\Phi(z)$ to the feature space F, which is the subspace of Y spanned by the mapped training data. Following an analogous derivation in the linear setting [11], we call this term distance from feature space (DFFS). The first term in (11) is called distance in feature space (DIFS). Both of these distances are visualized in Figure 2: the original data is mapped...
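The DIFS/DFFS split has a simple linear analogue in the spirit of [11]: decompose the squared distance of a sample to the mean into a Mahalanobis term inside the principal subspace (DIFS) and the residual energy outside it (DFFS). A sketch on synthetic data (the data, subspace dimension, and function name are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)
# 100 samples in 5-D with strongly anisotropic variance
X = rng.normal(size=(100, 5)) @ np.diag([3.0, 2.0, 1.0, 0.1, 0.1])

mu = X.mean(0)
Xc = X - mu
C = Xc.T @ Xc / len(X)                  # sample covariance
lam, V = np.linalg.eigh(C)
lam, V = lam[::-1], V[:, ::-1]          # descending eigenvalues
r = 3                                   # principal subspace dimension

def difs_dffs(z):
    """Split ||z - mu||^2 into distance-in-feature-space (Mahalanobis over
    the first r components) and distance-from-feature-space (residual)."""
    b = V[:, :r].T @ (z - mu)           # coordinates inside the subspace
    difs = float(np.sum(b ** 2 / lam[:r]))
    dffs = float(np.sum((z - mu) ** 2) - np.sum(b ** 2))
    return difs, dffs

difs, dffs = difs_dffs(X[0])
print(difs, dffs)
```

In the paper's kernel setting the same decomposition is carried out in the feature space Y, with all inner products supplied by the kernel.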

334 | Statistical shape influence in geodesic active contours
- Leventon, Grimson, et al.
- 2000
Citation Context: ...visual input and internally represented, previously acquired knowledge. For the case of image segmentation, prior information on the shape of expected objects can drastically improve segmentation results [9, 10]. A conceptually attractive way of incorporating prior information is given by a variational approach in which external image information and statistically acquired knowledge about the shape of expected o...

102 | A mixture model for representing shape variation
- Cootes, Taylor
- 1999
Citation Context: ...to model nonlinear shape variability. They often suffer from certain drawbacks, namely they assume some prior knowledge about the structure of the nonlinearity [8], or the number of underlying classes [3], or they involve an intricate model construction [2]. An elegant and promising way to avoid these drawbacks is to employ feature spaces induced by Mercer kernels [1], in order to indirectly model a n...

39 | Kernel PCA pattern reconstruction via approximate pre-images
- Schölkopf, Mika, et al.
- 1998
Citation Context: ...training data: $V_k = \sum_{i=1}^{\ell} \alpha_i^k \Phi(x_i)$ (3), with known coefficients $\alpha_i^k$. The projection of a mapped point $\Phi(z)$ on the eigenvector $V_k$ is therefore given by $\beta_k := (V_k, \Phi(z)) = \sum_{i=1}^{\ell} \alpha_i^k\, k(x_i, z)$ (4). In [12] this kernel PCA is applied to pattern reconstruction. To this end the authors propose to minimize the distance $\rho(z) = \|P_r \Phi(z) - \Phi(z)\|^2$ (5) of a mapped sample point to its projection onto the subsp...
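Eqs. (4) and (5) can be evaluated with kernel values only: when the $V_k$ are orthonormal, $\|P_r\Phi(z) - \Phi(z)\|^2 = k(z, z) - \sum_k \beta_k^2$. A minimal sketch (uncentered Gram matrix for brevity; kernel, data, and the test point are illustrative assumptions, not the authors' setup):

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(size=(20, 2))            # training points x_i

def k(a, b, sigma=1.0):
    """Gaussian kernel, broadcasting over leading axes."""
    return np.exp(-np.sum((a - b) ** 2, axis=-1) / (2.0 * sigma ** 2))

K = k(X[:, None, :], X[None, :, :])     # Gram matrix K[i, j] = k(x_i, x_j)
mu, A = np.linalg.eigh(K)
mu, A = mu[::-1], A[:, ::-1]            # descending eigenvalues
r = 5
A = A[:, :r] / np.sqrt(mu[:r])          # alpha^k scaled so (V_k, V_k) = 1

z = np.array([0.3, -0.7])
kz = k(X, z)                            # vector of k(x_i, z)
beta = A.T @ kz                         # Eq. (4): beta_k = sum_i alpha_i^k k(x_i, z)
# Eq. (5) via kernels only:
rho = float(k(z, z) - np.sum(beta ** 2))
print(rho)
```

Since $k(z, z) = 1$ for the Gaussian kernel, $\rho(z)$ here lies in $[0, 1]$ and shrinks as z approaches the span of the mapped training data.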

34 | A hierarchical Markov modeling approach for the segmentation and tracking of deformable shapes
- Kervrann, Heitz
- 1998
Citation Context: ...al input and internally represented, previously acquired knowledge. For the case of image segmentation, prior information on the shape of expected objects can drastically improve segmentation results [9, 10]. A conceptually attractive way of incorporating prior information is given by a variational approach in which external image information and statistically acquired knowledge about the shape of expect...

30 | Diffusion-snakes: combining statistical shape knowledge and image information in a variational framework
- Cremers, Schnörr
- 2001
Citation Context: ...rior information is given by a variational approach in which external image information and statistically acquired knowledge about the shape of expected objects are combined in a single cost functional [6]: $E = E_{\mathrm{image}} + E_{\mathrm{shape}}$ (1). The present paper is concerned with the question of how to construct such a shape energy, which measures the similarity of a given shape to a set of training shapes. We foc...

28 | Nonlinear modeling of scattered multivariate data and its application to shape change
- Chalmond, Girard
- 1999
Citation Context: ...r from certain drawbacks, namely they assume some prior knowledge about the structure of the nonlinearity [8], or the number of underlying classes [3], or they involve an intricate model construction [2]. An elegant and promising way to avoid these drawbacks is to employ feature spaces induced by Mercer kernels [1], in order to indirectly model a nonlinear transformation $\Phi(x)$ of the original data fro...

17 | Automated pivot location for the cartesian-polar hybrid point distribution model
- Heap, Hogg
- 1996
Citation Context: ...Several approaches have been undertaken to model nonlinear shape variability. They often suffer from certain drawbacks, namely they assume some prior knowledge about the structure of the nonlinearity [8], or the number of underlying classes [3], or they involve an intricate model construction [2]. An elegant and promising way to avoid these drawbacks is to employ feature spaces induced by Mercer kern...

3 | Diffusion snakes using statistical shape knowledge
- Cremers, Schnörr, et al.
Citation Context: ... (7) Let $\{\lambda_i\}_{i=1,\dots,r}$ be the nonzero eigenvalues of $\Sigma$ and V the matrix containing the respective eigenvectors $V_k$. In general $\Sigma$ is not invertible and needs to be appropriately regularized (cf. [7]), for example by replacing all zero eigenvalues by the smallest nonzero eigenvalue $\lambda_r$. The inverse of this matrix is: $\Sigma^{-1} = V\,\mathrm{diag}(\lambda_1^{-1}, \dots, \lambda_r^{-1})\,V^t + \frac{1}{\lambda_r}\,(I - V V^t)$ ...
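The regularized inverse described here, with the zero eigenvalues of a singular covariance replaced by the smallest nonzero eigenvalue $\lambda_r$ before inverting, can be sketched as follows (synthetic rank-deficient covariance, not the authors' code):

```python
import numpy as np

rng = np.random.default_rng(3)
Y = rng.normal(size=(4, 8))             # 4 samples in 8-D: rank-deficient covariance
S = Y.T @ Y / len(Y)                    # singular covariance Sigma (rank <= 4)

lam, V = np.linalg.eigh(S)
lam, V = lam[::-1], V[:, ::-1]          # descending eigenvalues
r = int(np.sum(lam > 1e-10))            # number of nonzero eigenvalues
Vr, lr = V[:, :r], lam[r - 1]           # lambda_r = smallest nonzero eigenvalue

# Regularized inverse: replace zero eigenvalues by lambda_r, then invert:
#   Sigma^{-1} ~= Vr diag(1/lam_1, ..., 1/lam_r) Vr^t + (1/lam_r)(I - Vr Vr^t)
Sinv = Vr @ np.diag(1.0 / lam[:r]) @ Vr.T + (np.eye(8) - Vr @ Vr.T) / lr

# Sanity check: Sinv exactly inverts the regularized Sigma
Sreg = Vr @ np.diag(lam[:r]) @ Vr.T + lr * (np.eye(8) - Vr @ Vr.T)
print(np.allclose(Sinv @ Sreg, np.eye(8)))
```

The complement term $(I - V V^t)/\lambda_r$ keeps the density proper on directions the training data never explored, instead of assigning them zero variance.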
