Results 1–6 of 6
Height Ridge Computation and Filtering for Visualization
Proceedings of the IEEE VGTC Pacific Visualization Symposium, 2008
Abstract

Cited by 17 (5 self)
Motivated by the growing interest in the use of ridges in scientific visualization, we analyze the two height ridge definitions by Eberly and Lindeberg. We propose a raw feature definition leading to a superset of the ridge points as obtained by these two definitions. The set of raw feature points has the correct dimensionality, and it can be narrowed down to either Eberly’s or Lindeberg’s ridges by using Boolean filters which we formulate. While the straightforward computation of height ridges requires explicit eigenvalue calculation, this can be avoided by using an equivalent definition of the raw feature set, for which we give a derivation. We describe efficient algorithms for two special cases, height ridges of dimension one and of codimension one. As an alternative to the aforementioned filters, we propose a new criterion for filtering raw features based on the distance between contours which generally makes better decisions, as we demonstrate on a few synthetic fields, a topographical dataset, and a fluid flow simulation dataset. The same set of test data shows that it is unavoidable to use further filters to eliminate false positives. For this purpose, we use the angle between feature tangent and slope line as a quality measure and, based on this, formalize a previously published filter.
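The basic criterion for a 1D height ridge in a 2D scalar field is that the gradient is orthogonal to the Hessian eigenvector belonging to the most negative eigenvalue, and that this eigenvalue is negative. A minimal numerical sketch of this pointwise test (the synthetic field, function names, and tolerance are illustrative choices, not the paper's implementation):

```python
import numpy as np

def grad_hess(f, p, h=1e-5):
    """Finite-difference gradient and Hessian of a scalar field f at 2D point p."""
    p = np.asarray(p, dtype=float)
    g = np.zeros(2)
    H = np.zeros((2, 2))
    for i in range(2):
        e = np.zeros(2); e[i] = h
        g[i] = (f(p + e) - f(p - e)) / (2 * h)
        H[i, i] = (f(p + e) - 2 * f(p) + f(p - e)) / h**2
    e0 = np.array([h, 0.0]); e1 = np.array([0.0, h])
    H[0, 1] = H[1, 0] = (f(p + e0 + e1) - f(p + e0 - e1)
                         - f(p - e0 + e1) + f(p - e0 - e1)) / (4 * h**2)
    return g, H

def on_height_ridge(f, p, tol=1e-4):
    """Pointwise 1D height ridge test in 2D: the gradient must be orthogonal
    to the eigenvector of the most negative Hessian eigenvalue, and that
    eigenvalue must be negative."""
    g, H = grad_hess(f, p)
    lam, V = np.linalg.eigh(H)   # eigenvalues in ascending order
    v = V[:, 0]                  # eigenvector of the smallest eigenvalue
    return bool(lam[0] < 0 and abs(np.dot(g, v)) < tol)

# A field with a ridge along the y-axis (the crest line x = 0).
f = lambda p: np.exp(-p[0] ** 2)
print(on_height_ridge(f, (0.0, 0.3)))   # on the ridge
print(on_height_ridge(f, (1.0, 0.3)))   # off the ridge
```

Note that this sketch does perform the explicit eigenvalue calculation that the paper shows how to avoid.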
Euler Principal Component Analysis
International Journal of Computer Vision, 2012
Abstract

Cited by 5 (1 self)
Principal Component Analysis (PCA) is perhaps the most prominent learning tool for dimensionality reduction in pattern recognition and computer vision. However, the ℓ2-norm employed by standard PCA is not robust to outliers. In this paper, we propose a kernel PCA method for fast and robust PCA, which we call Euler-PCA (e-PCA). In particular, our algorithm utilizes a robust dissimilarity measure based on the Euler representation of complex numbers. We show that Euler-PCA retains PCA’s desirable properties while suppressing outliers. Moreover, we formulate Euler-PCA in an incremental learning framework which allows for efficient computation. In our experiments we apply Euler-PCA to three different computer vision applications for which our method performs comparably with other state-of-the-art approaches.
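The Euler representation referred to above maps a real feature x in [0, 1] to a point z = exp(i·α·π·x)/√2 on the complex unit circle, after which PCA proceeds in the complex domain. A minimal sketch assuming this mapping (the helper name euler_pca and the value of α are illustrative choices, not the paper's code):

```python
import numpy as np

def euler_pca(X, n_components=2, alpha=1.9):
    """Sketch of Euler-PCA: map real features in [0, 1] onto the complex
    unit circle via z = exp(i * alpha * pi * x) / sqrt(2), then run PCA in
    the complex domain. alpha is a tunable parameter (assumed here)."""
    Z = np.exp(1j * alpha * np.pi * X) / np.sqrt(2)
    mu = Z.mean(axis=0)
    Zc = Z - mu
    C = Zc.conj().T @ Zc / len(Z)     # Hermitian covariance matrix
    w, V = np.linalg.eigh(C)          # ascending eigenvalues
    U = V[:, ::-1][:, :n_components]  # top principal directions
    return Zc @ U, U, mu              # complex scores, basis, mean

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(100, 5))
scores, U, mu = euler_pca(X, n_components=2)
print(scores.shape)                   # (100, 2)
```

Because the mapping saturates on the unit circle, a grossly corrupted feature moves z by a bounded amount, which is the source of the robustness to outliers.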
Tangent Space Estimation for Smooth Embeddings of Riemannian Manifolds
Abstract

Cited by 3 (0 self)
Numerous dimensionality reduction problems in data analysis involve the recovery of low-dimensional models or the learning of manifolds underlying sets of data. Many manifold learning methods require the estimation of the tangent space of the manifold at a point from locally available data samples. Local sampling conditions such as (i) the size of the neighborhood (sampling width) and (ii) the number of samples in the neighborhood (sampling density) affect the performance of learning algorithms. In this work, we propose a theoretical analysis of local sampling conditions for the estimation of the tangent space at a point P lying on an m-dimensional Riemannian manifold S in Rn. Assuming a smooth embedding of S in Rn, we estimate the tangent space TP S by performing a Principal Component Analysis (PCA) on points sampled from the neighborhood of P on S. Our analysis explicitly takes into account the second-order properties of the manifold at P, namely the principal curvatures as well as the higher-order terms. We consider a random sampling framework and leverage recent results from random matrix theory to derive conditions on the sampling width and the local sampling density for an accurate estimation of tangent subspaces. We measure the estimation accuracy by the angle between the estimated tangent space T̂P S and the true tangent space TP S and we give conditions for this angle to be bounded with high probability. In particular, we observe that the local sampling conditions are highly dependent on the correlation between the components in the second-order local approximation of the manifold. We finally provide numerical simulations to validate our theoretical findings.
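The procedure the abstract analyzes, PCA on samples drawn from a small neighborhood of P, can be sketched on a simple quadratic surface where the true tangent space is known, so the principal-angle error is directly measurable. The surface, sampling width, and sample count below are illustrative assumptions, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

# Manifold S: the graph of z = x^2 + y^2 in R^3. At P = 0 the true
# tangent space is the xy-plane. Sample a small neighborhood of P on S.
r = 0.1                                   # sampling width (assumed small)
xy = rng.uniform(-r, r, size=(500, 2))    # sampling density: 500 points
pts = np.column_stack([xy, (xy ** 2).sum(axis=1)])

# Estimate T_P S by PCA on the neighborhood samples.
centered = pts - pts.mean(axis=0)
_, V = np.linalg.eigh(centered.T @ centered)
T_hat = V[:, -2:]                         # top-2 principal directions

# Accuracy: the largest principal angle between T_hat and the true basis.
T_true = np.eye(3)[:, :2]
s = np.linalg.svd(T_true.T @ T_hat, compute_uv=False)
angle = np.arccos(np.clip(s.min(), -1.0, 1.0))
print(f"max principal angle: {angle:.2e} rad")
```

Shrinking the width r reduces the curvature-induced bias, while the random tilt of the estimate shrinks with the number of samples, which is exactly the width/density trade-off the paper quantifies.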
Online Kernel Slow Feature Analysis for Temporal Video Segmentation and Tracking
Abstract
Slow feature analysis (SFA) is a dimensionality reduction technique which has been linked to how visual brain cells work. In recent years, SFA was adopted for computer vision tasks. In this paper, we propose an exact kernel SFA (KSFA) framework for positive definite and indefinite kernels in Krein space. We then formulate an online KSFA which employs a reduced set expansion. Finally, by utilizing a special kind of kernel family, we formulate exact online KSFA for which no reduced set is required. We apply the proposed system to develop an SFA-based change detection algorithm for stream data. This framework is employed for temporal video segmentation and tracking. We test our setup on synthetic and real data streams. When combined with an online learning tracking system, the proposed change detection approach improves upon tracking setups that do not utilize change detection.
Index Terms—Slow feature analysis, online kernel learning, change detection, temporal segmentation, tracking
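Setting the kernel and Krein-space machinery aside, the core SFA objective can be illustrated in the linear case: whiten the signal, then keep the directions along which the temporal difference signal varies least. A minimal sketch under that simplification, with a synthetic two-channel signal of our own:

```python
import numpy as np

def linear_sfa(X, n_slow=1):
    """Minimal linear SFA sketch (the paper's kernel/Krein-space machinery
    is omitted): whiten the signal, then pick the directions along which
    the temporal difference signal has the smallest variance."""
    Xc = X - X.mean(axis=0)
    C = Xc.T @ Xc / len(Xc)
    w, V = np.linalg.eigh(C)
    W = V / np.sqrt(w)                    # whitening matrix
    Z = Xc @ W                            # whitened signal, cov(Z) = I
    dZ = np.diff(Z, axis=0)               # temporal differences
    D = dZ.T @ dZ / len(dZ)
    _, U = np.linalg.eigh(D)              # ascending: slowest first
    return Z @ U[:, :n_slow]

t = np.linspace(0, 4 * np.pi, 500)
# Mixture of a slow and a fast oscillation in two channels.
X = np.column_stack([np.sin(t) + 0.1 * np.sin(20 * t),
                     np.sin(20 * t) + 0.1 * np.sin(t)])
s = linear_sfa(X, n_slow=1)               # recovers the slow source (up to sign)
```

The change-detection idea in the abstract builds on this: an abrupt jump in the slow features of a stream signals a change point, e.g. a shot boundary in video.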
École Polytechnique Fédérale de Lausanne
Abstract
In this work we do a theoretical analysis of the local sampling conditions for points lying on a quadratic embedding of a Riemannian manifold in a Euclidean space. The embedding is assumed to be quadratic at a reference point P. Our analysis is based on the following criteria: (i) local reconstruction error and (ii) local tangent space estimation accuracy. In the local reconstruction error analysis, we describe sampling conditions in the neighbourhood of P such that the average reconstruction error of the samples, after orthogonal projection on the local tangent space, satisfies a given upper bound. We derive a lower bound on the number of neighbouring samples which probabilistically guarantees that a predefined local reconstruction error criterion will be satisfied. In the local tangent space estimation analysis, we analyze the locally estimated linear subspace, which is optimal in the least squares sense and passes through P. The tangent space at P is estimated using the samples lying in its neighbourhood. Sampling conditions for the neighbourhood points are derived so that the “angle” [2] between the estimated tangent space and the original tangent space at P is upper bounded. We again consider both probabilistic and non-probabilistic sampling conditions for this criterion. We derive a lower bound on the number of neighbouring samples which probabilistically guarantees an upper bound on the “angle” between the estimated tangent space and the original tangent space.
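The first criterion, average reconstruction error after orthogonal projection onto the tangent space at P, can be illustrated on a concrete quadratic embedding where the projection residual is simply the height above the tangent plane. The surface and the sampling widths below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)

def avg_reconstruction_error(r, n=400):
    """Average error of orthogonally projecting neighbourhood samples of the
    quadratic embedding z = x^2 + y^2 onto the true tangent plane at P = 0
    (the xy-plane). r is the sampling width, n the number of samples."""
    xy = rng.uniform(-r, r, size=(n, 2))
    z = (xy ** 2).sum(axis=1)
    # Projection onto the tangent plane drops z, so the residual is |z|.
    return z.mean()

for r in (0.05, 0.1, 0.2):
    print(f"width {r}: avg reconstruction error {avg_reconstruction_error(r):.2e}")
```

For this surface the average error grows on the order of r², which is the curvature-driven dependence on sampling width that the derived bounds control.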
Tangent Space Estimation for Smooth Embeddings of Riemannian Manifolds
Information and Inference: A Journal of the IMA (2013) 2, 69–114, doi:10.1093/imaiai/iat003
Abstract
Numerous dimensionality reduction problems in data analysis involve the recovery of low-dimensional models or the learning of manifolds underlying sets of data. Many manifold learning methods require the estimation of the tangent space of the manifold at a point from locally available data samples. Local sampling conditions such as (i) the size of the neighborhood (sampling width) and (ii) the number of samples in the neighborhood (sampling density) affect the performance of learning algorithms. In this work, we propose a theoretical analysis of local sampling conditions for the estimation of the tangent space at a point P lying on an m-dimensional Riemannian manifold S in Rn. Assuming a smooth embedding of S in Rn, we estimate the tangent space TPS by performing a principal component analysis (PCA) on points sampled from the neighborhood of P on S. Our analysis explicitly takes into account the second-order properties of the manifold at P, namely the principal curvatures as well as the higher-order terms. We consider a random sampling framework and leverage recent results from random matrix theory to derive conditions on the sampling width and the local sampling density for an accurate estimation of tangent subspaces. We measure the estimation accuracy by the angle between the estimated tangent space T̂PS and the true tangent space TPS and we give conditions for this angle to be bounded with high probability. In particular, we observe that the local sampling conditions are highly dependent on the correlation between the components in the second-order local approximation of the manifold. We finally provide numerical simulations to validate our theoretical findings.