Results 1 – 8 of 8
New analysis of manifold embeddings and signal recovery from compressive measurements. arXiv:1306.4748
Abstract

Cited by 13 (1 self)
Compressive Sensing (CS) exploits the surprising fact that the information contained in a sparse signal can be preserved in a small number of compressive, often random linear measurements of that signal. Strong theoretical guarantees have been established concerning the embedding of a sparse signal family under a random measurement operator and the accuracy to which sparse signals can be recovered from noisy compressive measurements. In this paper, we address similar questions in the context of a different modeling framework. Instead of sparse models, we focus on the broad class of manifold models, which can arise in both parametric and nonparametric signal families. Using tools from the theory of empirical processes, we improve upon previous results concerning the embedding of low-dimensional manifolds under random measurement operators. We also establish both deterministic and probabilistic instance-optimal bounds in ℓ2 for manifold-based signal recovery and parameter estimation from noisy compressive measurements. In line with analogous results for sparsity-based CS, we conclude that much stronger bounds are possible in the probabilistic setting. Our work supports the growing evidence that manifold-based models can be used with high accuracy in compressive signal processing.
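The embedding phenomenon this abstract describes can be illustrated numerically. The sketch below is not from the paper; the helix manifold, dimensions, and measurement count are illustrative choices. It checks that a random Gaussian measurement operator approximately preserves pairwise distances between points sampled from a low-dimensional manifold:

```python
import numpy as np

rng = np.random.default_rng(0)

# Sample points from a 1-D manifold (a helix) embedded in R^N.
N, n_points, m = 1000, 200, 40          # ambient dim, samples, measurements
t = rng.uniform(0.0, 4.0 * np.pi, n_points)
X = np.zeros((n_points, N))
X[:, 0], X[:, 1], X[:, 2] = np.cos(t), np.sin(t), 0.1 * t

# Random compressive measurement operator, scaled so E||Phi x||^2 = ||x||^2.
Phi = rng.normal(size=(m, N)) / np.sqrt(m)
Y = X @ Phi.T

def pairwise_dists(A):
    # All pairwise Euclidean distances via the Gram-matrix identity.
    sq = (A ** 2).sum(axis=1)
    D2 = sq[:, None] + sq[None, :] - 2.0 * A @ A.T
    return np.sqrt(np.maximum(D2, 0.0))

D, Dc = pairwise_dists(X), pairwise_dists(Y)
mask = D > 1e-9                          # skip the zero diagonal
ratios = Dc[mask] / D[mask]
print(ratios.min(), ratios.max())        # near-isometry: ratios concentrate around 1
```

With only m = 40 measurements of 1000-dimensional vectors, all inter-point distance ratios stay bounded near 1, which is the stable-embedding behavior the strong theoretical guarantees quantify.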
A FAST MULTISCALE FRAMEWORK FOR DATA IN HIGH DIMENSIONS: MEASURE ESTIMATION, ANOMALY DETECTION, AND COMPRESSIVE MEASUREMENTS
Abstract

Cited by 3 (0 self)
Data sets are often modeled as samples from some probability distribution lying in a very high-dimensional space. In practice, they tend to exhibit low intrinsic dimensionality, which enables both the fast construction of efficient data representations and the solution of statistical tasks such as regression of functions on the data, or even estimation of the probability distribution from which the data are generated. In this paper we introduce a novel multiscale density estimator for high-dimensional data and apply it to the problem of detecting changes in the distribution of dynamic data, or in a time series of data sets. We also show that our data representations, which are not standard sparse linear expansions, are amenable to compressed measurements. Finally, we test our algorithms on both synthetic data and a real data set consisting of a time series of hyperspectral images, and demonstrate their high accuracy in the detection of anomalies.
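A toy stand-in for the change-detection task described in this abstract can be sketched as follows. This is not the paper's estimator: it reduces each high-dimensional point to its norm and compares histograms at several dyadic scales, whereas the paper builds a genuinely multiscale density estimator. All sizes and the shift are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
D, n = 50, 500

# A "time series of data sets": the distribution shifts after batch 2.
batches = [rng.normal(size=(n, D)) for _ in range(3)]
batches += [rng.normal(loc=1.0, size=(n, D)) for _ in range(3)]

def tv_at_scale(a, b, bins):
    # Total-variation distance between binned density estimates.
    ha, _ = np.histogram(a, bins=bins, range=(0.0, 15.0), density=True)
    hb, _ = np.histogram(b, bins=bins, range=(0.0, 15.0), density=True)
    return 0.5 * np.abs(ha - hb).sum() * (15.0 / bins)

def change_stat(A, B):
    # Summarize each point by its norm, then average histogram
    # distances over several dyadic scales.
    ra, rb = np.linalg.norm(A, axis=1), np.linalg.norm(B, axis=1)
    return np.mean([tv_at_scale(ra, rb, 2 ** j) for j in (3, 4, 5, 6)])

stats = [change_stat(batches[i], batches[i + 1]) for i in range(5)]
print(int(np.argmax(stats)))  # 2: the change occurs between batches 2 and 3
```

The statistic is large only at the batch boundary where the distribution actually changes; the multiple scales trade off resolution against the sampling noise of each histogram.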
Restricted isometry property of subspace projection matrix under random compression
 IEEE Signal Processing Letters
, 2015
Abstract

Cited by 1 (1 self)
Structures play a significant role in the field of signal processing. As a representative of structured data, the low-rank matrix, along with its restricted isometry property (RIP), has been an important research topic in compressive signal processing. A subspace projection matrix is a kind of low-rank matrix with additional structure, which allows for a further reduction of its intrinsic dimension. This leaves room for improving its RIP, which could serve as the foundation of compressed subspace projection matrix recovery. In this work, we study the RIP of subspace projection matrices under random orthonormal compression. Since the projection matrices of s-dimensional subspaces of R^N form an s(N−s)-dimensional submanifold of R^{N×N}, the question reduces to the stable embedding of this submanifold into R^{N×N}. The result is that O(s(N−s) log N) random measurements suffice to guarantee the RIP of a subspace projection matrix.
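A quick numerical illustration of the embedding claim in this abstract. This is a sketch, not the paper's construction: it uses a Gaussian measurement operator rather than random orthonormal compression, and all sizes are illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)
N, s, m = 30, 3, 200   # ambient dim, subspace dim, number of measurements

def random_projection_matrix():
    # P = U U^T projects onto a random s-dimensional subspace of R^N;
    # such matrices have s(N - s) intrinsic degrees of freedom.
    U, _ = np.linalg.qr(rng.normal(size=(N, s)))
    return U @ U.T

# Gaussian measurement operator acting on vectorized N x N matrices,
# scaled so that E||A vec(M)||^2 = ||M||_F^2.
A = rng.normal(size=(m, N * N)) / np.sqrt(m)

P1, P2 = random_projection_matrix(), random_projection_matrix()
diff = (P1 - P2).ravel()
ratio = np.linalg.norm(A @ diff) / np.linalg.norm(diff)
print(ratio)  # near 1: the distance between projection matrices is preserved
```

The RIP statement is exactly this kind of distance preservation, holding uniformly over all pairs of s-dimensional projection matrices rather than for one random pair.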
A compressed sensing framework for magnetic resonance fingerprinting
, 2013
Office: 293 Physics Bldg.
Abstract
Synopsis of course content: We will cover some basic material in random matrix theory (with applications to compressed sensing and signal processing), nonparametric statistical estimation and machine learning, and problems about the geometry of high-dimensional data sets.
• Random matrices. Basic theory of random matrices, following [8]: basic concentration inequalities, subgaussian random variables, singular values of random matrices. Applications: to compressed sensing theory; to numerical linear algebra (a.k.a. how to quickly compute highly accurate low-rank approximate Singular Value Decompositions, with high probability [9]).
• Nonparametric estimation. Basic results in nonparametric density estimation and nonparametric regression (e.g., following the first chapter of [7]) in low dimensions. Obstructions in the high-dimensional setting; the curse of dimensionality. Applications: denoising of signals (the classic Donoho–Johnstone paper [4] and the compressed sensing results).
• Approximation theory. A primer on nonlinear approximation of functions [3], especially for wavelets and other multiscale approximations. Multiscale approximation of functions in high dimensions [2]. Attacking the curse of dimensionality.
• Multiscale analysis in high dimensions. Multiscale geometric constructions in metric spaces, associated algorithms, and applications. Multiscale SVD and Geometric Multi-Resolution Analyses, and their applications to dictionary learning, regression, manifold learning, and compressive sensing [6, 1, 5].
• Optimal transport. A primer on optimal transport theory and Wasserstein metrics between distributions. Current research: multiscale approximation theory in the space of probability measures with respect to Wasserstein metrics.
“Sensing multiscale structures in high-dimensional data”
, 2015
Abstract
Plasma behavior is well understood to span many temporal and spatial scales. Consequently, many well-resolved numerical simulations generate massive amounts of data, resulting in data management, analysis, and visualization challenges. This will clearly be exacerbated as we move towards exascale computations. A key observation is that “big” in big data typically refers to a naive measure of size, for example, the number of (possibly adaptively selected) grid points in a simulation, or the number of time slices. In many instances, the “complexity” of the data, or perhaps more accurately the “complexity up to precision”, is much smaller [3, 26, 10, 24, 12, 14, 11, 5, 6, 7]. In principle, this phenomenon should be exploited to reduce the computational cost of algorithms and aid in the data management of the simulation results. One approach is to approximate the data using low-dimensional geometric models (e.g., by low-dimensional manifolds
Subspace Projection Matrix Recovery from Incomplete Information
, 2015
Abstract
Structural signal retrieval from highly incomplete information is an important problem in the field of signal and information processing. In this work, we study the recovery of a subspace projection matrix from random compressed measurements, i.e., matrix sensing, and from random downsamplings, i.e., matrix completion, by formulating an optimization problem on the Grassmann manifold. For the sensing problem, we derive a bound of O(s(N − s)) on the number of Gaussian measurements, so that a restricted isometry property holds with high probability. This RIP condition guarantees unique recovery in the noiseless scenario and robust recovery in the noisy case. For the matrix completion problem, we obtain a bound of O(s^{3/2}(N − s) log^3(N) / N^2) on the sampling density of the Bernoulli model for subspace projection matrices. A gradient descent algorithm on the Grassmann manifold is proposed to solve the optimization problem, and the convergence behavior of this nonconvex algorithm is theoretically analyzed for the sensing and the completion problems, respectively. The algorithm is numerically tested on both the sensing and the completion problems, under both noiseless and noisy scenarios. The theoretical results are verified, and the algorithm is compared with other low-rank matrix completion algorithms to demonstrate its good performance.
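The sensing setup in this abstract can be sketched as follows. This is a simplified stand-in for the paper's method: plain gradient descent on a basis factor U with a QR retraction, rather than the paper's Grassmann-manifold algorithm, with an illustrative Gaussian measurement model and sizes:

```python
import numpy as np

rng = np.random.default_rng(3)
N, s, m = 20, 2, 500   # m is a moderate multiple of s(N - s) = 36

# Ground truth: projection matrix P* = U* U*^T onto a random subspace.
U_true, _ = np.linalg.qr(rng.normal(size=(N, s)))
P_true = U_true @ U_true.T

# Gaussian matrix-sensing measurements y_i = <A_i, P*>.
A = rng.normal(size=(m, N, N))
y = np.einsum('kij,ij->k', A, P_true)

# Gradient descent on the factor U with a QR retraction; the objective
# is f(U) = (1/2m) * sum_i (<A_i, U U^T> - y_i)^2.
U, _ = np.linalg.qr(rng.normal(size=(N, s)))
for _ in range(300):
    r = np.einsum('kij,ij->k', A, U @ U.T) - y   # residuals
    G = np.einsum('k,kij->ij', r, A) / m         # ~ U U^T - P* in expectation
    U, _ = np.linalg.qr(U - 0.25 * (G + G.T) @ U)

err = np.linalg.norm(U @ U.T - P_true)
print(err)   # small: the subspace projection matrix is recovered
```

The QR step keeps U an orthonormal basis, so the iterate U U^T always stays on the manifold of rank-s projection matrices; with enough measurements the residuals vanish only at the true subspace.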
Dictionary Learning and Non-Asymptotic Bounds for the Geometric Multi-Resolution Analysis
Abstract
Abstract: High-dimensional data sets arising in a wide variety of applications often exhibit inherently low-dimensional structure. Detecting, measuring, and exploiting such low intrinsic dimensionality has been the focus of much research in the past decade, with implications and applications in many fields including high-dimensional statistics, machine learning, and signal processing. In this vein, active and compelling research in machine learning explores the topic of manifold learning, where the low-dimensional sets manifest as an unknown manifold structure that must be learned from the sampled data. Manifold learning seems quite distinct from the comparably popular subject of dictionary learning, where the low-dimensional structure is the set of sparse (or compressible) linear combinations of vectors from a finite linear dictionary. However, Geometric Multi-Resolution Analysis (GMRA) [2] was introduced as a method for producing, in a robust multiscale fashion, an approximation to a low-dimensional manifold structure (should it exist), while simultaneously providing a dictionary for sparse representation of the data, thereby creating a connection between these two problems. In this work, we prove non-asymptotic probabilistic bounds for the GMRA approximation error under certain assumptions on the geometry of the underlying distribution. In particular, our results imply that if the data is supported near a low-dimensional manifold, the proposed sparse representations result in an error primarily dependent upon the intrinsic dimension of the manifold, and independent of the ambient dimension.
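The final claim of this abstract, that the approximation error depends on the intrinsic rather than the ambient dimension, can be illustrated with a single-cell toy version of the construction. This is a sketch under illustrative choices of manifold, cell radius, and dimensions; GMRA itself performs this approximation at all scales on a tree of cells:

```python
import numpy as np

rng = np.random.default_rng(4)
D, d, n = 500, 2, 2000   # ambient dim, intrinsic dim, sample size

# Sample a curved 2-D surface embedded in R^D.
t = rng.uniform(-1.0, 1.0, size=(n, d))
X = np.zeros((n, D))
X[:, :d] = t
X[:, d] = 0.5 * (t ** 2).sum(axis=1)   # curvature in one extra coordinate

# One cell of a GMRA-like construction: take the points in a small ball
# and approximate them by the affine span of their top-d principal
# directions (local PCA).
center = X[0]
cell = X[np.linalg.norm(X - center, axis=1) < 0.3]
mu = cell.mean(axis=0)
_, _, Vt = np.linalg.svd(cell - mu, full_matrices=False)
approx = mu + (cell - mu) @ Vt[:d].T @ Vt[:d]
max_err = np.linalg.norm(cell - approx, axis=1).max()
print(max_err)   # small: set by the curvature of the cell, not by the ambient D
```

Keeping only d = 2 principal directions per cell already captures the data to within the curvature-induced error, no matter how large the ambient dimension D is; this is the intrinsic-dimension dependence the non-asymptotic bounds make precise.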