Results 1–6 of 6
Supervised source localization using diffusion kernels
In IEEE Workshop on Applications of Signal Processing to Audio and Acoustics, 245–248, New Paltz, 2011
Abstract

Cited by 9 (2 self)
Recently, we introduced a method to recover the controlling parameters of linear systems using diffusion kernels. In this paper, we apply our approach to the problem of source localization in a reverberant room using measurements from a single microphone. Prior recordings of signals from various known locations in the room are required for training and calibration. The proposed algorithm relies on a computation of a diffusion kernel with a specially-tailored distance measure. Experimental results in a real reverberant environment demonstrate accurate recovery of the source location.
Index Terms — Source localization, acoustic localization, diffusion geometry, diffusion kernel, manifold learning
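The abstract gives no code; as a rough, hypothetical illustration of the general diffusion-kernel recipe it describes (a Gaussian kernel built on a task-specific distance, row-normalized into a Markov matrix, then spectrally decomposed into an embedding), consider the sketch below. Plain Euclidean distance on a toy 1-D point set stands in for the paper's specially-tailored acoustic distance; the function name and parameters are illustrative, not from the paper.

```python
import numpy as np

def diffusion_embedding(D, eps, n_components=2):
    """Embed points given a pairwise distance matrix D via a diffusion kernel.

    D: (n, n) symmetric matrix of any task-specific distance.
    eps: kernel bandwidth.
    Returns the leading non-trivial diffusion coordinates.
    """
    K = np.exp(-D**2 / eps)               # Gaussian affinity on the chosen distance
    P = K / K.sum(axis=1, keepdims=True)  # row-normalize -> Markov transition matrix
    vals, vecs = np.linalg.eig(P)
    order = np.argsort(-vals.real)        # sort eigenvalues in descending order
    vals, vecs = vals.real[order], vecs.real[:, order]
    # skip the trivial constant eigenvector (eigenvalue 1), scale by eigenvalues
    return vecs[:, 1:n_components + 1] * vals[1:n_components + 1]

# toy usage: points on a line; the first diffusion coordinate recovers
# their ordering along the line
x = np.linspace(0.0, 1.0, 20)[:, None]
D = np.abs(x - x.T)
emb = diffusion_embedding(D, eps=0.1)
```

In the paper's setting, the rows of `D` would come from features of single-microphone recordings at known positions, so that the recovered coordinates track source location rather than a synthetic line.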
Differential Stochastic Sensing: Intrinsic Modeling of Random Time Series with Applications to Nonlinear Tracking
, 2012
Abstract

Cited by 2 (2 self)
Many natural and artificial high-dimensional time series are often controlled by a set of lower-dimensional independent factors. In this paper, anisotropic diffusion is combined with local dynamical models to provide intrinsic global modeling that reveals these factors. The obtained model is shown to be invariant to the measuring equipment and can be efficiently extended. These two properties are paramount for sequential processing and provide a foundation for probabilistic analysis. The widely applicable approach is demonstrated on nonlinear tracking problems based on both simulated and recorded data.
Modeling zonal electricity prices by anisotropic diffusion embeddings
, 2012
Abstract
In many fields, including economics, collections of time series such as stock or energy prices are governed by a similar nonlinear dynamical process. These time series are often measured hourly; thus, each day can be viewed as a high-dimensional data point. In this paper, we apply a spectral method, based on anisotropic diffusion kernels, to model high-dimensional electricity price data. We demonstrate the proposed method on price data collected from several zones. We show that even though the observed output spaces differ by local spatial influences and noise, the common global parameters that drive the underlying process are extracted.
Approximately-Isometric Diffusion Maps
"... Diffusion Maps (DM), and other kernel methods, are utilized for the analysis of high dimensional datasets. The DM method uses a Markovian diffusion process to model and analyze data. A spectral analysis of the DM kernel yields a map of the data into a low dimensional space, where Euclidean distances ..."
Abstract
Diffusion Maps (DM), and other kernel methods, are utilized for the analysis of high-dimensional datasets. The DM method uses a Markovian diffusion process to model and analyze data. A spectral analysis of the DM kernel yields a map of the data into a low-dimensional space, where Euclidean distances between the mapped data points represent the diffusion distances between the corresponding high-dimensional data points. Many machine learning methods, which are based on the Euclidean metric, can be applied to the mapped data points in order to take advantage of the diffusion relations between them. However, a significant drawback of DM is the need to apply spectral decomposition to a kernel matrix, which becomes infeasible for large datasets. In this paper, we present an efficient approximation of the DM embedding. The presented approximation algorithm produces a dictionary of data points by identifying a small set of informative representatives. Then, based on this dictionary, the entire dataset is efficiently embedded into a low-dimensional space. The Euclidean distances in the resulting embedded space approximate the diffusion distances. The properties of the presented embedding and its relation to the DM method are analyzed and demonstrated.
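The dictionary idea in this abstract is in the spirit of a Nyström-style out-of-sample extension: decompose the kernel on a small set of representatives, then extend the dictionary eigenvectors to every data point. The sketch below illustrates that generic idea only; it uses a fixed subsample as the dictionary rather than the paper's informative-representative selection, and all names are illustrative.

```python
import numpy as np

def nystrom_diffusion_embed(X, dict_idx, eps, n_components=2):
    """Approximate a diffusion-map embedding from a small dictionary.

    X: (n, d) data; dict_idx: indices of the dictionary points.
    Generic Nystrom-style sketch, not the paper's specific algorithm.
    """
    d = X[dict_idx]
    # kernel between all points and the dictionary
    D_nd = np.linalg.norm(X[:, None, :] - d[None, :, :], axis=-1)
    K_nd = np.exp(-D_nd**2 / eps)
    K_dd = K_nd[dict_idx]                           # dictionary-vs-dictionary kernel
    # normalize the dictionary kernel into a Markov matrix and decompose it
    P_dd = K_dd / K_dd.sum(axis=1, keepdims=True)
    vals, vecs = np.linalg.eig(P_dd)
    order = np.argsort(-vals.real)
    vals, vecs = vals.real[order], vecs.real[:, order]
    # extend the dictionary eigenvectors to every point (Nystrom formula)
    P_nd = K_nd / K_nd.sum(axis=1, keepdims=True)
    return P_nd @ vecs[:, 1:n_components + 1] / vals[1:n_components + 1]

# usage: embed 200 points using only a 20-point dictionary,
# avoiding an eigendecomposition of the full 200x200 kernel
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
emb = nystrom_diffusion_embed(X, dict_idx=np.arange(0, 200, 10), eps=1.0)
```

The spectral decomposition here costs O(m³) in the dictionary size m instead of O(n³) in the dataset size n, which is the kind of saving the paper's approximation targets.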
Empirical Intrinsic Modeling of Signals and Information Geometry
, 2012
Abstract
In many natural and real-world applications, the measured signals are controlled by underlying processes or drivers. As a result, these signals exhibit highly redundant representations and their temporal evolution can be compactly described by a dynamical process on a low-dimensional manifold. In this paper, we propose a graph-based method for revealing the low-dimensional manifold and inferring the underlying process. This method provides intrinsic modeling for signals using empirical information geometry. We construct an intrinsic representation of the underlying parametric manifold from noisy measurements based on local density estimates. This construction is shown to be equivalent to an inverse problem, which is formulated as a nonlinear differential equation and is solved empirically through eigenvectors of an appropriate Laplace operator. The learned intrinsic nonlinear model exhibits two important properties. We show that it is invariant under different observation and instrumental modalities and is noise resilient. In addition, the learned model can be efficiently extended to newly acquired measurements in a sequential manner. We examine our method on two nonlinear filtering applications: a nonlinear and non-Gaussian tracking problem and a non-stationary hidden Markov chain scheme. The experimental results demonstrate the power of our theory by extracting the underlying processes, which were measured through different nonlinear instrumental conditions.
EXPLOITING DATA-DEPENDENT STRUCTURE FOR IMPROVING SENSOR ACQUISITION AND INTEGRATION
, 2014
Abstract
This thesis deals with two approaches to building efficient representations of data. The first is a study of compressive sensing and improved data acquisition. We outline the development of the theory, and proceed into its uses in matrix completion problems via convex optimization. The aim of this research is to prove that a general class of measurement operators, bounded-norm Parseval frames, satisfy the necessary conditions for random subsampling and reconstruction. We then demonstrate an example of this theory in solving 2-dimensional Fredholm integrals with partial measurements. This has large ramifications in improved acquisition of nuclear magnetic resonance spectra, for which we give several examples. The second part of this thesis studies the Laplacian Eigenmaps (LE) algorithm and its uses in data fusion. In particular, we build a natural approximate inversion algorithm for LE embeddings using L1 regularization and MDS embedding techniques. We show how this inversion, combined with feature space rotation, leads to a novel form of data reconstruction and inpainting using a priori information. We demonstrate this method on hyperspectral imagery and LIDAR. We also aim to understand and characterize the embeddings the LE algorithm gives. To this end, we characterize the order in which eigenvectors of a disjoint graph emerge and the support of those eigenvectors. We then extend this characterization to weakly connected graphs with clusters of differing sizes, utilizing the theory of invariant subspace perturbations and proving some novel results.
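The eigenvector-support property mentioned in the last part of this abstract is easy to check numerically on a toy graph: the Laplacian of a disconnected graph has one zero eigenvalue per connected component, and the zero eigenspace is spanned by the component indicator vectors (a numerical eigensolver may return mixed combinations within that degenerate subspace). A small sketch, with an assumed example graph of two disjoint cliques:

```python
import numpy as np

def laplacian(A):
    """Unnormalized graph Laplacian L = D - A for adjacency matrix A."""
    return np.diag(A.sum(axis=1)) - A

# two disjoint cliques: a 4-clique on nodes 0..3 and a 3-clique on nodes 4..6
A = np.zeros((7, 7))
A[:4, :4] = 1 - np.eye(4)
A[4:, 4:] = 1 - np.eye(3)

L = laplacian(A)
vals, vecs = np.linalg.eigh(L)          # eigenvalues in ascending order

# one zero eigenvalue per connected component
n_zero = int(np.sum(np.isclose(vals, 0.0)))
```

For complete graphs the nonzero Laplacian eigenvalues are known in closed form (n with multiplicity n-1), so here the full spectrum is {0, 0, 3, 3, 4, 4, 4}, with the two zeros corresponding to the two components.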