Results 1–4 of 4
Density estimation from unweighted k-nearest neighbor graphs: A roadmap
in Proc. Advances in Neural Information Processing Systems, 2013
Abstract

Cited by 3 (1 self)
Consider an unweighted k-nearest neighbor graph on n points that have been sampled i.i.d. from some unknown density p on R^d. We show how one can estimate the density p just from the unweighted adjacency matrix of the graph, without knowing the points themselves or any distance or similarity scores. The key insights are that local differences in link numbers can be used to estimate a local function of the gradient of p, and that integrating this function along shortest paths leads to an estimate of the underlying density.
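For contrast, the classical distance-based estimator the abstract implicitly starts from uses the distance to the k-th nearest neighbor: p̂(x) = k / (n · V_d · r_k(x)^d). A minimal sketch of that baseline (the paper's contribution is precisely to get by without these distances; `knn_density` is an illustrative helper name, not the paper's code):

```python
import numpy as np
from math import gamma, pi

def knn_density(X, k):
    """Classical k-NN density estimate at each sample point:
    p_hat(x_i) = k / (n * V_d * r_k(x_i)^d), where r_k is the distance
    to the k-th nearest neighbor and V_d the unit-ball volume in R^d."""
    n, d = X.shape
    # Pairwise Euclidean distances (fine for small n; a KD-tree scales better).
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    # Row-sorted distances: index 0 is the distance to self (zero),
    # so index k is the k-th nearest neighbor distance.
    r_k = np.sort(D, axis=1)[:, k]
    V_d = pi ** (d / 2) / gamma(d / 2 + 1)
    return k / (n * V_d * r_k ** d)
```

On a standard-normal sample this assigns higher density near the origin than in the tails, as expected; the unweighted-graph setting of the paper discards exactly the r_k values this baseline relies on.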
Geodesic Exponential Kernels: When Curvature and Linearity Conflict
, 2014
Abstract

Cited by 1 (1 self)
We consider kernel methods on general geodesic metric spaces and provide both negative and positive results. First we show that the common Gaussian kernel can only be generalized to a positive definite kernel on a geodesic metric space if the space is flat. As a result, for data on a Riemannian manifold, the geodesic Gaussian kernel is only positive definite if the Riemannian manifold is Euclidean. This implies that any attempt to design geodesic Gaussian kernels on curved Riemannian manifolds is futile. However, we show that for spaces with conditionally negative definite distances the geodesic Laplacian kernel can be generalized while retaining positive definiteness. This implies that geodesic Laplacian kernels can be generalized to some curved spaces, including spheres and hyperbolic spaces. Our theoretical results are verified empirically.
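The positive result for spheres can be checked numerically: with the great-circle distance d(x, y) = arccos(x·y) on the unit sphere, the Gram matrix of the geodesic Laplacian kernel exp(−λd) should be positive semidefinite, while exp(−λd²) carries no such guarantee. A small sketch (sample size and λ are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Sample points uniformly on the unit sphere S^2 in R^3.
X = rng.standard_normal((40, 3))
X /= np.linalg.norm(X, axis=1, keepdims=True)

# Great-circle (geodesic) distance matrix; clip guards arccos rounding.
D = np.arccos(np.clip(X @ X.T, -1.0, 1.0))

lam = 1.0
K_laplace = np.exp(-lam * D)       # geodesic Laplacian kernel
K_gauss = np.exp(-lam * D ** 2)    # geodesic Gaussian kernel

# The smallest eigenvalue of the Laplacian Gram matrix should be
# (numerically) nonnegative; the Gaussian one need not be.
print(np.linalg.eigvalsh(K_laplace).min())
print(np.linalg.eigvalsh(K_gauss).min())
```

A nonnegative spectrum on one random sample is of course only consistent with, not a proof of, positive definiteness; the paper supplies the proof via the conditional negative definiteness of the spherical distance.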
Enhanced Mode Clustering
Abstract
Mode clustering is a nonparametric method for clustering that defines clusters as the basins of attraction of a density estimator’s modes. We provide several enhancements to mode clustering: (i) a “soft” cluster assignment, (ii) a measure of connectivity between clusters, and (iii) an approach to visualizing the clusters.
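The hard assignment that the paper's "soft" variant enhances is typically computed by mean-shift: ascend each point to a mode of a Gaussian KDE and group points whose ascents end at the same mode. A minimal sketch of that baseline (`mode_cluster` and its grouping tolerance are illustrative choices, not the paper's algorithm):

```python
import numpy as np

def mode_cluster(X, bandwidth, steps=200):
    """Hard mode clustering via mean-shift: iterate each point toward a
    mode of the Gaussian KDE of X, then group points whose ascents
    terminate within one bandwidth of each other."""
    Z = X.copy()
    for _ in range(steps):
        # Gaussian kernel weights between current iterates and the data.
        D2 = ((Z[:, None, :] - X[None, :, :]) ** 2).sum(-1)
        W = np.exp(-D2 / (2 * bandwidth ** 2))
        # Mean-shift update: weighted average of the data.
        Z = (W @ X) / W.sum(axis=1, keepdims=True)
    # Group ascents that converged to (numerically) the same mode.
    modes, labels = [], np.empty(len(Z), dtype=int)
    for i, z in enumerate(Z):
        for j, m in enumerate(modes):
            if np.linalg.norm(z - m) < bandwidth:
                labels[i] = j
                break
        else:
            modes.append(z)
            labels[i] = len(modes) - 1
    return np.array(modes), labels
```

On two well-separated Gaussian blobs this recovers two modes with one basin each; the paper's contributions replace the hard labels with soft assignments and add connectivity and visualization on top.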
Density-preserving quantization with application to graph downsampling
Abstract
We consider the problem of vector quantization of i.i.d. samples drawn from a density p on R^d. It is desirable that the representatives selected by the quantization algorithm have the same distribution p as the original sample points. However, quantization algorithms based on Euclidean distance, such as k-means, do not have this property. We provide a solution to this problem that takes the unweighted k-nearest neighbor graph on the sample as input. In particular, it does not need to have access to the data points themselves. Our solution generates quantization centers that are “evenly spaced”. We exploit this property to downsample geometric graphs and show that our method produces sparse downsampled graphs. Our algorithm is easy to implement, and we provide theoretical guarantees on the performance of the proposed algorithm.
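The claim that k-means centers do not follow p can be seen in one dimension, where optimal quantizer centers are known to follow p^(1/3) rather than p, so they over-represent low-density regions relative to an i.i.d. subsample. A sketch of that contrast on the skewed density p(x) = 5x⁴ on [0, 1] (the sample size, k, and iteration count are arbitrary illustrative choices; this is not the paper's graph-based algorithm):

```python
import numpy as np

rng = np.random.default_rng(1)

# Skewed density on [0, 1]: p(x) = 5 x^4, mass concentrated near 1.
n = 5000
X = rng.random(n) ** (1 / 5)   # inverse-CDF sampling: F(x) = x^5
k = 20

# Density-preserving baseline: an i.i.d. subsample is distributed as p.
sub = rng.choice(X, size=k, replace=False)

# Lloyd's k-means in 1D, initialised at data quantiles.
centers = np.quantile(X, (np.arange(k) + 0.5) / k)
for _ in range(500):
    labels = np.abs(X[:, None] - centers[None, :]).argmin(axis=1)
    for j in range(k):
        pts = X[labels == j]
        if len(pts):
            centers[j] = pts.mean()

# Centers follow roughly p^(1/3), which is flatter than p, so k-means
# places relatively more centers below the median of p than p itself.
med = np.quantile(X, 0.5)
print((sub < med).mean())      # close to 0.5: subsample matches p
print((centers < med).mean())  # noticeably above 0.5
```

The subsampling baseline trivially preserves p but gives irregularly spaced centers; the paper's method instead targets "evenly spaced" centers computed from the unweighted k-NN graph alone.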