## Improved fast Gauss transform and efficient kernel density estimation (2003)

### Download Links

- [www.umiacs.umd.edu]
- [lear.inrialpes.fr]
- DBLP

Venue: ICCV

Citations: 112 (7 self)

### BibTeX

@INPROCEEDINGS{Yang03improvedfast,
  author    = {Changjiang Yang and Ramani Duraiswami and Nail A. Gumerov and Larry Davis},
  title     = {Improved fast Gauss transform and efficient kernel density estimation},
  booktitle = {ICCV},
  year      = {2003},
  pages     = {464--471}
}


### Abstract

Evaluating sums of multivariate Gaussians is a common computational task in computer vision and pattern recognition, including in the general and powerful kernel density estimation technique. The quadratic computational complexity of the summation is a significant barrier to the scalability of this algorithm to practical applications. The fast Gauss transform (FGT) has successfully accelerated kernel density estimation to linear running time for low-dimensional problems. Unfortunately, the cost of a direct extension of the FGT to higher-dimensional problems grows exponentially with dimension, making it impractical for dimensions above three. We develop an improved fast Gauss transform to efficiently estimate sums of Gaussians in higher dimensions, where a new multivariate expansion scheme and an adaptive space subdivision technique dramatically improve the performance. The improved FGT has been applied to the mean shift algorithm, achieving linear computational complexity. Experimental results demonstrate the efficiency and effectiveness of our algorithm.
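The quadratic-cost summation that the FGT and IFGT accelerate is the discrete Gauss transform, G(y_j) = Σ_i q_i exp(−‖y_j − x_i‖²/h²). A direct O(NM) evaluation, which the paper's methods approximate in linear time, can be sketched as follows (a minimal illustration with hypothetical names, not the authors' code):

```python
import numpy as np

def gauss_transform_direct(sources, targets, weights, h):
    """Direct O(N*M) evaluation of G(y_j) = sum_i q_i exp(-||y_j - x_i||^2 / h^2).

    sources: (N, d) points x_i; targets: (M, d) points y_j;
    weights: (N,) coefficients q_i; h: Gaussian bandwidth.
    """
    # Pairwise squared distances between every target and every source:
    # this (M, N) matrix is exactly the quadratic cost the IFGT avoids.
    d2 = ((targets[:, None, :] - sources[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / h**2) @ weights
```

Dividing the result by the appropriate normalizing constant turns this sum into a kernel density estimate at the target points.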

### Citations

2756 | Normalized cuts and image segmentation
- Shi, Malik
- 2000

Citation Context: ...idth and a better density estimate. Many approaches in computer vision and pattern recognition use kernel density estimation, including support vector machines [23], M-estimation [18], normalized cut [24] and mean shift analysis [5]. With enough samples, the kernel density estimates provably converge to any arbitrary density function. On the other hand, the number of samples needed may be very large a...

2096 | Pattern classification
- Duda, Hart, et al.
- 2000

Citation Context: ...tiveness of our algorithm. 1 Introduction In most computer vision and pattern recognition applications, the feature space is complex, noisy and rarely can be described by the common parametric models [7], since the forms of the underlying density functions are in general unknown. In particular, data in high-dimensional feature space becomes more sparse and scattered, making it much more difficult to ...

1607 | Mean shift: a robust approach toward feature space analysis
- Comaniciu, Meer
- 2002

Citation Context: ...imate. Many approaches in computer vision and pattern recognition use kernel density estimation, including support vector machines [23], M-estimation [18], normalized cut [24] and mean shift analysis [5]. With enough samples, the kernel density estimates provably converge to any arbitrary density function. On the other hand, the number of samples needed may be very large and much greater than would b...

782 | A Fast Algorithm for Particle Simulations
- Greengard, Rokhlin
- 1987

Citation Context: ...ced the computational complexity to linear time. 2 FMM and FGT The fast Gauss transform introduced by Greengard and Strain [15, 26] is an important variant of the more general fast multipole method [13, 16]. Originally the FMM was developed for the fast summation of potential fields generated by a large number of sources, such as those arising in gravitational or electrostatic potential problems in two ...

422 | Mean shift, mode seeking, and clustering
- Cheng
- 1995

Citation Context: ... many applications such as in video sequence analysis and eigenspace based approaches. We also show how the IFGT can be applied to the kernel density estimation. Specifically the mean shift algorithm [11, 4, 5] is chosen as a case study for the IFGT. The mean shift algorithm is based on the KDE and recently rediscovered as a robust clustering method. However, the mean shift algorithm suffers from the quadra...
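The mean shift iteration mentioned in the context above, a KDE-based mode seeker, can be sketched as follows (a minimal version with a Gaussian kernel; the function names, fixed bandwidth, and stopping rule are illustrative assumptions, not the paper's implementation):

```python
import numpy as np

def mean_shift_step(points, y, h):
    # One Gaussian-kernel mean-shift step: move y to the weighted mean
    # of the data, with weights exp(-||y - x_i||^2 / h^2).
    w = np.exp(-((points - y) ** 2).sum(axis=1) / h**2)
    return (w[:, None] * points).sum(axis=0) / w.sum()

def mean_shift(points, y, h, tol=1e-8, max_iter=500):
    # Iterate until the shift is tiny; y converges to a mode of the KDE.
    # Each step costs O(N); running it from M starting points gives the
    # quadratic O(NM) total cost that the IFGT reduces to linear.
    for _ in range(max_iter):
        y_new = mean_shift_step(points, y, h)
        if np.linalg.norm(y_new - y) < tol:
            return y_new
        y = y_new
    return y
```

Each step evaluates exactly the Gaussian sums the IFGT approximates, which is why the accelerated transform drops the per-iteration cost from quadratic to linear.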

324 | The estimation of the gradient of a density function, with applications in pattern recognition
- Fukunaga, Hostetler
- 1975

Citation Context: ... many applications such as in video sequence analysis and eigenspace based approaches. We also show how the IFGT can be applied to the kernel density estimation. Specifically the mean shift algorithm [11, 4, 5] is chosen as a case study for the IFGT. The mean shift algorithm is based on the KDE and recently rediscovered as a robust clustering method. However, the mean shift algorithm suffers from the quadra...

316 | Remarks on some nonparametric estimates of a density function
- Rosenblatt
- 1956

Citation Context: ...icult to fit them with a single high-dimensional density function. By contrast, without the assumption that the form of the underlying densities are known, nonparametric density estimation techniques [22, 20] have been widely used to analyze arbitrarily structured feature spaces. The most widely studied and used nonparametric technique is kernel density estimation (KDE), first introduced by Rosenblatt [22...

297 | Clustering to minimize the maximum intercluster distance
- Gonzalez
- 1985

Citation Context: ...,...,Sk, and also the cluster centers c1,...,ck, so as to minimize the cost function, the maximum radius of the clusters: max_i max_{v∈S_i} ‖v − c_i‖. The k-center problem is known to be NP-hard [2]. Gonzalez [12] proposed a very simple greedy algorithm, called farthest-point clustering. Initially pick an arbitrary point v0 as the center of the first cluster and add it to the center set C. Then for i = 1 to k do...
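Gonzalez's greedy farthest-point clustering, as the context above describes it, can be sketched as follows (a minimal O(nk) version; the O(n log k) two-phase variant of Feder and Greene is not shown, and the names are illustrative):

```python
import numpy as np

def farthest_point_clustering(points, k):
    # Gonzalez's greedy 2-approximation for k-center: start from an
    # arbitrary center, then repeatedly promote the point farthest from
    # its nearest chosen center.
    centers = [0]                                      # arbitrary first center
    dist = np.linalg.norm(points - points[0], axis=1)  # dist to nearest center
    for _ in range(k - 1):
        nxt = int(np.argmax(dist))                     # farthest point so far
        centers.append(nxt)
        dist = np.minimum(dist, np.linalg.norm(points - points[nxt], axis=1))
    return centers, float(dist.max())                  # center indices, max radius
```

Returning the maximum radius alongside the centers also supports the stopping rule mentioned later: keep adding centers until the radius falls below a threshold.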

245 | On estimation of a probability density function and
- Parzen
- 1962

Citation Context: ...icult to fit them with a single high-dimensional density function. By contrast, without the assumption that the form of the underlying densities are known, nonparametric density estimation techniques [22, 20] have been widely used to analyze arbitrarily structured feature spaces. The most widely studied and used nonparametric technique is kernel density estimation (KDE), first introduced by Rosenblatt [22...

196 | A best possible heuristic for the k-center problem
- Hochbaum, Shmoys
- 1985

Citation Context: ...ation algorithm which computes a partition with maximum radius at most twice the optimum. The proof uses no geometry beyond the triangle inequality, so it holds for any metric space. Hochbaum and Shmoys [17] proved that the factor 2 cannot be improved unless P = NP. The direct implementation of farthest-point clustering has running time O(nk). Feder and Greene [9] give a two-phase algorithm with optimal r...

173 | Optimal algorithms for approximate clustering
- Feder, Greene
- 1988

Citation Context: ...or any metric space. Hochbaum and Shmoys [17] proved that the factor 2 cannot be improved unless P = NP. The direct implementation of farthest-point clustering has running time O(nk). Feder and Greene [9] give a two-phase algorithm with optimal running time O(n log k). The predefined number of clusters k can be determined as follows: run the farthest-point algorithm until the maximum radius of cluster...

140 | The fast Gauss transform
- Greengard, Strain
- 1991

Citation Context: ...e fast multipole method (FMM) and fast Gauss transform (FGT) have been used to reduce the computational time of kernel density estimation to O(M + N) time, where the data are not necessarily on grids [15, 8]. As faster computers and better video cameras become cheaper, the collection of sufficient data is becoming possible, which results in a steady increase in the size of the dataset and the number of t...

80 | Approximation algorithms for geometric problems
- Bern, Eppstein

Citation Context: ...to clusters S1,...,Sk, and also the cluster centers c1,...,ck, so as to minimize the cost function, the maximum radius of the clusters: max_i max_{v∈S_i} ‖v − c_i‖. The k-center problem is known to be NP-hard [2]. Gonzalez [12] proposed a very simple greedy algorithm, called farthest-point clustering. Initially pick an arbitrary point v0 as the center of the first cluster and add it to the center set C. Then ...

58 | Learning with Kernels: Support Vector Machines, Regularization, Optimization and Beyond
- Schölkopf, Smola
- 2002

Citation Context: ... more data points allow a narrower bandwidth and a better density estimate. Many approaches in computer vision and pattern recognition use kernel density estimation, including support vector machines [23], M-estimation [18], normalized cut [24] and mean shift analysis [5]. With enough samples, the kernel density estimates provably converge to any arbitrary density function. On the other hand, the numb...

54 | Estimation of multivariate density
- Cacoullos
- 1966

Citation Context: ...feature spaces. The most widely studied and used nonparametric technique is kernel density estimation (KDE), first introduced by Rosenblatt [22], then discussed in detail by Parzen [20] and Cacoullos [3]. In this technique the density function is estimated by a sum of kernel functions (typically Gaussians)...

43 | Robust statistical procedures
- Huber
- 1997

Citation Context: ...llow a narrower bandwidth and a better density estimate. Many approaches in computer vision and pattern recognition use kernel density estimation, including support vector machines [23], M-estimation [18], normalized cut [24] and mean shift analysis [5]. With enough samples, the kernel density estimates provably converge to any arbitrary density function. On the other hand, the number of samples neede...

29 | Efficient non-parametric adaptive color modeling using fast gauss transform
- Elgammal, Duraiswami, et al.

Citation Context: ...e fast multipole method (FMM) and fast Gauss transform (FGT) have been used to reduce the computational time of kernel density estimation to O(M + N) time, where the data are not necessarily on grids [15, 8]. As faster computers and better video cameras become cheaper, the collection of sufficient data is becoming possible, which results in a steady increase in the size of the dataset and the number of t...

29 | The reduced Parzen classifier
- Fukunaga, Hayes
- 1989

Citation Context: ...es can be roughly divided into two categories. One is based on the k-nearest neighbor searching, where spatial data structures and/or branch and bound are employed to achieve the computational saving [21, 6, 10, 19]. The other is based on the fast Fourier transform (FFT) for evaluating density estimates on gridded data which, however, are unavailable for most applications [25]. Recently the fast multipole method (FMM)...

24 | A new error estimate of the fast gauss transform
- Baxter, Roussos

Citation Context: ...a Taylor series is achieved via a translation operation. The error bound estimate given by Greengard and Strain [15] is incorrect, and a new and more complicated error bound estimate was presented in [1]. The extension to higher dimensions was done by treating the multivariate Gaussian as a product of univariate Gaussians, applying the series factorizations (7) and (8) to each dimension. For convenie...

21 | A Fast Algorithm for the Evaluation of Heat Potentials
- Greengard, Strain
- 1990

Citation Context: ...this defect in higher dimensions, the FGT is quite effective for two and three-dimensional problems, and has already achieved success in some physics, computer vision and pattern recognition problems [14, 8]. Another serious defect of the original FGT is the use of the box data structure. The original FGT subdivides the space into boxes using a uniform mesh. However, such a simple space subdivision schem...

20 | The fast Gauss transform with variable scales
- Strain
- 1991

Citation Context: ...exity, especially in higher dimensions. The proposed IFGT successfully reduced the computational complexity to linear time. 2 FMM and FGT The fast Gauss transform introduced by Greengard and Strain [15, 26] is an important variant of the more general fast multipole method [13, 16]. Originally the FMM was developed for the fast summation of potential fields generated by a large number of sources, such as...

19 | Mean-shift analysis using quasi-newton methods
- Yang, Duraiswami, et al.

Citation Context: ...ry results. In future work, we will study the capability of the IFGT in applications such as learning kernel classifiers (SVM) and object tracking. We also plan to combine the IFGT with other techniques [27] to further improve the performance of the mean shift algorithm. Appendix: Error Bound of Improved FGT The error due to the truncation of series (19) after order p and the cutoff error satisfies the b...

15 | Data Structures, Optimal Choice of Parameters, and Complexity Results for Generalized Multilevel Fast Multipole Methods in d Dimensions
- Gumerov, Duraiswami, et al.
- 2003

14 | Algorithm AS 176: Kernel density estimation using the fast Fourier transform
- Silverman
- 1982

Citation Context: ...hieve the computational saving [21, 6, 10, 19]. The other is based on the fast Fourier transform (FFT) for evaluating density estimates on gridded data which, however, are unavailable for most applications [25]. Recently the fast multipole method (FMM) and fast Gauss transform (FGT) have been used to reduce the computational time of kernel density estimation to O(M + N) time, where the data are not necessar...

13 | Fast Parzen density estimation using clustering-based branch and bound
- Jeon, Landgrebe
- 1994

Citation Context: ...es can be roughly divided into two categories. One is based on the k-nearest neighbor searching, where spatial data structures and/or branch and bound are employed to achieve the computational saving [21, 6, 10, 19]. The other is based on the fast Fourier transform (FFT) for evaluating density estimates on gridded data which, however, are unavailable for most applications [25]. Recently the fast multipole method (FMM)...

5 | A fast algorithm for nonparametric probability density estimation
- Postaire, Vasseur
- 1982

Citation Context: ...es can be roughly divided into two categories. One is based on the k-nearest neighbor searching, where spatial data structures and/or branch and bound are employed to achieve the computational saving [21, 6, 10, 19]. The other is based on the fast Fourier transform (FFT) for evaluating density estimates on gridded data which, however, are unavailable for most applications [25]. Recently the fast multipole method (FMM)...

4 | Data structures in kernel density estimation
- Devroye, Machell
- 1985