Results 1–10 of 107
Background and Foreground Modeling Using Nonparametric Kernel Density Estimation for Visual Surveillance
 PROCEEDINGS OF THE IEEE
, 2002
"... ... This paper focuses on two issues related to this problem. First, we construct a statistical representation of the scene background that supports sensitive detection of moving objects in the scene, but is robust to clutter arising out of natural scene variations. Second, we build statistical repr ..."
Abstract

Cited by 164 (7 self)
 Add to MetaCart
... This paper focuses on two issues related to this problem. First, we construct a statistical representation of the scene background that supports sensitive detection of moving objects in the scene, but is robust to clutter arising out of natural scene variations. Second, we build statistical representations of the foreground regions (moving objects) that support their tracking and support occlusion reasoning. The probability density functions (pdfs) associated with the background and foreground are likely to vary from image to image and will not in general have a known parametric form. We accordingly utilize general nonparametric kernel density estimation techniques for building these statistical representations of the background and the foreground. These techniques estimate the pdf directly from the data without any assumptions about the underlying distributions. Example results from applications are presented
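The per-pixel background model this abstract describes can be sketched as follows. This is a minimal illustration, not the paper's exact model (which also handles color channels and data-driven bandwidth selection); the helper name and bandwidth value are assumptions for the example.

```python
import numpy as np

def kde_background_prob(history, pixel, bandwidth=10.0):
    """Gaussian kernel density estimate of P(intensity | background)
    from a pixel's recent intensity samples (illustrative sketch;
    the bandwidth value here is an arbitrary assumption)."""
    diffs = (history - pixel) / bandwidth
    kernels = np.exp(-0.5 * diffs ** 2) / (bandwidth * np.sqrt(2.0 * np.pi))
    return kernels.mean()

# A pixel whose recent samples hover near intensity 100:
history = np.array([98.0, 101.0, 99.5, 100.2, 102.0])
p_bg = kde_background_prob(history, 100.0)  # high density: likely background
p_fg = kde_background_prob(history, 200.0)  # near-zero density: moving object
```

A new frame's pixel is flagged as foreground when its estimated density under the background model falls below a threshold; no parametric form for the background distribution is ever assumed.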
Improved fast Gauss transform and efficient kernel density estimation
 In ICCV
, 2003
"... Evaluating sums of multivariate Gaussians is a common computational task in computer vision and pattern recognition, including in the general and powerful kernel density estimation technique. The quadratic computational complexity of the summation is a significant barrier to the scalability of this ..."
Abstract

Cited by 103 (7 self)
 Add to MetaCart
Evaluating sums of multivariate Gaussians is a common computational task in computer vision and pattern recognition, including in the general and powerful kernel density estimation technique. The quadratic computational complexity of the summation is a significant barrier to the scalability of this algorithm to practical applications. The fast Gauss transform (FGT) has successfully accelerated kernel density estimation to linear running time for low-dimensional problems. Unfortunately, the cost of a direct extension of the FGT to higher-dimensional problems grows exponentially with dimension, making it impractical for dimensions above 3. We develop an improved fast Gauss transform to efficiently estimate sums of Gaussians in higher dimensions, where a new multivariate expansion scheme and an adaptive space subdivision technique dramatically improve the performance. The improved FGT has been applied to the mean shift algorithm, achieving linear computational complexity. Experimental results demonstrate the efficiency and effectiveness of our algorithm.
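For reference, the quadratic-cost computation that the FGT and its improved variant accelerate is the direct discrete Gauss transform. A minimal sketch of the O(MN) baseline (function name and toy sizes are illustrative):

```python
import numpy as np

def gauss_sum_direct(sources, targets, weights, h):
    """Direct evaluation of G(y_j) = sum_i w_i exp(-||y_j - x_i||^2 / h^2),
    the sum the fast Gauss transform accelerates. Cost is O(M*N):
    one kernel evaluation per (source, target) pair."""
    d2 = ((targets[:, None, :] - sources[None, :, :]) ** 2).sum(-1)
    return (np.exp(-d2 / h ** 2) * weights).sum(axis=1)

rng = np.random.default_rng(0)
x = rng.normal(size=(50, 3))   # N = 50 sources in 3-D
y = rng.normal(size=(20, 3))   # M = 20 evaluation points
w = np.full(50, 1.0 / 50)      # uniform weights: a KDE up to normalization
g = gauss_sum_direct(x, y, w, h=1.0)
```

The improved FGT replaces the explicit (M, N) pairwise distance matrix with truncated multivariate expansions over an adaptive space subdivision, bringing the cost down to linear.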
Penalty Methods For American Options With Stochastic Volatility
, 1998
"... The American early exercise constraint can be viewed as transforming the two dimensional stochastic volatility option pricing PDE into a differential algebraic equation (DAE). Several methods are described for forcing the algebraic constraint by using a penalty source term in the discrete equations. ..."
Abstract

Cited by 62 (18 self)
 Add to MetaCart
The American early exercise constraint can be viewed as transforming the two-dimensional stochastic volatility option pricing PDE into a differential algebraic equation (DAE). Several methods are described for enforcing the algebraic constraint by using a penalty source term in the discrete equations. The resulting nonlinear algebraic equations are solved using an approximate Newton iteration. The solution of the Jacobian is obtained using an incomplete LU (ILU) preconditioned PCG method. Some example computations are presented for option pricing problems based on a stochastic volatility model, including an exotic American chooser option written on a put and call with discrete double knockout barriers and discrete dividends.
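The paper couples the penalty term with an approximate Newton iteration over the full discrete system; the core idea can be seen at a single grid node. In this hypothetical sketch (the function and numbers are illustrative, not the paper's scheme), the penalized equation v = v_unc + rho * max(payoff - v, 0) has a closed-form solution showing how a large rho enforces V >= payoff:

```python
def penalized_value(v_unc, payoff, rho=1e6):
    """Single-node sketch of a penalty source term for the American
    constraint V >= payoff. If the unconstrained value violates the
    constraint, solving v = v_unc + rho*(payoff - v) gives
    v = (v_unc + rho*payoff) / (1 + rho), which is ~= payoff for
    large rho; otherwise the penalty is inactive and v = v_unc."""
    if v_unc >= payoff:
        return v_unc
    return (v_unc + rho * payoff) / (1.0 + rho)

v_active = penalized_value(0.8, 1.0)    # constraint binds: pushed to ~payoff
v_inactive = penalized_value(1.5, 1.0)  # constraint slack: value unchanged
```

In the actual method the max(payoff - v, 0) term makes the discrete equations nonlinear, which is why a Newton-type iteration with an ILU-preconditioned PCG inner solve is needed.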
Nonrigid point set registration: Coherent Point Drift (CPD)
 IN ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 19
, 2006
"... We introduce Coherent Point Drift (CPD), a novel probabilistic method for nonrigid registration of point sets. The registration is treated as a Maximum Likelihood (ML) estimation problem with motion coherence constraint over the velocity field such that one point set moves coherently to align with ..."
Abstract

Cited by 48 (0 self)
 Add to MetaCart
We introduce Coherent Point Drift (CPD), a novel probabilistic method for nonrigid registration of point sets. The registration is treated as a Maximum Likelihood (ML) estimation problem with a motion coherence constraint over the velocity field, such that one point set moves coherently to align with the second. We formulate the motion coherence constraint and derive a solution of the regularized ML estimation through a variational approach, which leads to an elegant kernel form. We also derive the EM algorithm for the penalized ML optimization with deterministic annealing. The CPD method simultaneously finds both the nonrigid transformation and the correspondence between two point sets without making any prior assumption about the transformation model except that of motion coherence. The method can estimate complex nonlinear nonrigid transformations, and is shown to be accurate on 2D and 3D examples and robust in the presence of outliers and missing points.
A Nonparametric Statistical Method for Image Segmentation Using Information Theory and Curve Evolution
 IEEE Transactions on Image Processing
, 2005
"... Abstract—In this paper, we present a new informationtheoretic approach to image segmentation. We cast the segmentation problem as the maximization of the mutual information between the region labels and the image pixel intensities, subject to a constraint on the total length of the region boundarie ..."
Abstract

Cited by 46 (0 self)
 Add to MetaCart
In this paper, we present a new information-theoretic approach to image segmentation. We cast the segmentation problem as the maximization of the mutual information between the region labels and the image pixel intensities, subject to a constraint on the total length of the region boundaries. We assume that the probability densities associated with the image pixel intensities within each region are completely unknown a priori, and we formulate the problem based on nonparametric density estimates. Due to the nonparametric structure, our method does not require the image regions to have a particular type of probability distribution and does not require the extraction and use of a particular statistic. We solve the information-theoretic optimization problem by deriving the associated gradient flows and applying curve evolution techniques. We use level-set methods to implement the resulting evolution. The experimental results based on both synthetic and real images demonstrate that the proposed technique can solve a variety of challenging image segmentation problems. Furthermore, our method, which does not require any training, performs as well as methods based on training. Index Terms—Curve evolution, image segmentation, information theory, level-set methods, nonparametric density estimation.
Efficient Kernel Machines Using the Improved Fast Gauss Transform
 Advances in Neural Information Processing Systems 17
, 2004
"... The computation required for kernel machines with N training samples is O(N ). Such computational complexity is significant even for moderate size problems and is prohibitive for large datasets. We present an approximation technique based on the improved fast Gauss transform to reduce the com ..."
Abstract

Cited by 41 (6 self)
 Add to MetaCart
The computation required for kernel machines with N training samples is O(N²). Such computational complexity is significant even for moderate-size problems and is prohibitive for large datasets. We present an approximation technique based on the improved fast Gauss transform to reduce the computation to O(N). We also give an error bound for the approximation, and provide experimental results on the UCI datasets.
Efficient Kernel Density Estimation using the Fast Gauss Transform with Applications to Color Modeling and Tracking
 IEEE Transactions on Pattern Analysis and Machine Intelligence
, 2003
"... The study of many vision problems is reduced to the estimation of a probability density function from observations. Kernel density estimation techniques are quite general and powerful methods for this problem, but have a significant disadvantage in that they are computationally intensive. In this pa ..."
Abstract

Cited by 39 (0 self)
 Add to MetaCart
The study of many vision problems is reduced to the estimation of a probability density function from observations. Kernel density estimation techniques are quite general and powerful methods for this problem, but have a significant disadvantage in that they are computationally intensive. In this paper we explore the use of kernel density estimation with the fast Gauss transform (FGT) for problems in vision. The FGT allows the summation of a mixture of M Gaussians at N evaluation points in O(M + N) time, as opposed to O(MN) time for a naive evaluation, and can be used to considerably speed up kernel density estimation. We present applications of the technique to problems from image segmentation and tracking, and show that the algorithm allows application of advanced statistical techniques to solve practical vision problems in real time with today's computers.
Efficient mean-shift tracking via a new similarity measure
 in Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR ’05
, 2005
"... The mean shift algorithm has achieved considerable success in object tracking due to its simplicity and robustness. It finds local minima of a similarity measure between the color histograms or kernel density estimates of the model and target image. The most typically used similarity measures are th ..."
Abstract

Cited by 36 (4 self)
 Add to MetaCart
The mean shift algorithm has achieved considerable success in object tracking due to its simplicity and robustness. It finds local minima of a similarity measure between the color histograms or kernel density estimates of the model and target image. The most typically used similarity measures are the Bhattacharyya coefficient or the Kullback-Leibler divergence. In practice, these approaches face three difficulties. First, the spatial information of the target is lost when the color histogram is employed, which precludes the application of more elaborate motion models. Second, the classical similarity measures are not very discriminative. Third, the sample-based classical similarity measures require a calculation that is quadratic in the number of samples, making real-time performance difficult. To deal with these difficulties we propose a new, simple-to-compute and more discriminative similarity measure in spatial-feature spaces. The new similarity measure allows the mean shift algorithm to track more general motion models in an integrated way. To reduce the complexity of the computation to linear order we employ the recently proposed improved fast Gauss transform. This leads to a very efficient and robust nonparametric spatial-feature tracking algorithm. The algorithm is tested on several image sequences and shown to achieve robust and reliable frame-rate tracking.
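The mean-shift step underlying these tracking papers is a fixed-point iteration toward a local mode of a kernel density estimate. A toy sketch with a plain Gaussian kernel (this illustrates the iteration only, not the paper's spatial-feature similarity measure; the function name and bandwidth are assumptions):

```python
import numpy as np

def mean_shift_mode(samples, start, h=1.0, iters=30):
    """Plain mean-shift iteration with a Gaussian kernel: repeatedly
    move the estimate to the kernel-weighted mean of the samples,
    which ascends the KDE toward a local density mode."""
    x = np.asarray(start, dtype=float)
    for _ in range(iters):
        w = np.exp(-((samples - x) ** 2).sum(-1) / (2.0 * h ** 2))
        x = (w[:, None] * samples).sum(0) / w.sum()
    return x

# Two well-separated 2-D clusters; starting near the first one,
# the iteration converges to that cluster's mode.
pts = np.concatenate([np.random.default_rng(1).normal(0.0, 0.3, (100, 2)),
                      np.random.default_rng(2).normal(5.0, 0.3, (100, 2))])
mode = mean_shift_mode(pts, start=[0.5, 0.5], h=0.5)
```

Each iteration as written costs O(N) kernel evaluations per step over the N samples; the quadratic cost the abstract mentions arises when a sample-based similarity between two point sets is evaluated, which is where the improved fast Gauss transform comes in.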
Robust Numerical Methods for Contingent Claims under Jump Diffusion Processes
 IMA Journal of Numerical Analysis
, 2003
"... An implicit method is developed for the numerical solution of option pricing models where it is assumed that the underlying process is a jump diffusion. This method can be applied to a variety of contingent claim valuations, including American options, various kinds of exotic options, and models wit ..."
Abstract

Cited by 32 (13 self)
 Add to MetaCart
An implicit method is developed for the numerical solution of option pricing models where the underlying process is assumed to be a jump diffusion. This method can be applied to a variety of contingent claim valuations, including American options, various kinds of exotic options, and models with uncertain volatility or transaction costs. Proofs of time-stepping stability and convergence of a fixed-point iteration scheme are presented. For typical model parameters, it is shown that the fixed-point iteration reduces the error by two orders of magnitude at each iteration. The correlation integral is computed using a fast Fourier transform (FFT) method. Techniques are developed for avoiding wrap-around effects. Numerical tests of convergence for a variety of options are presented.
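The FFT trick for the correlation integral rests on the circular convolution theorem: a grid convolution with the jump-size density becomes a pointwise product in frequency space. A minimal sketch with a toy value grid and density (the grid, density, and the paper's wrap-around handling are all simplified away here):

```python
import numpy as np

# Toy value grid and a normalized jump-size density on 64 nodes.
v = np.sin(np.linspace(0.0, np.pi, 64))
g = np.exp(-np.linspace(-2.0, 2.0, 64) ** 2)
g = g / g.sum()

# Circular convolution via FFT: O(n log n) instead of O(n^2).
conv_fft = np.real(np.fft.ifft(np.fft.fft(v) * np.fft.fft(g)))
```

In the actual scheme the grid must be padded so that the periodic wrap-around implied by the FFT does not pollute the solution near the domain boundaries, which is what the paper's wrap-around techniques address.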
A short course on fast multipole methods
 Wavelets, Multilevel Methods and Elliptic PDEs
, 1997
"... In this series of lectures, we describe the analytic and computational foundations of fast multipole methods, as well as some of their applications. They are most easily understood, perhaps, in the case of particle simulations, where they reduce the cost of computing all pairwise interactions in a s ..."
Abstract

Cited by 31 (2 self)
 Add to MetaCart
In this series of lectures, we describe the analytic and computational foundations of fast multipole methods, as well as some of their applications. They are most easily understood, perhaps, in the case of particle simulations, where they reduce the cost of computing all pairwise interactions in a system of N particles from O(N²) to O(N) or O(N log N) operations. They are equally useful, however, in solving certain partial differential equations by first recasting them as integral equations. We draw heavily from the existing literature, especially Greengard [23, 24, 25], Greengard and Rokhlin [29, 32], and Greengard and Strain [34].
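The all-pairs computation that fast multipole methods accelerate is direct summation of pairwise potentials. A minimal O(N²) baseline for 1/r interactions (function name and toy configuration are illustrative):

```python
import numpy as np

def direct_potentials(points, charges):
    """All-pairs 1/r potentials by direct summation: the O(N^2)
    baseline that fast multipole methods reduce to O(N) or
    O(N log N) via hierarchical multipole expansions."""
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)  # exclude self-interaction (1/inf = 0)
    return (charges[None, :] / d).sum(axis=1)

# Three unit charges at the corners of a right triangle.
pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
q = np.ones(3)
phi = direct_potentials(pts, q)
```

The FMM avoids forming this dense pairwise matrix by grouping distant particles into clusters whose aggregate influence is captured by a truncated multipole expansion, evaluated once per cluster rather than once per particle.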