Results 1 - 10 of 362
Classification using Intersection Kernel Support Vector Machines is Efficient
Cited by 256 (10 self)
"... Straightforward classification using kernelized SVMs requires evaluating the kernel for a test vector and each of the support vectors. For a class of kernels we show that one can do this much more efficiently. In particular we show that one can build histogram intersection kernel SVMs (IKSVMs) with ..."

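The snippet above points at the key observation: the histogram intersection kernel decision function decomposes over dimensions, so each per-dimension contribution can be read off from sorted support values and prefix sums instead of touching every support vector. Below is a minimal sketch of that kind of exact fast evaluation, assuming a trained SVM's support vectors and dual coefficients are available; the function names and precomputation layout are illustrative, not the authors' code, and the paper's constant-time approximate variant via lookup tables is not reproduced here.

import numpy as np

def hik(x, y):
    # histogram intersection kernel: sum_d min(x_d, y_d)
    return np.minimum(x, y).sum()

def precompute_tables(support_vectors, dual_coefs):
    # support_vectors: (m, D) array of support vectors s_i
    # dual_coefs:      (m,) array of alpha_i * y_i
    # For each dimension d, sort the support values and precompute
    #   A[r, d] = sum of c_i * s_{i,d} over the r+1 smallest support values
    #   B[r, d] = sum of c_i over the remaining (larger) support values
    # so that f_d(t) = A[r-1, d] + t * B[r-1, d], with r = #{i : s_{i,d} <= t}.
    order = np.argsort(support_vectors, axis=0)
    sorted_sv = np.take_along_axis(support_vectors, order, axis=0)
    sorted_c = dual_coefs[order]
    A = np.cumsum(sorted_c * sorted_sv, axis=0)
    total = sorted_c.sum(axis=0)
    B = total - np.cumsum(sorted_c, axis=0)
    return sorted_sv, A, B, total

def fast_decision(x, tables, bias=0.0):
    # Evaluates f(x) = bias + sum_i alpha_i y_i k(x, s_i) in O(D log m)
    # instead of the naive O(D m).
    sorted_sv, A, B, total = tables
    f = bias
    for d in range(sorted_sv.shape[1]):
        r = np.searchsorted(sorted_sv[:, d], x[d], side='right')
        f += x[d] * total[d] if r == 0 else A[r - 1, d] + x[d] * B[r - 1, d]
    return f

# sanity check against the naive evaluation
rng = np.random.default_rng(0)
S = rng.random((50, 8)); c = rng.standard_normal(50); x = rng.random(8)
naive = sum(ci * hik(x, si) for ci, si in zip(c, S))
print(np.isclose(naive, fast_decision(x, precompute_tables(S, c))))  # True
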
Generalized Histogram Intersection Kernel for image recognition
In IEEE International Conference on Image Processing, 2005
Cited by 19 (0 self)

A Novel Approach for Efficient SVM Classification with Histogram Intersection Kernel
"... The kernel trick – commonly used in machine learning and computer vision – enables learning of non-linear decision functions without having to explicitly map the original data to a high dimensional space. However, at test time, it requires evaluating the kernel with each one of the support vectors, which is time consuming. In this paper, we propose a novel approach for learning non-linear SVM corresponding to the histogram intersection kernel without using the kernel trick. We formulate the exact non-linear problem in the original space and show how to perform classification directly in this space ..."

Efficient Semantic Segmentation with Gaussian Processes and Histogram Intersection Kernels
"... Semantic interpretation and understanding of images is an important goal of visual recognition research and offers a large variety of possible applications. One step towards this goal is semantic segmentation, which aims for automatic labeling of image regions and pixels with category names. Since usual images contain several millions of pixels, the use of kernel-based methods for the task of semantic segmentation is limited due to the involved computation times. In this paper, we overcome this drawback by exploiting efficient kernel calculations using the histogram intersection kernel for fast ..."

The pyramid match kernel: Discriminative classification with sets of image features
In ICCV, 2005
Cited by 544 (29 self)
"... Discriminative learning is challenging when examples are sets of features, and the sets vary in cardinality and lack any sort of meaningful ordering. Kernel-based classification methods can learn complex decision boundaries, but a kernel over unordered set inputs must somehow solve for correspondences – generally a computationally expensive task that becomes impractical for large set sizes. We present a new fast kernel function which maps unordered feature sets to multi-resolution histograms and computes a weighted histogram intersection in this space. This “pyramid match” computation is linear ..."

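As a rough illustration of the weighted multi-resolution intersection the abstract describes, here is a hedged sketch for one-dimensional feature sets on [0, 1); the actual pyramid match operates on multi-dimensional feature sets and normalizes by set cardinalities, which this toy version omits. The level count L and the 1/2^i weighting are illustrative choices.

import numpy as np

def pyramid_match(X, Y, L=4):
    # X, Y: 1-D arrays of feature values in [0, 1).
    # Level i uses 2**(L - i) bins (bin width doubles with i); matches newly
    # formed at level i are weighted by 1 / 2**i, so coarser matches count less.
    score, prev = 0.0, 0.0
    for i in range(L + 1):
        bins = 2 ** (L - i)
        hx, _ = np.histogram(X, bins=bins, range=(0.0, 1.0))
        hy, _ = np.histogram(Y, bins=bins, range=(0.0, 1.0))
        inter = np.minimum(hx, hy).sum()      # matches implied at this resolution
        score += (inter - prev) / (2 ** i)    # count only the newly formed matches
        prev = inter
    return score

rng = np.random.default_rng(0)
X, Y = rng.random(30), rng.random(40)
print(pyramid_match(X, Y))
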
Rapid uncertainty computation with Gaussian processes and histogram intersection kernels
In ACCV, 2012
Cited by 6 (5 self)
"An important advantage of Gaussian processes is the ability to directly estimate classification uncertainties in a Bayesian manner. In this paper, we develop techniques that allow for estimating these uncertainties with a runtime linear or even constant with respect to the number of training examples. Our approach makes use of all training data without any sparse approximation technique while needing only a linear amount of memory. To incorporate new information over time, we further derive online learning methods leading to significant speed-ups and allowing for hyperparameter optimization on-the-fly. We conduct several experiments on public image datasets for the tasks of one-class classification and active learning, where computing the uncertainty is an essential task. The experimental results highlight that we are able to compute classification uncertainties within microseconds even for large-scale datasets with tens of thousands of training examples."

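For context, the uncertainty in question is the standard Gaussian process predictive variance. The sketch below computes it naively (cubic setup, quadratic per query) with a histogram intersection covariance; the paper's contribution is computing such quantities in linear or even constant time by exploiting the structure of this kernel, which the baseline below does not attempt. The noise level and array shapes are illustrative assumptions.

import numpy as np

def hik_matrix(A, B):
    # K[i, j] = sum_d min(A[i, d], B[j, d]); fine for small problems
    return np.minimum(A[:, None, :], B[None, :, :]).sum(axis=2)

def gp_predictive_variance(X_train, x_star, noise=0.1):
    # sigma^2(x*) = k(x*, x*) - k_*^T (K + noise * I)^{-1} k_*
    K = hik_matrix(X_train, X_train) + noise * np.eye(len(X_train))
    k_star = hik_matrix(X_train, x_star[None, :])[:, 0]
    k_ss = x_star.sum()  # k(x*, x*) = sum_d min(x*_d, x*_d)
    return k_ss - k_star @ np.linalg.solve(K, k_star)

rng = np.random.default_rng(0)
X = rng.random((100, 16)); x = rng.random(16)
print(gp_predictive_variance(X, x))
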
Learning Image Similarity from Flickr Groups Using Stochastic Intersection Kernel Machines
Cited by 38 (1 self)
"... Measuring image similarity is a central topic in computer vision. In this paper, we learn similarity from Flickr groups and use it to organize photos. Two images are similar if they are likely to belong to the same Flickr groups. Our approach is enabled by a fast Stochastic Intersection Kernel Machine ..."

Beyond the Euclidean distance: Creating effective visual codebooks using the histogram intersection kernel
Cited by 42 (3 self)
"... Common visual codebook generation methods used in a Bag of Visual words model, e.g. k-means or Gaussian Mixture Model, use the Euclidean distance to cluster features into visual code words. However, most popular visual descriptors are histograms of image measurements. It has been shown that the Histogram Intersection Kernel (HIK) is more effective than the Euclidean distance in supervised learning tasks with histogram features. In this paper, we demonstrate that HIK can also be used in an unsupervised manner to significantly improve the generation of visual codebooks. We propose a histogram kernel ..."

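A minimal sketch of the general idea of HIK-based codebook construction: a k-means-style loop whose assignment step uses histogram intersection similarity instead of Euclidean distance. This is a simplified stand-in rather than the paper's exact algorithm; the codeword count k, iteration budget, and mean-based update rule are illustrative assumptions.

import numpy as np

def hik_codebook(H, k, iters=20, seed=0):
    # H: (n, D) array of histogram descriptors; k: number of codewords.
    # Assignment uses HIK similarity (larger = more similar) instead of
    # Euclidean distance; codewords are refreshed as the mean of their members.
    rng = np.random.default_rng(seed)
    codewords = H[rng.choice(len(H), size=k, replace=False)].astype(float)
    labels = np.zeros(len(H), dtype=int)
    for _ in range(iters):
        # sims[i, j] = sum_d min(H[i, d], codewords[j, d])
        sims = np.minimum(H[:, None, :], codewords[None, :, :]).sum(axis=2)
        labels = sims.argmax(axis=1)
        for j in range(k):
            members = H[labels == j]
            if len(members):
                codewords[j] = members.mean(axis=0)
    return codewords, labels

H = np.random.rand(500, 64)
codebook, assignment = hik_codebook(H, k=32)
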
Efficient Additive Kernels via Explicit Feature Maps
Cited by 245 (9 self)
"... Maji and Berg [13] have recently introduced an explicit feature map approximating the intersection kernel. This enables efficient learning methods for linear kernels to be applied to the non-linear intersection kernel, expanding the applicability of this model to much larger problems. In this paper ..."

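To illustrate what an explicit feature map for the intersection kernel looks like, here is a hedged sketch of a sampled homogeneous-kernel map in the spirit of this line of work: each coordinate x_d is expanded into 2n+1 values whose inner products approximate min(x_d, y_d). The spectrum kappa(w) = 2 / (pi * (1 + 4 w^2)) follows from min(x, y) = sqrt(x y) * exp(-|log(y/x)| / 2); the sampling step L and order n are illustrative, and the exact construction (windowing, recommended parameters) should be taken from the paper itself.

import numpy as np

def intersection_feature_map(x, n=3, L=0.7):
    # Returns a (D * (2n + 1),) vector whose inner products approximate
    # the intersection kernel sum_d min(x_d, y_d) for non-negative x.
    x = np.asarray(x, dtype=float)
    logx = np.log(np.maximum(x, 1e-12))                 # guard against log(0)
    kappa = lambda w: 2.0 / (np.pi * (1.0 + 4.0 * w * w))
    feats = [np.sqrt(x * L * kappa(0.0))]
    for j in range(1, n + 1):
        w = j * L
        scale = np.sqrt(2.0 * x * L * kappa(w))
        feats.append(scale * np.cos(w * logx))
        feats.append(scale * np.sin(w * logx))
    return np.concatenate(feats)

# quick numerical comparison against the exact kernel
a = np.array([0.2, 0.5, 0.3])
b = np.array([0.1, 0.6, 0.3])
exact = np.minimum(a, b).sum()
approx = intersection_feature_map(a) @ intersection_feature_map(b)
print(exact, approx)  # about 0.90 vs roughly 0.86 here; larger n tightens the match
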
On the Kernel of Intersecting Families
Graphs and Combinatorics, 1987
Cited by 12 (2 self)
"... Let F be a t-wise s-intersecting family, i.e., |F_1 ∩ ... ∩ F_t| ≥ s holds for every t members F_1, ..., F_t of F. Then there exists a set Y such that |F_1 ∩ ... ∩ F_t ∩ Y| ≥ s still holds for every F_1, ..., F_t ∈ F. Here exponential lower and upper bounds are proven for the possible sizes ..."