Results 1–10 of 1,369,030
Diffusion kernels on graphs and other discrete input spaces
In: Proceedings of the 19th International Conference on Machine Learning, 2002
"... The application of kernel-based learning algorithms has, so far, largely been confined to real-valued data and a few special data types, such as strings. In this paper we propose a general method of constructing natural families of kernels over discrete structures, based on the matrix exponentiation idea. In particular, we focus on generating kernels on graphs, for which we propose a special class of exponential kernels called diffusion kernels, which are based on the heat equation and can be regarded as the discretization of the familiar Gaussian kernel of Euclidean space."
Cited by 225 (7 self)
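The matrix-exponential construction this abstract describes can be sketched in a few lines of NumPy. The 4-node path graph and the diffusion parameter beta below are illustrative assumptions, not taken from the paper:

```python
import numpy as np

# Hypothetical 4-node path graph, chosen only for illustration.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

L = np.diag(A.sum(axis=1)) - A   # combinatorial graph Laplacian
beta = 0.5                       # assumed diffusion (bandwidth) parameter

# Diffusion kernel K = exp(-beta * L): the matrix exponential of the heat
# equation's generator, computed via the eigendecomposition of the
# symmetric Laplacian. K is symmetric positive definite, i.e. a valid
# kernel, and plays the role of the Gaussian kernel on this discrete space.
w, V = np.linalg.eigh(L)
K = V @ np.diag(np.exp(-beta * w)) @ V.T
```

Because L annihilates the constant vector, each row of K sums to one, matching the intuition of heat diffusing over the graph.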
Partitioning Input Space for Control Learning
"... This paper considers the effect of input-space partitioning on reinforcement learning for control. In many such learning systems, the input space is partitioned by the system designer. However, input-space partitioning could be learned. Our objective is to compare learned and programmed input-space ..."
Cited by 1 (0 self)
Input Space Versus Feature Space in Kernel-Based Methods
In: IEEE Transactions on Neural Networks, 1999
"... This paper collects some ideas targeted at advancing our understanding of the feature spaces associated with support vector (SV) kernel functions. We first discuss the geometry of feature space. In particular, we review what is known about the shape of the image of input space under the feature space ..."
Cited by 132 (5 self)
Active Learning in Discrete Input Spaces
In: Proceedings of the 34th Interface Symposium, 2002
"... Traditional design of experiments (DOE) from the statistics literature focuses on optimizing an output parameter over a space of continuous input parameters. Here we consider DOE, or active learning, for discrete input spaces. A trivial example of this is the k-armed bandit problem, which is the ..."
Cited by 11 (0 self)
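The k-armed bandit mentioned in this abstract can be sketched with a simple epsilon-greedy strategy, where each "arm" is one point of a discrete input space. The arm means, Gaussian noise model, and epsilon value below are illustrative assumptions, not the paper's setup:

```python
import random

def epsilon_greedy_bandit(true_means, steps=5000, eps=0.1, seed=0):
    """Epsilon-greedy sketch for the k-armed bandit: with probability eps,
    explore a random arm; otherwise exploit the best estimate so far."""
    rng = random.Random(seed)
    k = len(true_means)
    counts = [0] * k        # pulls per arm
    estimates = [0.0] * k   # running mean reward per arm
    for _ in range(steps):
        if rng.random() < eps:
            arm = rng.randrange(k)                           # explore
        else:
            arm = max(range(k), key=estimates.__getitem__)   # exploit
        reward = true_means[arm] + rng.gauss(0.0, 0.1)       # noisy payoff
        counts[arm] += 1
        # Incremental mean update keeps per-arm storage constant.
        estimates[arm] += (reward - estimates[arm]) / counts[arm]
    return estimates, counts

estimates, counts = epsilon_greedy_bandit([0.2, 0.5, 0.8])
```

After enough steps the highest-mean arm dominates the pull counts, which is exactly the exploration/exploitation trade-off that discrete-space active learning generalizes.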
Actions as space-time shapes
In: ICCV, 2005
"... Human action in video sequences can be seen as silhouettes of a moving torso and protruding limbs undergoing articulated motion. We regard human actions as three-dimensional shapes induced by the silhouettes in the space-time volume. We adopt a recent approach [14] for analyzing 2D shapes and genera ..."
Cited by 642 (4 self)
A theory of shape by space carving
In: Proceedings of the 7th IEEE International Conference on Computer Vision (ICCV-99), volume I, pages 307–314, Los Alamitos, CA, 1999
"... In this paper we consider the problem of computing the 3D shape of an unknown, arbitrarily-shaped scene from multiple photographs taken at known but arbitrarily-distributed viewpoints. By studying the equivalence class of all 3D shapes that reproduce the input photographs, we prove the existence of a ..."
Cited by 574 (14 self)
Fisher Discriminant Analysis With Kernels
1999
"... A nonlinear classification technique based on Fisher's discriminant is proposed. The main ingredient is the kernel trick, which allows the efficient computation of the Fisher discriminant in feature space. The linear classification in feature space corresponds to a (powerful) nonlinear decision function in input space. Large-scale simulations demonstrate the competitiveness of our approach."
Cited by 493 (18 self)
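A minimal sketch of a kernel Fisher discriminant in this spirit: solve for expansion coefficients alpha from the class kernel means and a within-class scatter matrix, with a small ridge term keeping the system invertible. The regularizer, the linear kernel, and the toy two-cluster data are assumptions for illustration, not the paper's experiments:

```python
import numpy as np

def kernel_fda(K, y, reg=1e-3):
    """Two-class kernel Fisher discriminant sketch (labels 0/1).
    K is a precomputed Gram matrix; reg is an assumed ridge term."""
    y = np.asarray(y)
    means = [K[:, y == c].mean(axis=1) for c in (0, 1)]   # class kernel means
    N = np.zeros_like(K)                                  # within-class scatter
    for c in (0, 1):
        Kc = K[:, y == c]
        nc = Kc.shape[1]
        N += Kc @ (np.eye(nc) - np.full((nc, nc), 1.0 / nc)) @ Kc.T
    # Fisher direction in the kernel expansion: (N + reg I)^{-1} (m1 - m0).
    alpha = np.linalg.solve(N + reg * np.eye(len(K)), means[1] - means[0])
    return alpha   # training point j projects to (K @ alpha)[j]

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1.0, 0.5, (25, 2)), rng.normal(1.0, 0.5, (25, 2))])
y = np.array([0] * 25 + [1] * 25)
alpha = kernel_fda(X @ X.T, y)        # linear kernel, for simplicity
scores = X @ X.T @ alpha              # 1D projections of the training set
```

Swapping in a nonlinear kernel for `X @ X.T` is what turns this linear discriminant in feature space into a nonlinear decision function in input space.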
The space complexity of approximating the frequency moments
In: Journal of Computer and System Sciences, 1996
"... The frequency moments of a sequence containing $m_i$ elements of type $i$, for $1 \le i \le n$, are the numbers $F_k = \sum_{i=1}^{n} m_i^k$. We consider the space complexity of randomized algorithms that approximate the numbers $F_k$, when the elements of the sequence are given one by one and cannot be stored. Surprisingly, ..."
Cited by 855 (12 self)
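The definition of $F_k$, together with a toy version of the random-sign $F_2$ estimator this line of work is known for, can be sketched as follows. For clarity the per-trial sign table is stored explicitly; a genuinely low-space implementation would replace it with a 4-wise independent hash function:

```python
import random
from collections import Counter

def frequency_moment(stream, k):
    # F_k = sum_i m_i^k, with m_i the number of occurrences of item i.
    return sum(m ** k for m in Counter(stream).values())

def ams_f2_estimate(stream, trials=500, seed=0):
    # Sketch of the F_2 estimator: assign each item a random sign s_i,
    # maintain Z = sum_i s_i * m_i in a single counter during one pass,
    # and use the identity E[Z^2] = F_2; averaging trials reduces variance.
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        signs = {}
        z = 0
        for x in stream:          # one pass; only z (and signs) are kept
            if x not in signs:
                signs[x] = rng.choice((-1, 1))
            z += signs[x]
        total += z * z
    return total / trials

stream = ["a", "b", "a", "c", "a", "b"]
f2 = frequency_moment(stream, 2)   # 3^2 + 2^2 + 1^2 = 14
est = ams_f2_estimate(stream)      # unbiased estimate of f2
```

Note that $F_0$ is the number of distinct elements and $F_1$ is the stream length, so the moments interpolate between several classical streaming statistics.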
Nonlinear component analysis as a kernel eigenvalue problem
1996
"... We describe a new method for performing a nonlinear form of Principal Component Analysis. By the use of integral operator kernel functions, we can efficiently compute principal components in high-dimensional feature spaces, related to input space by some nonlinear map; for instance the space of all ..."
Cited by 1554 (85 self)
Learning the Kernel Matrix with Semi-Definite Programming
2002
"... Kernel-based learning algorithms work by embedding the data into a Euclidean space, and then searching for linear relations among the embedded data points. The embedding is performed implicitly, by specifying the inner products between each pair of points in the embedding space. This information is contained in the so-called kernel matrix, a symmetric and positive definite matrix that encodes the relative positions of all points. Specifying this matrix amounts to specifying the geometry of the embedding space and inducing a notion of similarity in the input space: classical model selection ..."
Cited by 780 (22 self)
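The kernel matrix described here is often parameterized as a nonnegative combination of fixed base Gram matrices, which stays symmetric and positive semidefinite by construction. The sketch below fixes the combination weights by hand rather than learning them by semidefinite programming, and the toy data and weight values are assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 4))

# Two base Gram matrices: linear and RBF.
K_lin = X @ X.T
sq = np.sum(X**2, axis=1)
K_rbf = np.exp(-0.5 * (sq[:, None] + sq[None, :] - 2 * K_lin))

# Nonnegative weights keep the combination a valid kernel matrix;
# the paper's contribution is choosing such weights by optimization.
mu = np.array([0.3, 0.7])
K = mu[0] * K_lin + mu[1] * K_rbf
```

Each choice of weights induces a different embedding geometry, which is why selecting the kernel matrix is itself a model selection problem.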