Results 1 - 10 of 414
A Reproducing Kernel Hilbert Space Framework for Information-Theoretic Learning
"... Abstract—This paper provides a functional analysis perspective of information-theoretic learning (ITL) by defining bottom-up a reproducing kernel Hilbert space (RKHS) uniquely determined by the symmetric nonnegative definite kernel function known as the cross-information potential (CIP). The CIP as ..."
Cited by 13 (8 self)
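The CIP is the functional V(p, q) = ∫ p(x) q(x) dx, and its standard sample estimator in the ITL literature is a double sum of Gaussian kernels over the two samples. A minimal sketch of that estimator in Python with NumPy (the kernel width sigma is an illustrative choice, and the function names are ours, not the paper's):

import numpy as np

def gaussian(u, sigma):
    # Gaussian density G_sigma(u), applied elementwise
    return np.exp(-u ** 2 / (2 * sigma ** 2)) / (np.sqrt(2 * np.pi) * sigma)

def cross_information_potential(x, y, sigma=1.0):
    # Parzen estimate of V(p, q) = integral p(u) q(u) du from samples x ~ p, y ~ q.
    # Convolving two width-sigma Parzen windows gives a width sigma*sqrt(2) kernel,
    # so the estimator is the mean of G over all pairwise differences.
    diffs = x[:, None] - y[None, :]
    return gaussian(diffs, sigma * np.sqrt(2)).mean()

Symmetry and nonnegative definiteness of this pairwise-sum form are what let it serve as the reproducing kernel of the RKHS the abstract describes.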
Online Learning with Kernels
- 2003
"... Kernel based algorithms such as support vector machines have achieved considerable success in various problems in the batch setting where all of the training data is available in advance. Support vector machines combine the so-called kernel trick with the large margin idea. There has been little u ..."
Abstract
-
Cited by 2831 (123 self)
- Add to MetaCart
use of these methods in an online setting suitable for real-time applications. In this paper we consider online learning in a Reproducing Kernel Hilbert Space. By considering classical stochastic gradient descent within a feature space, and the use of some straightforward tricks, we develop simple
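A sketch of what stochastic gradient descent in an RKHS looks like, assuming a hinge loss and a Gaussian kernel (both illustrative choices; this is the general NORMA-style pattern rather than the paper's exact algorithm): the hypothesis stays a kernel expansion over past examples, old coefficients decay under the regularizer's gradient, and margin errors append a new term.

import numpy as np

def rbf(x, z, gamma=1.0):
    return np.exp(-gamma * np.sum((np.asarray(x) - np.asarray(z)) ** 2))

def online_kernel_classifier(stream, eta=0.1, lam=0.01, gamma=1.0):
    # f is kept as a kernel expansion f(x) = sum_i alpha_i k(x_i, x)
    xs, alphas = [], []
    for x, y in stream:                  # labels y in {-1, +1}
        f = sum(a * rbf(xi, x, gamma) for xi, a in zip(xs, alphas))
        # the regularizer's gradient shrinks every old coefficient
        alphas = [(1 - eta * lam) * a for a in alphas]
        if y * f < 1:                    # hinge loss active: add a new term
            xs.append(x)
            alphas.append(eta * y)
    return xs, alphas

The expansion grows with the number of margin errors, which is why truncation tricks of the kind the abstract alludes to matter for real-time use.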
Manifold regularization: A geometric framework for learning from labeled and unlabeled examples
- JOURNAL OF MACHINE LEARNING RESEARCH, 2006
"... We propose a family of learning algorithms based on a new form of regularization that allows us to exploit the geometry of the marginal distribution. We focus on a semi-supervised framework that incorporates labeled and unlabeled data in a general-purpose learner. Some transductive graph learning al ..."
Abstract
-
Cited by 578 (16 self)
- Add to MetaCart
algorithms and standard methods including Support Vector Machines and Regularized Least Squares can be obtained as special cases. We utilize properties of Reproducing Kernel Hilbert spaces to prove new Representer theorems that provide theoretical basis for the algorithms. As a result (in contrast to purely
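For the Regularized Least Squares special case, the Representer theorem mentioned above yields an expansion over all labeled and unlabeled points, and the coefficients solve a single linear system. A sketch under that reading (Gram matrix K and graph Laplacian L are assumed precomputed; the hyperparameters lam_A and lam_I are illustrative):

import numpy as np

def laplacian_rls(K, L, y_labeled, lam_A=1e-2, lam_I=1e-2):
    # Coefficients alpha of f(x) = sum_i alpha_i k(x_i, x) over all n = l + u points,
    # where the first l points carry labels and the remaining u are unlabeled.
    n = K.shape[0]
    l = len(y_labeled)
    u = n - l
    J = np.zeros((n, n))
    J[:l, :l] = np.eye(l)                # keeps only the labeled rows of K
    y = np.zeros(n)
    y[:l] = y_labeled                    # zero targets on unlabeled points
    A = J @ K + lam_A * l * np.eye(n) + (lam_I * l / (u + l) ** 2) * (L @ K)
    return np.linalg.solve(A, y)

The L @ K term is the intrinsic (manifold) regularizer: it penalizes functions that vary sharply across edges of the data graph.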
Max-margin Markov networks
- 2003
"... In typical classification tasks, we seek a function which assigns a label to a single object. Kernel-based approaches, such as support vector machines (SVMs), which maximize the margin of confidence of the classifier, are the method of choice for many such tasks. Their popularity stems both from the ..."
Abstract
-
Cited by 604 (15 self)
- Add to MetaCart
the ability to use high-dimensional feature spaces, and from their strong theoretical guarantees. However, many real-world tasks involve sequential, spatial, or structured data, where multiple labels must be assigned. Existing kernel-based methods ignore structure in the problem, assigning labels
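Ignoring efficiency, the margin idea extends to joint labelings in a few lines: the true labeling should outscore every alternative by at least its Hamming distance from the truth. The brute-force sketch below is purely illustrative (score is a hypothetical placeholder); the point of max-margin Markov networks is to make this maximization tractable by exploiting the network structure instead of enumerating labelings.

from itertools import product

def structured_hinge(score, y_true, labels):
    # max over labelings y of [score(y) + Hamming(y, y_true)] - score(y_true);
    # zero exactly when the true labeling wins by a loss-scaled margin
    best = 0.0
    for y in product(labels, repeat=len(y_true)):
        hamming = sum(a != b for a, b in zip(y, y_true))
        best = max(best, score(y) + hamming - score(y_true))
    return best

# e.g. structured_hinge(lambda y: 0.5 * y.count('A'), ('A', 'B'), ['A', 'B'])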
Dimensionality reduction for supervised learning with reproducing kernel Hilbert spaces
- Journal of Machine Learning Research, 2004
"... We propose a novel method of dimensionality reduction for supervised learning problems. Given a regression or classification problem in which we wish to predict a response variable Y from an explanatory variable X, we treat the problem of dimensionality reduction as that of finding a low-dimensional ..."
Abstract
-
Cited by 162 (34 self)
- Add to MetaCart
using covariance operators on reproducing kernel Hilbert spaces. This characterization allows us to derive a contrast function for estimation of the effective subspace. Unlike many conventional methods for dimensionality reduction in supervised learning, the proposed method requires neither assumptions
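One concrete form such a covariance-operator contrast can take, following the kernel dimension reduction literature: an empirical measure of how much of Y's variation is left unexplained by the projection Z = XB, built from centered Gram matrices. The Gaussian kernel and the regularizer eps below are assumptions of this sketch, not details from the abstract.

import numpy as np

def rbf_gram(X, gamma=1.0):
    sq = np.sum(X ** 2, axis=1)
    return np.exp(-gamma * (sq[:, None] + sq[None, :] - 2 * X @ X.T))

def center(G):
    n = G.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return H @ G @ H

def kdr_contrast(X, Y, B, gamma=1.0, eps=1e-3):
    # Smaller means Z = X B explains more of Y; the effective subspace is
    # estimated by minimizing this over B (e.g. with orthonormal columns).
    n = X.shape[0]
    Gz = center(rbf_gram(X @ B, gamma))
    Gy = center(rbf_gram(Y.reshape(n, -1), gamma))
    return np.trace(Gy @ np.linalg.inv(Gz + n * eps * np.eye(n)))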
Kernel partial least squares regression in reproducing kernel Hilbert space
- JOURNAL OF MACHINE LEARNING RESEARCH, 2001
"... A family of regularized least squares regression models in a Reproducing Kernel Hilbert Space is extended by the kernel partial least squares (PLS) regression model. Similar to principal components regression (PCR), PLS is a method based on the projection of input (explanatory) variables to the late ..."
Cited by 154 (10 self)
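Because the latent projections live in the span of the feature-space images of the data, the whole procedure can be carried out with Gram matrices alone. A sketch of a NIPALS-style kernel PLS iteration under that reading (K and Y are assumed centered; convergence checks are omitted for brevity):

import numpy as np

def kernel_pls(K, Y, n_components, n_iter=50):
    # Extract score vectors t that maximize covariance with Y, deflating
    # K and Y after each component so later components are orthogonal.
    n = K.shape[0]
    K, Y = K.copy(), Y.astype(float).reshape(n, -1)
    T = []
    for _ in range(n_components):
        u = Y[:, :1].copy()
        for _ in range(n_iter):
            t = K @ u
            t /= np.linalg.norm(t)
            u = Y @ (Y.T @ t)
            u /= np.linalg.norm(u)
        P = np.eye(n) - t @ t.T          # deflation projector
        K = P @ K @ P
        Y = Y - t @ (t.T @ Y)
        T.append(t)
    return np.hstack(T)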
Consistency of the group lasso and multiple kernel learning
- JOURNAL OF MACHINE LEARNING RESEARCH, 2007
"... We consider the least-square regression problem with regularization by a block 1-norm, i.e., a sum of Euclidean norms over spaces of dimensions larger than one. This problem, referred to as the group Lasso, extends the usual regularization by the 1-norm where all spaces have dimension one, where it ..."
Abstract
-
Cited by 274 (33 self)
- Add to MetaCart
are replaced by functions and reproducing kernel Hilbert norms, the problem is usually referred to as multiple kernel learning and is commonly used for learning from heterogeneous data sources and for non linear variable selection. Using tools from functional analysis, and in particular covariance operators
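The block 1-norm is what produces groupwise sparsity: its proximal operator shrinks each group as a unit and zeroes it out entirely below a threshold, the group analogue of the Lasso's coordinatewise soft-thresholding. A sketch of that operator (the grouping and threshold are illustrative; the paper itself studies consistency rather than algorithms):

import numpy as np

def group_soft_threshold(w, groups, thresh):
    # Proximal operator of thresh * sum_g ||w_g||_2
    out = w.copy()
    for g in groups:                     # g: index array for one block
        norm = np.linalg.norm(w[g])
        out[g] = 0.0 if norm <= thresh else (1 - thresh / norm) * w[g]
    return out

# e.g. group_soft_threshold(w, [np.arange(0, 3), np.arange(3, 7)], 0.5)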
Regularized multi-task learning
- 2004
"... This paper provides a foundation for multi–task learning using reproducing kernel Hilbert spaces of vector–valued functions. In this setting, the kernel is a matrix–valued function. Some explicit examples will be described which go beyond our earlier results in [7]. In particular, we characterize cl ..."
Cited by 277 (2 self)
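A common concrete instance of a matrix-valued kernel is the separable one, K((x, s), (x', t)) = k(x, x') * A[s, t], where the matrix A couples the T tasks. A sketch of vector-valued kernel ridge regression under that assumption (A and lam are illustrative inputs, not the paper's notation):

import numpy as np

def multitask_krr(K_x, A, Y, lam=1e-2):
    # Joint Gram matrix over (example, task) pairs is the Kronecker product
    # of the scalar Gram matrix K_x with the task-coupling matrix A.
    n, T = Y.shape
    G = np.kron(K_x, A)
    c = np.linalg.solve(G + lam * np.eye(n * T), Y.reshape(-1))
    return c.reshape(n, T)               # one coefficient per (example, task)

Setting A to the identity decouples the tasks into T independent ridge regressions; off-diagonal structure in A is what shares information across tasks.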
Learnability in Hilbert spaces with Reproducing Kernels
- Journal of Complexity, 2002
"... We explore the question of learnability of classes of functions contained in a Hilbert space which has a reproducing kernel. We show that if the evaluation functionals are uniformly bounded and if the class is norm bounded then it is learnable. We formulate a learning procedure related to the well k ..."
Cited by 6 (5 self)
A Reproducing Kernel Hilbert Space Framework for Pairwise Time Series Distances
"... A good distance measure for time series needs to properly incorporate the temporal structure, and should be applicable to sequences with unequal lengths. In this paper, we propose a distance measure as a principled solution to the two requirements. Unlike the conventional feature vector representati ..."
Abstract
-
Cited by 11 (0 self)
- Add to MetaCart
representation, our approach represents each time series with a summarizing smooth curve in a reproducing kernel Hilbert space (RKHS), and therefore translate the distance between time series into distances between curves. Moreover we propose to learn the kernel of this RKHS from a population of time series
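A sketch of the general recipe (the smoothing and kernel choices below are illustrative assumptions, and the paper additionally learns the kernel from data): fit each series with kernel ridge regression over its own time stamps, then evaluate the RKHS norm of the difference of the two fitted curves, which needs only Gram matrices and therefore handles unequal lengths.

import numpy as np

def rbf_gram(s, t, gamma=1.0):
    return np.exp(-gamma * (s[:, None] - t[None, :]) ** 2)

def fit_curve(times, values, lam=1e-2, gamma=1.0):
    # Kernel ridge regression: the series becomes a smooth curve
    # f = sum_i a_i k(t_i, .) over its own (possibly irregular) time stamps
    K = rbf_gram(times, times, gamma)
    return np.linalg.solve(K + lam * np.eye(len(times)), values)

def rkhs_distance(t1, v1, t2, v2, lam=1e-2, gamma=1.0):
    # ||f - g||_H^2 = a'K11a - 2 a'K12b + b'K22b, from Gram matrices alone
    a = fit_curve(t1, v1, lam, gamma)
    b = fit_curve(t2, v2, lam, gamma)
    sq = (a @ rbf_gram(t1, t1, gamma) @ a
          - 2 * a @ rbf_gram(t1, t2, gamma) @ b
          + b @ rbf_gram(t2, t2, gamma) @ b)
    return np.sqrt(max(sq, 0.0))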