Results 1–10 of 182
The Kernel Recursive Least Squares Algorithm
IEEE Transactions on Signal Processing, 2003
Cited by 141 (2 self)
"... We present a nonlinear kernel-based version of the Recursive Least Squares (RLS) algorithm. Our Kernel-RLS (KRLS) algorithm performs linear regression in the feature space induced by a Mercer kernel, and can therefore be used to recursively construct the minimum mean-squared error regressor. Sparsity (and ..."
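As a rough illustration of the idea in the snippet above, linear regression in a kernel-induced feature space can be sketched by re-solving the regularized least-squares system as each sample arrives. This is only a naive sketch, not the authors' KRLS algorithm (which maintains a recursive inverse update and a sparsified dictionary); the class name, kernel width, and regularization value are our assumptions.

```python
import numpy as np

def gaussian_kernel(x, z, sigma=1.0):
    # Mercer kernel inducing the feature space; sigma is an assumed width
    return np.exp(-np.sum((np.asarray(x) - np.asarray(z)) ** 2) / (2 * sigma ** 2))

class NaiveKernelRLS:
    """Online kernel least-squares regressor: after each sample, refit the
    regularized least-squares solution over all samples seen so far.
    (True KRLS instead updates the inverse recursively and sparsifies.)"""

    def __init__(self, kernel=gaussian_kernel, lam=1e-2):
        self.kernel, self.lam = kernel, lam
        self.X, self.y, self.alpha = [], [], None

    def update(self, x, y):
        self.X.append(np.asarray(x, dtype=float))
        self.y.append(float(y))
        n = len(self.X)
        K = np.array([[self.kernel(a, b) for b in self.X] for a in self.X])
        # regularized minimum mean-squared-error coefficients of the kernel expansion
        self.alpha = np.linalg.solve(K + self.lam * np.eye(n), np.array(self.y))

    def predict(self, x):
        k = np.array([self.kernel(x, xi) for xi in self.X])
        return float(k @ self.alpha)
```

Samples are fed one at a time via `update`, and `predict` evaluates the current regressor, so the fit improves as the stream progresses.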
Kernel Recursive Least Squares
IEEE Transactions on Signal Processing, 2004
Cited by 13 (0 self)
"... We present a nonlinear kernel-based version of the Recursive Least Squares (RLS) algorithm. Our Kernel-RLS algorithm performs linear regression in the feature space induced by a Mercer kernel, and can therefore be used to recursively construct the minimum mean-squared error regressor. Sparsity (and ..."
Multivariate online anomaly detection using kernel recursive least squares
in Proc. IEEE Infocom, 2007
Cited by 26 (4 self)
"... High-speed backbones are regularly affected by various kinds of network anomalies, ranging from malicious attacks to harmless large data transfers. Different types of anomalies affect the network in different ways, and it is difficult to know a priori how a potential anomaly will exhibit itself in traffic statistics. In this paper we describe an online, sequential anomaly detection algorithm that is suitable for use with multivariate data. The proposed algorithm is based on the kernel version of the recursive least squares algorithm. It assumes no model for network traffic or anomalies ..."
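The prediction-error idea behind this entry can be sketched as follows: fit a one-step-ahead kernel least-squares predictor on a sliding window of recent sample pairs and score each new sample by its prediction residual. This is a naive illustration in the spirit of the abstract, not the authors' algorithm; the window size, kernel width, and regularization are assumed values.

```python
import numpy as np

def rbf(x, z, sigma=1.0):
    # Gaussian kernel between two (possibly multivariate) samples
    return np.exp(-np.sum((x - z) ** 2) / (2 * sigma ** 2))

def anomaly_scores(series, window=20, lam=0.1, sigma=1.0):
    """Score each sample by the error of a one-step-ahead kernel
    least-squares prediction fitted on the preceding `window` pairs."""
    X = np.asarray(series, dtype=float)
    scores = np.full(len(X), np.nan)   # undefined until a full window exists
    for t in range(window + 1, len(X)):
        A = X[t - window - 1:t - 1]    # inputs x_{i-1} over the window
        B = X[t - window:t]            # targets x_i over the window
        K = np.array([[rbf(a, b, sigma) for b in A] for a in A])
        coef = np.linalg.solve(K + lam * np.eye(window), B)
        k = np.array([rbf(X[t - 1], a, sigma) for a in A])
        scores[t] = np.linalg.norm(X[t] - k @ coef)   # residual = anomaly score
    return scores
```

Because the predictor assumes no traffic model, a sample that deviates sharply from the recent dynamics produces a large residual and stands out in the score sequence.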
The Kernel Recursive Least-Squares Algorithm
"... We present a nonlinear version of the recursive least squares (RLS) algorithm. Our algorithm performs linear regression in a high-dimensional feature space induced by a Mercer kernel and can therefore be used to recursively construct minimum mean-squared-error solutions to nonlinear least-squares ..."
Kernel partial least squares regression in reproducing kernel Hilbert space
Journal of Machine Learning Research, 2001
Cited by 154 (10 self)
"... A family of regularized least squares regression models in a Reproducing Kernel Hilbert Space is extended by the kernel partial least squares (PLS) regression model. Similar to principal components regression (PCR), PLS is a method based on the projection of input (explanatory) variables to the latent ..."
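The projection idea in this entry can be illustrated with a minimal single-response kernel PLS sketch: each latent score vector is extracted from the (centered) Gram matrix by a NIPALS-style step, the matrix and response are deflated, and the training fit is the projection of the response onto the span of the scores. Function names and parameters are ours, and centering is left to the caller.

```python
import numpy as np

def rbf_kernel(X, Z, sigma=1.0):
    # Pairwise Gaussian kernel matrix between rows of X and rows of Z
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def kernel_pls_fit(K, y, n_components):
    """NIPALS-style kernel PLS for a single response: extract orthonormal
    latent score vectors from the centered Gram matrix K, deflating K and
    y after each component; return the scores and the training-set fit."""
    Kd, yd = K.copy(), np.asarray(y, dtype=float).copy()
    T = []
    for _ in range(n_components):
        t = Kd @ yd                      # score direction from current residuals
        t /= np.linalg.norm(t)
        T.append(t)
        P = np.eye(len(t)) - np.outer(t, t)
        Kd = P @ Kd @ P                  # deflate the Gram matrix
        yd = P @ yd                      # deflate the response
    T = np.column_stack(T)
    return T, T @ (T.T @ np.asarray(y, dtype=float))  # scores, fitted values
```

With a centered Gram matrix and centered response, a handful of components typically captures most of a smooth target, which is the sense in which PLS projects onto a small set of latent variables.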
Recurrent least squares learning for quasi-parallel principal component analysis
In: Proc. European Symposium on Artificial Neural Networks (ESANN'96), 1996
Cited by 1 (1 self)
"... The recurrent least squares (RLS) learning approach is proposed for controlling the learning rate in parallel principal subspace analysis (PSA) and in a wide class of principal component analysis (PCA) associated algorithms with a quasi-parallel extraction ability. The purpose is to provide ..."
Bootstrapping Particle Filters using Kernel Recursive Least Squares
"... Although particle filters are extremely effective algorithms for object tracking, one of their limitations is a reliance on an accurate model for the object dynamics and observation mechanism. The limitation is circumvented to some extent by the incorporation of parameterized models in the filter, with simultaneous online learning of model parameters, but frequently, identification of an appropriate parametric model is extremely difficult. This paper addresses this problem, describing an algorithm that combines Kernel Recursive Least Squares and particle filtering to learn a functional ..."
Gabor-Based Kernel Partial-Least-Squares Discrimination Features for Face Recognition
2007
Cited by 21 (0 self)
"... The paper presents a novel method for the extraction of facial features based on the Gabor-wavelet representation of face images and the kernel partial-least-squares discrimination (KPLSD) algorithm. The proposed feature-extraction method, called the Gabor-based kernel partial-least-squares ..."
Adaptive Kernel Principal Analysis for Online Feature Extraction
Cited by 1 (0 self)
"... The batch nature limits the standard kernel principal component analysis (KPCA) methods in numerous applications, especially for dynamic or large-scale data. In this paper, an efficient adaptive approach is presented for online extraction of the kernel principal components (KPC). The contri ... and real applications demonstrate that our approach yields improvements in terms of both computational speed and approximation accuracy. Keywords: adaptive method, kernel principal component analysis, online extraction, recursive algorithm."
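The batch bottleneck this entry refers to is visible in standard KPCA itself: the whole n-by-n Gram matrix must be built, centered, and eigendecomposed before any component is available, which is what an online variant avoids. A minimal batch sketch (kernel width assumed, names ours):

```python
import numpy as np

def batch_kpca(X, n_components, sigma=1.0):
    """Standard batch KPCA: eigendecompose the centered Gram matrix.
    The O(n^2) Gram matrix is the cost an online method avoids."""
    n = len(X)
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-d2 / (2 * sigma ** 2))       # Gaussian Gram matrix
    H = np.eye(n) - np.ones((n, n)) / n
    Kc = H @ K @ H                           # center in feature space
    w, V = np.linalg.eigh(Kc)                # eigenvalues in ascending order
    w, V = w[::-1], V[:, ::-1]               # reorder to descending
    # project training points onto the top kernel principal components
    return Kc @ (V[:, :n_components] / np.sqrt(np.abs(w[:n_components])))
```

The returned score columns are mutually orthogonal with squared norms equal to the leading eigenvalues, so their variances decrease with component index.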
Kernel Partial Least Squares is Universally Consistent
2010
Cited by 4 (3 self)
"... We prove the statistical consistency of kernel Partial Least Squares Regression applied to a bounded regression learning problem on a reproducing kernel Hilbert space. Partial Least Squares stands out from well-known classical approaches such as Ridge Regression or Principal Components Regression, as ..."