
Results 1 - 10 of 182

The Kernel Recursive Least Squares Algorithm

by Yaakov Engel, Shie Mannor, Ron Meir - IEEE Transactions on Signal Processing , 2003
"... We present a non-linear kernel-based version of the Recursive Least Squares (RLS) algorithm. Our Kernel-RLS (KRLS) algorithm performs linear regression in the feature space induced by a Mercer kernel, and can therefore be used to recursively construct the minimum mean-squared-error regressor. Spars ..."
Abstract - Cited by 141 (2 self)
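As a rough illustration of the recursive construction this abstract describes, here is a minimal sketch of a kernel RLS update, assuming an RBF kernel and a small ridge term for numerical stability. The approximate-linear-dependence sparsification that makes the published KRLS algorithm practical is omitted, and all names (`KernelRLS`, `rbf`, `reg`) are illustrative, not taken from the paper.

```python
import numpy as np

def rbf(x, y, gamma=1.0):
    """Gaussian (RBF) Mercer kernel."""
    return float(np.exp(-gamma * np.sum((np.asarray(x) - np.asarray(y)) ** 2)))

class KernelRLS:
    """Recursive kernel least squares without sparsification.

    Maintains the inverse of (K + reg * I) via a block-inverse update,
    so each new sample costs O(n^2) instead of a fresh O(n^3) solve."""

    def __init__(self, kernel=rbf, reg=1e-4):
        self.kernel = kernel
        self.reg = reg
        self.X = []        # stored samples
        self.y = []        # stored targets
        self.Kinv = None   # inverse of regularized kernel matrix
        self.alpha = None  # dual weights

    def predict(self, x):
        if not self.X:
            return 0.0
        k = np.array([self.kernel(x, xi) for xi in self.X])
        return float(k @ self.alpha)

    def update(self, x, y):
        if not self.X:
            self.X.append(x)
            self.y = [y]
            self.Kinv = np.array([[1.0 / (self.kernel(x, x) + self.reg)]])
            self.alpha = self.Kinv @ np.array(self.y)
            return
        k = np.array([self.kernel(x, xi) for xi in self.X])
        kxx = self.kernel(x, x) + self.reg
        # Block-inverse update of (K + reg*I)^{-1} for the grown matrix.
        a = self.Kinv @ k
        g = kxx - k @ a            # Schur complement, > 0 since reg > 0
        n = len(self.X)
        Kinv_new = np.zeros((n + 1, n + 1))
        Kinv_new[:n, :n] = self.Kinv + np.outer(a, a) / g
        Kinv_new[:n, n] = -a / g
        Kinv_new[n, :n] = -a / g
        Kinv_new[n, n] = 1.0 / g
        self.Kinv = Kinv_new
        self.X.append(x)
        self.y.append(y)
        self.alpha = self.Kinv @ np.array(self.y)
```

With a small `reg`, the recursion nearly interpolates the training points, matching the batch kernel ridge solution at every step.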

Kernel Recursive Least Squares

by Yaakov Engel, Shie Mannor, Ron Meir - IEEE Transactions on Signal Processing , 2004
"... We present a non-linear kernel-based version of the Recursive Least Squares (RLS) algorithm. Our Kernel-RLS algorithm performs linear regression in the feature space induced by a Mercer kernel, and can therefore be used to recursively construct the minimum mean-squared-error regressor. Sparsity (and ..."
Abstract - Cited by 13 (0 self)

Multivariate online anomaly detection using kernel recursive least squares

by Tarem Ahmed, Mark Coates, Anukool Lakhina - in Proc. IEEE Infocom , 2007
"... Abstract — High-speed backbones are regularly affected by various kinds of network anomalies, ranging from malicious attacks to harmless large data transfers. Different types of anomalies affect the network in different ways, and it is difficult to know a priori how a potential anomaly will exhibit itself in traffic statistics. In this paper we describe an online, sequential anomaly detection algorithm that is suitable for use with multivariate data. The proposed algorithm is based on the kernel version of the recursive least squares algorithm. It assumes no model for network traffic or anomalies ..."
Abstract - Cited by 26 (4 self)
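The model-free flavor of such a detector can be illustrated with the approximate-linear-dependence (ALD) test used in the KRLS literature: a new measurement is flagged when its feature-space image is poorly approximated by the span of previously seen "dictionary" points. This is a hedged sketch only; the `threshold` value, the `rbf` parameters, and the function names are assumptions, not values from the paper.

```python
import numpy as np

def rbf(x, y, gamma=1.0):
    """Gaussian (RBF) kernel."""
    return float(np.exp(-gamma * np.sum((np.asarray(x) - np.asarray(y)) ** 2)))

def ald_novelty(dictionary, Kinv, x, kernel=rbf, threshold=0.1):
    """Approximate-linear-dependence test.

    delta is the squared residual of projecting phi(x) onto the span of
    the dictionary in feature space; a large delta means x is poorly
    explained by past data and is a candidate anomaly.

    dictionary -- list of stored samples
    Kinv       -- inverse of the dictionary's kernel matrix
    """
    k = np.array([kernel(x, d) for d in dictionary])
    a = Kinv @ k                  # projection coefficients
    delta = kernel(x, x) - k @ a  # residual energy, in [0, k(x,x)]
    return delta > threshold, float(delta)
```

A point already in the dictionary yields delta near zero, while a point far from all stored samples yields delta near one (for a normalized kernel), so a single scalar threshold drives the online decision.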

The Kernel Recursive Least-Squares Algorithm

by unknown authors
"... Abstract—We present a nonlinear version of the recursive least squares (RLS) algorithm. Our algorithm performs linear regression in a high-dimensional feature space induced by a Mercer kernel and can therefore be used to recursively construct minimum mean-squared-error solutions to nonlinear least-squares ..."
Abstract

Kernel partial least squares regression in reproducing kernel Hilbert space

by Roman Rosipal, Leonard J. Trejo - JOURNAL OF MACHINE LEARNING RESEARCH , 2001
"... A family of regularized least squares regression models in a Reproducing Kernel Hilbert Space is extended by the kernel partial least squares (PLS) regression model. Similar to principal components regression (PCR), PLS is a method based on the projection of input (explanatory) variables to the late ..."
Abstract - Cited by 154 (10 self)
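The projection-to-latent-structures idea can be sketched in kernel form roughly as follows: score vectors are extracted NIPALS-style from a centered kernel matrix, and both the kernel matrix and the responses are deflated after each component. This is a simplified illustration under those assumptions, not the paper's exact algorithm; the function name and defaults are hypothetical.

```python
import numpy as np

def kernel_pls_components(K, Y, n_components=2, max_iter=500, tol=1e-10):
    """Extract unit-norm latent score vectors from a centered kernel
    matrix K (n x n) and responses Y, NIPALS-style, deflating K and Y
    by each extracted score direction."""
    n = K.shape[0]
    K = np.asarray(K, dtype=float).copy()
    Y = np.asarray(Y, dtype=float).reshape(n, -1).copy()
    T = []
    for _ in range(n_components):
        t = Y[:, :1].copy()
        t /= np.linalg.norm(t)
        for _ in range(max_iter):
            u = Y @ (Y.T @ t)            # weight direction in Y-space
            u /= np.linalg.norm(u)
            t_new = K @ u                # score vector in feature space
            t_new /= np.linalg.norm(t_new)
            if t_new.ravel() @ t.ravel() < 0:
                t_new = -t_new           # resolve NIPALS sign ambiguity
            if np.linalg.norm(t_new - t) < tol:
                t = t_new
                break
            t = t_new
        T.append(t.ravel())
        # Deflate by the extracted score so later components are orthogonal.
        P = np.eye(n) - np.outer(t, t)
        K = P @ K @ P
        Y = P @ Y
    return np.stack(T, axis=1)           # (n, n_components)
```

Because each deflation projects K onto the orthogonal complement of the current score, successive score vectors come out mutually orthogonal, mirroring the PCR-like projection the abstract mentions.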

Recurrent least squares learning for quasi-parallel principal component analysis

by W Kasprzak, Andrzej Cichocki - In: Proc. European Symposium on Artificial Neural Networks ESANN'96 , 1996
"... Abstract. The recurrent least squares (RLS) learning approach is proposed for controlling the learning rate in parallel principal subspace analysis (PSA) and in a wide class of principal component analysis (PCA) associated algorithms with a quasi-parallel extraction ability. The purpose is to provid ..."
Abstract - Cited by 1 (1 self)

Bootstrapping Particle Filters using Kernel Recursive Least Squares

by Boris Oreshkin, Mark Coates
"... Abstract—Although particle filters are extremely effective algorithms for object tracking, one of their limitations is a reliance on an accurate model for the object dynamics and observation mechanism. The limitation is circumvented to some extent by the incorporation of parameterized models in the filter, with simultaneous on-line learning of model parameters, but frequently, identification of an appropriate parametric model is extremely difficult. This paper addresses this problem, describing an algorithm that combines Kernel Recursive Least Squares and particle filtering to learn a functional ..."
Abstract

Gabor-Based Kernel Partial-Least-Squares Discrimination Features for Face Recognition

by Vitomir Štruc, Nikola Pavešić , 2007
"... Abstract. The paper presents a novel method for the extraction of facial features based on the Gabor-wavelet representation of face images and the kernel partial-least-squares discrimination (KPLSD) algorithm. The proposed feature-extraction method, called the Gabor-based kernel partial-least-squares ..."
Abstract - Cited by 21 (0 self)

Adaptive Kernel Principal Analysis for Online Feature Extraction

by Mingtao Ding, Zheng Tian, Haixia Xu
"... Abstract—The batch nature limits the standard kernel principal component analysis (KPCA) methods in numerous applications, especially for dynamic or large-scale data. In this paper, an efficient adaptive approach is presented for online extraction of the kernel principal components (KPC). The contri ..."
Abstract - Cited by 1 (0 self)
... and real applications demonstrate that our approach yields improvements in terms of both computational speed and approximation accuracy. Keywords—adaptive method, kernel principal component analysis, online extraction, recursive algorithm.

Kernel Partial Least Squares is Universally Consistent

by Gilles Blanchard, Nicole Krämer , 2010
"... We prove the statistical consistency of kernel Partial Least Squares Regression applied to a bounded regression learning problem on a reproducing kernel Hilbert space. Partial Least Squares stands out of well-known classical approaches as e.g. Ridge Regression or Principal Components Regression, as ..."
Abstract - Cited by 4 (3 self)

Developed at and hosted by The College of Information Sciences and Technology

© 2007-2019 The Pennsylvania State University