## The Kernel Recursive Least Squares Algorithm (2003)


Venue: IEEE Transactions on Signal Processing

Citations: 61 (2 self)

### BibTeX

```bibtex
@ARTICLE{Engel03thekernel,
  author  = {Yaakov Engel and Shie Mannor and Ron Meir},
  title   = {The Kernel Recursive Least Squares Algorithm},
  journal = {IEEE Transactions on Signal Processing},
  year    = {2003},
  volume  = {52},
  pages   = {2275--2285}
}
```


### Abstract

We present a non-linear kernel-based version of the Recursive Least Squares (RLS) algorithm. Our Kernel-RLS (KRLS) algorithm performs linear regression in the feature space induced by a Mercer kernel, and can therefore be used to recursively construct the minimum mean squared-error regressor. Sparsity of the solution is achieved by a sequential sparsification process that admits a new input sample into the kernel representation only if its feature space image cannot be sufficiently well approximated by combining the images of previously admitted samples. This sparsification procedure is crucial to the operation of KRLS: it allows the algorithm to operate on-line and effectively regularizes its solutions. A theoretical analysis of the sparsification method reveals its close affinity to kernel PCA, and a data-dependent loss bound is presented, quantifying the generalization performance of the KRLS algorithm. We demonstrate the performance and scaling properties of KRLS and compare it to a state-of-the-art Support Vector Regression algorithm, using both synthetic and real data. We additionally test KRLS on two signal processing problems in which the use of traditional least-squares methods is commonplace: time-series prediction and channel equalization.
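The recursion described in the abstract can be sketched in NumPy. This is a minimal illustration, not the authors' reference implementation: the Gaussian kernel, its bandwidth `gamma`, and the ALD threshold `nu` are assumptions chosen for the example. A new sample is admitted to the dictionary only when its approximate-linear-dependence (ALD) residual `delta` exceeds `nu`; otherwise only the expansion coefficients are updated.

```python
import numpy as np

def rbf(x, y, gamma=1.0):
    # Gaussian (RBF) Mercer kernel; gamma is an assumed bandwidth
    return float(np.exp(-gamma * np.sum((x - y) ** 2)))

class KRLS:
    """Sketch of kernel RLS with ALD-based sequential sparsification."""
    def __init__(self, kernel=rbf, nu=1e-2):
        self.kernel, self.nu = kernel, nu
        self.dict_ = []    # admitted input samples (the dictionary)
        self.Kinv = None   # inverse of the dictionary kernel matrix
        self.alpha = None  # kernel expansion coefficients
        self.P = None      # covariance-like matrix of the reduced problem

    def predict(self, x):
        if not self.dict_:
            return 0.0
        k = np.array([self.kernel(xi, x) for xi in self.dict_])
        return float(k @ self.alpha)

    def update(self, x, y):
        if not self.dict_:
            k00 = self.kernel(x, x)
            self.dict_ = [x]
            self.Kinv = np.array([[1.0 / k00]])
            self.alpha = np.array([y / k00])
            self.P = np.array([[1.0]])
            return
        k = np.array([self.kernel(xi, x) for xi in self.dict_])
        a = self.Kinv @ k
        delta = self.kernel(x, x) - k @ a   # ALD residual
        err = y - k @ self.alpha            # prediction error on (x, y)
        if delta > self.nu:
            # x is not well approximated: admit it and grow all matrices
            m = len(self.dict_)
            self.dict_.append(x)
            Kinv = np.zeros((m + 1, m + 1))
            Kinv[:m, :m] = delta * self.Kinv + np.outer(a, a)
            Kinv[:m, m] = -a
            Kinv[m, :m] = -a
            Kinv[m, m] = 1.0
            self.Kinv = Kinv / delta
            P = np.zeros((m + 1, m + 1))
            P[:m, :m] = self.P
            P[m, m] = 1.0
            self.P = P
            self.alpha = np.append(self.alpha - a * err / delta, err / delta)
        else:
            # x is (nearly) dependent on the dictionary: RLS-style
            # coefficient update without growing the representation
            q = self.P @ a / (1.0 + a @ self.P @ a)
            self.P = self.P - np.outer(q, a @ self.P)
            self.alpha = self.alpha + (self.Kinv @ q) * err
```

As a quick check, streaming noiseless samples of `sin(x)` through the model keeps the dictionary far smaller than the number of samples while still fitting the function, which is the sparsification behaviour the abstract describes.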