Results 1–10 of 875,645

Bayesian Extensions of Kernel Least Mean Squares
"... The kernel least mean squares (KLMS) algorithm is a computationally efficient nonlinear adaptive filtering method that “kernelizes” the celebrated (linear) least mean squares algorithm. We demonstrate that the least mean squares algorithm is closely related to Kalman filtering, and thus ..."
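The KLMS recursion this abstract refers to can be sketched briefly. The Gaussian kernel, the step size of 0.5, and the synthetic sine data below are illustrative assumptions, not details from the paper; the sketch grows the kernel expansion by one centre per sample, weighted by the step size times the prediction error:

```python
import numpy as np

def gaussian_kernel(x, y, sigma=1.0):
    # Gaussian (RBF) kernel between two input vectors.
    return np.exp(-np.sum((x - y) ** 2) / (2 * sigma ** 2))

def klms(inputs, targets, eta=0.5, sigma=1.0):
    """Kernel LMS: predict with the current expansion, then add the new
    input as a centre weighted by eta times the prediction error."""
    centres, weights, errors = [], [], []
    for u, d in zip(inputs, targets):
        y = sum(w * gaussian_kernel(c, u, sigma) for c, w in zip(centres, weights))
        e = d - y
        centres.append(u)
        weights.append(eta * e)
        errors.append(e)
    return centres, weights, errors

# Usage: learn y = sin(x) online from noisy samples.
rng = np.random.default_rng(0)
x = rng.uniform(-3, 3, size=200)
d = np.sin(x) + 0.05 * rng.standard_normal(200)
inputs = [np.array([xi]) for xi in x]
centres, weights, errors = klms(inputs, d)
```

The per-sample errors shrink as the filter adapts; note that, as in the paper's setting, the expansion grows by one term per sample unless a sparsification rule is added.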
Image denoising using a scale mixture of Gaussians in the wavelet domain
IEEE Trans. Image Processing, 2003
Cited by 512 (17 self)
"... We describe a method for removing noise from digital images, based on a statistical model of the coefficients of an overcomplete multiscale oriented basis. Neighborhoods of coefficients at adjacent positions and scales are modeled as the product of two independent random variables: a Gaussian vector and a hidden positive scalar multiplier. The latter modulates the local variance of the coefficients in the neighborhood, and is thus able to account for the empirically observed correlation between the coefficient amplitudes. Under this model, the Bayesian least squares estimate of each ..."
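The generative model in this snippet (a Gaussian vector times a hidden positive scalar multiplier) is easy to simulate, and the simulation shows the property the abstract highlights: raw coefficients stay uncorrelated while their amplitudes become correlated. The log-normal choice for the multiplier below is an illustrative assumption, since the snippet does not fix its distribution:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
# Hidden positive scalar multiplier (log-normal here), shared by a 2-D neighbourhood.
z = np.exp(rng.standard_normal(n))
u = rng.standard_normal((n, 2))        # independent Gaussian vector
x = np.sqrt(z)[:, None] * u            # Gaussian scale mixture coefficients

# Raw coefficients are uncorrelated, but the shared multiplier induces
# correlation between their amplitudes.
corr_raw = np.corrcoef(x[:, 0], x[:, 1])[0, 1]
corr_amp = np.corrcoef(np.abs(x[:, 0]), np.abs(x[:, 1]))[0, 1]
```

The marginals are also heavier-tailed than a Gaussian, which is the other empirical property of wavelet coefficients the scale-mixture model is meant to capture.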
Manifold regularization: A geometric framework for learning from labeled and unlabeled examples
Journal of Machine Learning Research, 2006
Cited by 576 (16 self)
"... We propose a family of learning algorithms based on a new form of regularization that allows us to exploit the geometry of the marginal distribution. We focus on a semi-supervised framework that incorporates labeled and unlabeled data in a general-purpose learner. Some transductive graph learning algorithms and standard methods including Support Vector Machines and Regularized Least Squares can be obtained as special cases. We utilize properties of Reproducing Kernel Hilbert Spaces to prove new Representer theorems that provide a theoretical basis for the algorithms. As a result (in contrast to purely ..."
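The graph-Laplacian smoothness penalty at the heart of this framework can be illustrated on a bare graph, without the RKHS machinery of the paper. The two-cluster data, the Gaussian edge weights, and the purely transductive objective below are assumptions made for the sketch, not the paper's full algorithm:

```python
import numpy as np

rng = np.random.default_rng(1)
# Two clusters on a line; only one labelled point per cluster.
x = np.concatenate([rng.normal(-2, 0.3, 20), rng.normal(2, 0.3, 20)])
y = np.zeros(40)
y[0], y[20] = -1.0, 1.0

# Gaussian-weighted similarity graph and its (unnormalised) Laplacian.
W = np.exp(-(x[:, None] - x[None, :]) ** 2 / (2 * 0.5 ** 2))
L = np.diag(W.sum(axis=1)) - W

# Minimise  sum over labelled (f_i - y_i)^2 + gamma * f^T L f,
# whose stationarity condition is (J + gamma L) f = J y.
J = np.zeros((40, 40))
J[0, 0] = J[20, 20] = 1.0
gamma = 0.1
f = np.linalg.solve(J + gamma * L, J @ y)
```

Because the penalty forces f to vary slowly along the graph, the unlabelled points inherit the label of the single labelled point in their cluster, which is the geometric intuition the abstract appeals to.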
The Kernel Recursive Least Squares Algorithm
IEEE Transactions on Signal Processing, 2003
Cited by 141 (2 self)
"... We present a nonlinear kernel-based version of the Recursive Least Squares (RLS) algorithm. Our Kernel-RLS (KRLS) algorithm performs linear regression in the feature space induced by a Mercer kernel, and can therefore be used to recursively construct the minimum mean squared error regressor. Sparsity ..."
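The recursive construction mentioned in these KRLS entries can be sketched without the sparsification step (which is the paper's key contribution and is omitted here): grow the inverse of the regularised kernel matrix one sample at a time via a block-inverse update. The Gaussian kernel and the ridge term `lam` are assumptions of the sketch:

```python
import numpy as np

def gauss_k(x, y, sigma=1.0):
    # Gaussian (RBF) Mercer kernel.
    return np.exp(-np.sum((x - y) ** 2) / (2 * sigma ** 2))

class KRLS:
    """Kernel RLS without sparsification: recursively grows the inverse of
    the regularised kernel matrix K + lam*I and keeps alpha = (K + lam*I)^-1 y."""

    def __init__(self, lam=1e-2, sigma=1.0):
        self.lam, self.sigma = lam, sigma
        self.centres, self.targets = [], []
        self.Kinv = np.empty((0, 0))
        self.alpha = np.empty(0)

    def predict(self, u):
        if not self.centres:
            return 0.0
        k = np.array([gauss_k(c, u, self.sigma) for c in self.centres])
        return float(k @ self.alpha)

    def update(self, u, d):
        k = np.array([gauss_k(c, u, self.sigma) for c in self.centres])
        k_uu = gauss_k(u, u, self.sigma) + self.lam
        if self.centres:
            z = self.Kinv @ k
            r = k_uu - k @ z              # Schur complement of the grown matrix
            n = len(self.centres)
            Kinv = np.empty((n + 1, n + 1))
            Kinv[:n, :n] = self.Kinv + np.outer(z, z) / r
            Kinv[:n, n] = Kinv[n, :n] = -z / r
            Kinv[n, n] = 1.0 / r
            self.Kinv = Kinv
        else:
            self.Kinv = np.array([[1.0 / k_uu]])
        self.centres.append(u)
        self.targets.append(d)
        self.alpha = self.Kinv @ np.asarray(self.targets)

# Usage: regress y = sin(x) online from noiseless samples.
rng = np.random.default_rng(0)
model = KRLS()
for xi in rng.uniform(-3, 3, 100):
    model.update(np.array([xi]), float(np.sin(xi)))
err = abs(model.predict(np.array([0.5])) - np.sin(0.5))
```

Without sparsification each update costs O(n^2) and the dictionary grows with every sample, which is exactly the issue the papers' sparsification addresses.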
The Kernel Recursive Least-Squares Algorithm
"... We present a nonlinear version of the recursive least squares (RLS) algorithm. Our algorithm performs linear regression in a high-dimensional feature space induced by a Mercer kernel and can therefore be used to recursively construct minimum mean-squared-error solutions to nonlinear least-squares ..."
Kernel Recursive Least Squares
IEEE Transactions on Signal Processing, 2004
Cited by 13 (0 self)
"... We present a nonlinear kernel-based version of the Recursive Least Squares (RLS) algorithm. Our Kernel-RLS algorithm performs linear regression in the feature space induced by a Mercer kernel, and can therefore be used to recursively construct the minimum mean-squared error regressor. Sparsity (and ..."
The quaternion kernel least squares
In Proc. of ICASSP, 2013
Cited by 3 (2 self)
"... The quaternion kernel least squares algorithm (QKLS) is introduced as a generic kernel framework for the estimation of multivariate quaternion-valued signals. This is achieved based on the concepts of quaternion inner product and quaternion positive definiteness, allowing us to define quaternion kernel regression. Next, the least squares solution is derived using the recently introduced HR calculus. We also show that QKLS is a generic extension of standard kernel least squares, and their equivalence is established for real-valued kernels. The superiority of the quaternion-valued linear kernel ..."
Complex Kernel Least Mean Square
"... • A unifying approach to the class of Kernel LMS (KLMS) algorithms • Introduction to complex-valued kernel adaptive learning • Augmented (widely linear) extension of the KLMS (ACKLMS) • Application in two-dimensional wind prediction ..."