Results 1–10 of 914
Bayesian Extensions of Kernel Least Mean Squares
"... Abstract—The kernel least mean squares (KLMS) algorithm is a computationally efficient nonlinear adaptive filtering method that “kernelizes” the celebrated (linear) least mean squares algorithm. We demonstrate that the least mean squares algorithm is closely related to Kalman filtering, and thu ..."
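As background for the KLMS entries in this result list: KLMS runs the LMS recursion in a kernel-induced feature space, which in practice means each new sample contributes one kernel center weighted by its step-scaled prediction error. A minimal sketch follows, assuming a Gaussian kernel and a fixed step size; the function name and parameters are illustrative, not taken from any of the listed papers.

```python
import numpy as np

def klms(x, d, eta=0.2, sigma=1.0):
    """Kernel LMS sketch: grows a radial-basis expansion online.

    x : (N, m) array of input vectors, d : (N,) desired outputs.
    Returns the prediction made *before* each update (prior estimates).
    """
    centers, alphas, preds = [], [], []
    for xn, dn in zip(x, d):
        # Evaluate the current expansion f(x_n) = sum_i alpha_i k(c_i, x_n).
        if centers:
            kv = np.exp(-np.sum((np.array(centers) - xn) ** 2, axis=1)
                        / (2 * sigma ** 2))
            y = float(np.dot(alphas, kv))
        else:
            y = 0.0
        e = dn - y                 # prior prediction error
        centers.append(xn)         # new sample becomes a kernel center
        alphas.append(eta * e)     # LMS-style coefficient for that center
        preds.append(y)
    return np.array(preds)
```

Note the characteristic trade-off visible in the code: the expansion grows by one term per sample, which is why the sparsified and quantized variants in the later entries exist.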
Image denoising using a scale mixture of Gaussians in the wavelet domain
IEEE Trans. Image Processing, 2003
"... We describe a method for removing noise from digital images, based on a statistical model of the coefficients of an overcomplete multiscale oriented basis. Neighborhoods of coefficients at adjacent positions and scales are modeled as the product of two independent random variables: a Gaussian vector and a hidden positive scalar multiplier. The latter modulates the local variance of the coefficients in the neighborhood, and is thus able to account for the empirically observed correlation between the coefficient amplitudes. Under this model, the Bayesian least squares estimate of each ..."
Cited by 513 (17 self)
Manifold regularization: A geometric framework for learning from labeled and unlabeled examples
Journal of Machine Learning Research, 2006
"... We propose a family of learning algorithms based on a new form of regularization that allows us to exploit the geometry of the marginal distribution. We focus on a semi-supervised framework that incorporates labeled and unlabeled data in a general-purpose learner. Some transductive graph learning algorithms and standard methods including Support Vector Machines and Regularized Least Squares can be obtained as special cases. We utilize properties of Reproducing Kernel Hilbert spaces to prove new Representer theorems that provide a theoretical basis for the algorithms. As a result (in contrast to purely ..."
Cited by 578 (16 self)
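One concrete instance of the framework in this entry is Laplacian-regularized least squares: ordinary kernel ridge regression plus a graph-Laplacian penalty built from labeled and unlabeled points together. The sketch below makes several simplifying assumptions that are mine, not the paper's: a Gaussian kernel, a fully connected graph with Gaussian edge weights, the unnormalized Laplacian, and illustrative regularization constants.

```python
import numpy as np

def laprls(X, y, labeled, lam_a=1e-3, lam_i=1e-2, sigma=1.0):
    """Laplacian RLS sketch: ridge regression with a graph-smoothness
    penalty over all points; only `labeled` indices carry targets y."""
    n = len(X)
    D2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = np.exp(-D2 / (2 * sigma ** 2))       # Gram matrix on all points
    W = np.exp(-D2 / (2 * sigma ** 2))       # graph weights (full graph)
    L = np.diag(W.sum(axis=1)) - W           # unnormalized graph Laplacian
    J = np.zeros((n, n))
    J[labeled, labeled] = 1.0                # selects labeled rows
    yv = np.zeros(n)
    yv[labeled] = y
    l = len(labeled)
    # Representer-theorem form: f(x) = sum_i alpha_i k(x_i, x), with
    # alpha solving (J K + lam_a l I + (lam_i l / n^2) L K) alpha = J y.
    A = J @ K + lam_a * l * np.eye(n) + (lam_i * l / n ** 2) * (L @ K)
    alpha = np.linalg.solve(A, yv)
    return K @ alpha                         # f evaluated at all n points
```

With only a handful of labels per cluster, the Laplacian term propagates the labels along the data geometry, which is the semi-supervised effect the abstract describes.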
The Kernel Recursive Least Squares Algorithm
IEEE Transactions on Signal Processing, 2003
"... We present a nonlinear kernel-based version of the Recursive Least Squares (RLS) algorithm. Our Kernel-RLS (KRLS) algorithm performs linear regression in the feature space induced by a Mercer kernel, and can therefore be used to recursively construct the minimum mean squared error regressor. Spars ..."
Cited by 141 (2 self)
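For orientation, the KRLS recursion can be sketched without the paper's sparsification step: the regularized Gram-matrix inverse grows by one row and column per sample via a block-inverse identity, and the coefficient vector is updated from the prior error. The Gaussian kernel and the ridge term `lam` below are illustrative assumptions, not the paper's exact setup.

```python
import numpy as np

def krls(x, d, lam=1e-2, sigma=1.0):
    """Exact (unsparsified) kernel RLS sketch.

    Maintains P = (K + lam*I)^{-1} over the samples seen so far and
    updates it recursively with a block matrix-inverse identity.
    Returns the prediction made before each update."""
    def k(a, b):
        return np.exp(-np.sum((a - b) ** 2) / (2 * sigma ** 2))

    x0, d0 = x[0], d[0]
    P = np.array([[1.0 / (k(x0, x0) + lam)]])
    alpha = P[0, 0] * np.array([d0])
    dict_x = [x0]
    preds = [0.0]                              # no prior info for sample 0
    for xn, dn in zip(x[1:], d[1:]):
        kv = np.array([k(xi, xn) for xi in dict_x])
        y = float(kv @ alpha)                  # prior prediction
        preds.append(y)
        ktt = k(xn, xn) + lam
        z = P @ kv
        r = ktt - kv @ z                       # Schur complement (> 0)
        # Block-inverse update of (K + lam*I)^{-1} for the enlarged K.
        P = np.block([[P * r + np.outer(z, z), -z[:, None]],
                      [-z[None, :], np.ones((1, 1))]]) / r
        e = dn - y
        alpha = np.concatenate([alpha - z * e / r, [e / r]])
        dict_x.append(xn)
    return np.array(preds)
```

The O(n^2) cost per step of this exact recursion is precisely what the sparsification discussed in the abstract (truncated here) is designed to control.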
2. Complex Kernel Least Mean Square
"... • A unifying approach to the class of Kernel LMS (KLMS) algorithm • Introduction to complexvalued kernel adaptive learning • Augmented (widely linear) extension of the KLMS (ACKLMS) • Application in twodimensional wind prediction ..."
Abstract
 Add to MetaCart
• A unifying approach to the class of Kernel LMS (KLMS) algorithm • Introduction to complexvalued kernel adaptive learning • Augmented (widely linear) extension of the KLMS (ACKLMS) • Application in twodimensional wind prediction
Mixture Kernel Least Mean Square
"... Abstract—Instead of using single kernel, different approaches of using multiple kernels have been proposed recently in kernel learning literature, one of which is multiple kernel learning (MKL). In this paper, we propose an alternative to MKL in order to select the appropriate kernel given a pool of ..."
Abstract

Cited by 3 (1 self)
 Add to MetaCart
mixture of models. We propose mixture kernel least mean square (MxKLMS) adaptive filtering algorithm, where the kernel least mean square (KLMS) filters learned with different kernels, act in parallel at each input instance and are competitively combined such that the filter with the best kernel
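The parallel-filters-plus-competitive-combination idea in this entry can be sketched directly: run one KLMS filter per candidate kernel width and gate their outputs by recent performance. The gating rule below (multiplicative reweighting by instantaneous squared error, softmax-style) is one plausible competitive scheme chosen for illustration, not necessarily the MxKLMS rule from the paper; kernel widths, `eta`, and `beta` are likewise assumptions.

```python
import numpy as np

def mixture_klms(x, d, sigmas=(0.2, 1.0, 5.0), eta=0.2, beta=4.0):
    """Sketch of a mixture-of-KLMS filter bank.

    One Gaussian-kernel KLMS filter per width in `sigmas` runs in
    parallel; their outputs are combined with weights that drift
    toward the currently best-performing filter."""
    M = len(sigmas)
    centers = [[] for _ in range(M)]
    alphas = [[] for _ in range(M)]
    w = np.full(M, 1.0 / M)            # combination weights, sum to 1
    preds = []
    for xn, dn in zip(x, d):
        ys = np.zeros(M)
        for j, s in enumerate(sigmas):
            if centers[j]:
                kv = np.exp(-np.sum((np.array(centers[j]) - xn) ** 2,
                                    axis=1) / (2 * s ** 2))
                ys[j] = np.dot(alphas[j], kv)
        preds.append(float(w @ ys))    # gated, combined prediction
        errs = dn - ys
        # Each filter performs its own KLMS update...
        for j in range(M):
            centers[j].append(xn)
            alphas[j].append(eta * errs[j])
        # ...and the gate shifts weight toward the smaller-error filter.
        w = w * np.exp(-beta * errs ** 2)
        w /= w.sum()
    return np.array(preds)
```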
The quaternion kernel least squares
In Proc. of ICASSP, 2013
"... The quaternion kernel least squares algorithm (QKLS) is introduced as a generic kernel framework for the estimation of multivariate quaternion-valued signals. This is achieved based on the concepts of quaternion inner product and quaternion positive definiteness, allowing us to define quaternion kernel regression. Next, the least squares solution is derived using the recently introduced HR calculus. We also show that QKLS is a generic extension of standard kernel least squares, and their equivalence is established for real-valued kernels. The superiority of the quaternion-valued linear kernel ..."
Cited by 3 (2 self)
Quantized Mixture Kernel Least Mean Square
"... Abstract—Use of multiple kernels in the conventional kernel algorithms is gaining much popularity as it addresses the kernel selection problem as well as improves the performance. Kernel least mean square (KLMS) has been extended to multiple kernels recently using different approaches, one of which ..."
Abstract
 Add to MetaCart
Abstract—Use of multiple kernels in the conventional kernel algorithms is gaining much popularity as it addresses the kernel selection problem as well as improves the performance. Kernel least mean square (KLMS) has been extended to multiple kernels recently using different approaches, one of which
The Kernel Recursive LeastSquares Algorithm
"... Abstract—We present a nonlinear version of the recursive least squares (RLS) algorithm. Our algorithm performs linear regression in a highdimensional feature space induced by a Mercer kernel and can therefore be used to recursively construct minimum meansquarederror solutions to nonlinear leasts ..."
Abstract
 Add to MetaCart
Abstract—We present a nonlinear version of the recursive least squares (RLS) algorithm. Our algorithm performs linear regression in a highdimensional feature space induced by a Mercer kernel and can therefore be used to recursively construct minimum meansquarederror solutions to nonlinear leastsquares