Results 1 – 6 of 6
Kernel partial least squares regression in reproducing kernel Hilbert space
Journal of Machine Learning Research, 2001
Abstract

Cited by 106 (5 self)
A family of regularized least squares regression models in a Reproducing Kernel Hilbert Space is extended by the kernel partial least squares (PLS) regression model. Similar to principal components regression (PCR), PLS is a method based on the projection of input (explanatory) variables to the latent variables (components). However, in contrast to PCR, PLS creates the components by modeling the relationship between input and output variables while maintaining most of the information in the input variables. PLS is useful in situations where the number of explanatory variables exceeds the number of observations and/or a high level of multicollinearity among those variables is assumed. Motivated by this fact we will provide a kernel PLS algorithm for construction of nonlinear regression models in possibly high-dimensional feature spaces. We give the theoretical description of the kernel PLS algorithm and we experimentally compare the algorithm with the existing kernel PCR and kernel ridge regression techniques. We will demonstrate that on the data sets employed kernel PLS achieves the same results as kernel PCR but uses significantly fewer, qualitatively different components.
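The projection idea this abstract describes can be illustrated with a minimal numpy sketch. Note this is plain linear PLS1 via the standard NIPALS iteration for a single response, not the paper's kernel variant; the data, component count, and function name are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy data: 10 observations, 20 explanatory variables (p > n, the
# situation the abstract says PLS is suited to)
X = rng.standard_normal((10, 20))
y = X[:, :3].sum(axis=1) + 0.1 * rng.standard_normal(10)

def pls1_components(X, y, k):
    """Extract k PLS latent variables (NIPALS-style, single response)."""
    X, y = X - X.mean(axis=0), y - y.mean()
    scores = []
    for _ in range(k):
        w = X.T @ y                       # weight from X-y covariance:
        w /= np.linalg.norm(w)            # this is what makes PLS use y,
        t = X @ w                         # unlike PCR's variance-only axes
        p = X.T @ t / (t @ t)             # X loading for this component
        X = X - np.outer(t, p)            # deflate X
        y = y - t * (t @ y) / (t @ t)     # deflate y
        scores.append(t)
    return np.column_stack(scores)

T = pls1_components(X, y, 2)
print(T.shape)  # (10, 2): two latent components for 10 observations
```

Replacing the covariance-driven weight `X.T @ y` with the leading eigenvector of `X.T @ X` would recover PCR components, which is why PCR typically needs more components to match PLS's fit.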
Multiclass cancer classification by total principal component regression (TPCR) using microarray gene expression data, 2005
Fast Cross-Validation of High-Breakdown Resampling Methods for PCA, 2006
Abstract

Cited by 3 (2 self)
Cross-validation (CV) is a very popular technique for model selection and model validation. The general procedure of leave-one-out CV is to exclude one observation from the data set, to construct the fit of the remaining observations and to evaluate that fit on the item that was left out. In classical procedures such as least-squares regression or kernel density estimation, easy formulas can be derived to compute this cross-validated fit or the residuals of the removed observations. However, when high-breakdown resampling algorithms are used, it is no longer possible to derive such closed-form expressions. High-breakdown methods are developed to obtain estimates that can withstand the effects of outlying observations. Fast algorithms are presented for leave-one-out CV when using a high-breakdown method based on resampling, in the context of robust covariance estimation by means of the MCD estimator and robust Principal Component Analysis. A robust PRESS curve is introduced as an exploratory tool to select the number of principal components. Simulation results and applications on real data show the accuracy and the gain in computation time of these fast cross-validation algorithms.
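The "easy formulas" the abstract mentions for classical procedures can be made concrete with a small sketch: for ordinary least squares, the leave-one-out residual is e_i / (1 - h_ii), where h_ii is a diagonal entry of the hat matrix, so no refitting loop is needed. The data here are synthetic and illustrative; the robust high-breakdown setting the paper treats has no such shortcut.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((30, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.standard_normal(30)

def loo_naive(X, y):
    """Naive leave-one-out CV: refit the regression n times."""
    n = len(y)
    res = np.empty(n)
    for i in range(n):
        keep = np.arange(n) != i
        beta = np.linalg.lstsq(X[keep], y[keep], rcond=None)[0]
        res[i] = y[i] - X[i] @ beta
    return res

def loo_closed_form(X, y):
    """Closed form for OLS: e_i / (1 - h_ii) from the hat matrix."""
    H = X @ np.linalg.solve(X.T @ X, X.T)   # hat (projection) matrix
    e = y - H @ y                            # ordinary residuals
    return e / (1.0 - np.diag(H))

agree = np.allclose(loo_naive(X, y), loo_closed_form(X, y))
print(agree)  # True: one matrix solve replaces n refits
```

The closed form costs a single fit instead of n, which is exactly the speedup that breaks down for resampling-based robust estimators such as the MCD.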
Applications of integrated sensing and processing in spectroscopic imaging and sensing
Abstract

Cited by 1 (0 self)
Integrated sensing and processing (ISP) encompasses the use of optical computing and adapted excitation signals to physically implement chemometric calculations in spectroscopic sensors for imaging. As data sets become larger and more complex with each emerging generation of hyperspectral imagers, the 'pixel-to-pupil' ratio increases at a rate faster than computing power can accommodate. In response to the need for faster and more efficient methods of processing, many analog solutions to the problem of high data dimensionality have emerged. The successful development of ISP has strong implications for military imaging, biosensing, spectroscopic imaging, and pharmaceutical process analytical technology (PAT). ISP developments in spectroscopy and PAT have emerged as alternatives to conventional Fourier transform infrared (FTIR), near-infrared (NIR), IR, UV–visible, fluorescence, Raman, and acoustic-resonance spectrometry (ARS). Flourishing applications of ISP have demonstrated predictive ability equivalent to conventional approaches for sample differentiation and analyte quantification, in only a fraction of the time required for …
unknown title, 2004
Abstract
Multiclass cancer classification by total principal component regression (TPCR) using microarray gene expression data