Results 1–10 of 235
Local Regularized Least-Square Dimensionality Reduction
"... In this paper, we propose a new nonlinear dimensionality reduction algorithm by adopting a regularized least-square criterion on local areas of the data distribution. We first propose a local linear model to describe the characteristic of the low-dimensional coordinates of the neighborhood centered in ..."
Cited by 1 (0 self)
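The abstract above describes fitting a regularized least-square model on a local neighborhood of the data. A rough, hypothetical sketch of that general idea (not the paper's algorithm; function names, the neighborhood size `k`, and the ridge parameter `alpha` are all assumptions):

```python
import numpy as np

def local_ridge_predict(X, y, x_query, k=8, alpha=0.1):
    """Fit a ridge (regularized least-square) model on only the k nearest
    neighbors of x_query, then predict at x_query."""
    idx = np.argsort(np.linalg.norm(X - x_query, axis=1))[:k]
    Xk = np.hstack([X[idx], np.ones((k, 1))])      # add intercept column
    A = Xk.T @ Xk + alpha * np.eye(Xk.shape[1])    # regularized normal matrix
    w = np.linalg.solve(A, Xk.T @ y[idx])
    return np.append(x_query, 1.0) @ w

# Toy grid data with a nonlinear target that no single global linear
# model could capture, but that is nearly linear on small neighborhoods.
g = np.linspace(-1, 1, 11)
X = np.array([[a, b] for a in g for b in g])
y = np.sin(3 * X[:, 0]) + X[:, 1] ** 2
pred = local_ridge_predict(X, y, np.array([0.2, 0.3]))
```

Because the surface is approximately linear over the 8-neighbor patch, the local fit lands near the true value sin(0.6) + 0.09.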
Incremental Online Learning in High Dimensions
 Neural Computation, 2005
"... Locally weighted projection regression (LWPR) is a new algorithm for incremental nonlinear function approximation in high dimensional spaces with redundant and irrelevant input dimensions. At its core, it employs nonparametric regression with locally linear models. In order to stay computationally e ..."
Cited by 164 (19 self)
with a large number of (possibly redundant) inputs, as shown in various empirical evaluations with up to 90-dimensional data sets. For a probabilistic interpretation, predictive variance and confidence intervals are derived. To our knowledge, LWPR is the first truly incremental spatially localized
Least square incremental linear discriminant analysis
 In Proceedings of the 2009 Ninth IEEE International Conference on Data Mining (ICDM ’09), 2009
"... Abstract—Linear discriminant analysis (LDA) is a well-known dimension reduction approach, which projects high-dimensional data into a low-dimensional space with the best separation of different classes. In many tasks, the data accumulates over time, and thus incremental LDA is more desirable than b ..."
Cited by 5 (0 self)
instances in d-dimensional space. Experimental results show that, compared with state-of-the-art incremental LDA algorithms, our proposed LS-ILDA achieves high accuracy with low time cost. Keywords: dimension reduction; linear discriminant analysis (LDA); incremental learning; least square
Parallel stochastic gradient algorithms for large-scale matrix completion
 MATHEMATICAL PROGRAMMING COMPUTATION, 2013
"... This paper develops Jellyfish, an algorithm for solving data-processing problems with matrix-valued decision variables regularized to have low rank. Particular examples of problems solvable by Jellyfish include matrix completion problems and least-squares problems regularized by the nuclear norm or ..."
Cited by 74 (8 self)
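Jellyfish's parallel scheduling is beyond a short snippet, but the underlying idea it builds on, stochastic gradient descent over observed entries of a factored low-rank model, with a Frobenius-norm penalty on the factors that corresponds to the nuclear norm in factored form, can be sketched as follows. This is illustrative only, not the Jellyfish implementation; sizes, learning rate, and regularization constant are assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)
m, n, r = 30, 20, 2
M = rng.normal(size=(m, r)) @ rng.normal(size=(r, n))   # true rank-2 matrix
mask = rng.random((m, n)) < 0.5                         # ~50% observed entries

U = 0.1 * rng.normal(size=(m, r))                       # factor initializations
V = 0.1 * rng.normal(size=(n, r))
lam, lr = 1e-3, 0.02
obs = np.argwhere(mask)

for epoch in range(200):
    rng.shuffle(obs)                                    # visit entries in random order
    for i, j in obs:
        err = M[i, j] - U[i] @ V[j]
        # One stochastic gradient step per observed entry; the lam terms
        # implement the Frobenius regularizer on the factors.
        U[i], V[j] = (U[i] + lr * (err * V[j] - lam * U[i]),
                      V[j] + lr * (err * U[i] - lam * V[j]))

# Reconstruction error on the entries that were never observed.
rmse = np.sqrt(np.mean((M - U @ V.T)[~mask] ** 2))
```

Each update touches only row `U[i]` and row `V[j]`, which is precisely the access pattern that lets this style of algorithm be partitioned across cores with little locking.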
SRDA: An Efficient Algorithm for Large-Scale Discriminant Analysis
 IEEE Transactions on Knowledge and Data Engineering, 2008
"... Abstract—Linear Discriminant Analysis (LDA) has been a popular method for extracting features that preserve class separability. The projection functions of LDA are commonly obtained by maximizing the between-class covariance and simultaneously minimizing the within-class covariance. It has been wid ..."
Cited by 32 (1 self)
Discriminant Analysis (SRDA). By using spectral graph analysis, SRDA casts discriminant analysis into a regression framework that facilitates both efficient computation and the use of regularization techniques. Specifically, SRDA only needs to solve a set of regularized least squares problems
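The regression framing mentioned in this excerpt can be illustrated with a toy sketch: instead of an eigendecomposition, projection directions come from regularized least-squares fits against class-indicator responses. This is a simplified illustration, not the SRDA algorithm itself; the response construction and constants are assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
# Three Gaussian classes in 4 dimensions, 15 points each.
X = np.vstack([rng.normal([0, 0, 0, 0], 1, (15, 4)),
               rng.normal([4, 0, 0, 0], 1, (15, 4)),
               rng.normal([0, 4, 0, 0], 1, (15, 4))])
labels = np.repeat([0, 1, 2], 15)

# Centered class-indicator responses, one column per class.
Y = np.eye(3)[labels]
Y = Y - Y.mean(axis=0)
Xc = X - X.mean(axis=0)

# One regularized least-squares solve replaces the eigendecomposition.
alpha = 0.1
W = np.linalg.solve(Xc.T @ Xc + alpha * np.eye(4), Xc.T @ Y)
Z = Xc @ W                                   # low-dimensional embedding
```

The solve involves only the d-by-d normal matrix, which is why the regression route scales to large data sets and composes naturally with ridge-style regularization.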
Regularized partial least squares with an application to NMR spectroscopy
 Statistical Analysis and Data Mining
"... High-dimensional data common in genomics, proteomics, and chemometrics often contains complicated correlation structures. Recently, partial least squares (PLS) and Sparse PLS methods have gained attention in these areas as dimension reduction techniques in the context of supervised data analysis. W ..."
Cited by 3 (0 self)
Coil sensitivity encoding for fast MRI
 In Proceedings of the ISMRM 6th Annual Meeting, 1998
"... New theoretical and practical concepts are presented for considerably enhancing the performance of magnetic resonance imaging (MRI) by means of arrays of multiple receiver coils. Sensitivity encoding (SENSE) is based on the fact that receiver sensitivity generally has an encoding effect complementa ..."
Cited by 193 (3 self)
Here we discuss two concepts. The first approach is to choose those voxel functions that exhibit the least-square deviation from the ideal. This criterion entirely determines reconstruction; the approach is therefore referred to as the strong one. In Appendix B it is shown that it yields where C
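The least-squares idea in this excerpt can be illustrated with a toy unfolding step: under R-fold undersampling, each aliased pixel in each coil image is a sensitivity-weighted sum of R true pixel values, giving an overdetermined linear system per aliased pixel. A synthetic, noise-free sketch, not the paper's full derivation; all values are made up:

```python
import numpy as np

rng = np.random.default_rng(0)
n_coils, R = 4, 2          # 4 receiver coils, 2-fold undersampling

# Complex coil sensitivities for the R pixel locations that alias
# onto the same measured position (synthetic values).
S = rng.normal(size=(n_coils, R)) + 1j * rng.normal(size=(n_coils, R))
x_true = np.array([1.0 + 0.5j, -0.3 + 0.2j])   # the R superimposed pixel values
y = S @ x_true                                  # aliased measurement in each coil

# Least-squares unfolding: x = (S^H S)^{-1} S^H y.
x_hat = np.linalg.solve(S.conj().T @ S, S.conj().T @ y)
```

With more coils than superimposed pixels (n_coils > R) and linearly independent sensitivities, the system is overdetermined and the least-squares solve recovers the pixel values exactly in the noise-free case.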
Least Squares Linear Discriminant Analysis
"... Linear Discriminant Analysis (LDA) is a well-known method for dimensionality reduction and classification. LDA in the binary-class case has been shown to be equivalent to linear regression with the class label as the output. This implies that LDA for binary-class classifications can be formulated as ..."
as a least squares problem. With the least squares formulation, LDA can be applied to large-scale classifications by employing existing methods for solving least squares problems, such as those based on conjugate gradient. However, many real-world applications involve multi-class classifications, where a
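The binary-class equivalence stated in this abstract can be checked numerically: with a suitable class coding as the regression target, the ordinary least-squares weight vector on centered data is proportional to the classical LDA direction S_w^{-1}(mu_1 - mu_0). A small sketch; the data and the specific coding are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
X0 = rng.normal(0.0, 1.0, (50, 4))      # class 0
X1 = rng.normal(2.0, 1.0, (50, 4))      # class 1
X = np.vstack([X0, X1])
n0, n1 = len(X0), len(X1)
n = n0 + n1

# A class coding often used in this equivalence: -n/n0 and n/n1.
y = np.concatenate([np.full(n0, -n / n0), np.full(n1, n / n1)])

# Ordinary least squares on centered data.
Xc = X - X.mean(axis=0)
w_ls, *_ = np.linalg.lstsq(Xc, y, rcond=None)

# Classical LDA direction: S_w^{-1} (mu_1 - mu_0).
C0, C1 = X0 - X0.mean(axis=0), X1 - X1.mean(axis=0)
Sw = C0.T @ C0 + C1.T @ C1              # within-class scatter
w_lda = np.linalg.solve(Sw, X1.mean(axis=0) - X0.mean(axis=0))

# The two directions are parallel: cosine similarity is (numerically) 1.
cos = w_ls @ w_lda / (np.linalg.norm(w_ls) * np.linalg.norm(w_lda))
```

The equivalence holds because the total scatter decomposes as within-class scatter plus a rank-one between-class term along mu_1 - mu_0, so the normal equations force the solution onto S_w^{-1}(mu_1 - mu_0).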
A generalized least-square matrix decomposition
 Journal of the American Statistical Association
"... Variables in many massive high-dimensional data sets are structured, arising for example from measurements on a regular grid, as in imaging and time series, or from spatial-temporal measurements, as in climate studies. Classical multivariate techniques ignore these structural relationships, often result ..."
Cited by 13 (6 self)
to a transposable quadratic norm, our decomposition, entitled the Generalized least squares Matrix Decomposition (GMD), directly accounts for structural relationships. As many variables in high-dimensional settings are often irrelevant or noisy, we also regularize our matrix decomposition by adding
A Least Squares Formulation for Canonical Correlation Analysis
"... Canonical Correlation Analysis (CCA) is a well-known technique for finding the correlations between two sets of multidimensional variables. It projects both sets of variables into a lower-dimensional space in which they are maximally correlated. CCA is commonly applied for supervised dimensionality ..."
Cited by 15 (4 self)
mild condition which tends to hold for high-dimensional data, CCA in multi-label classifications can be formulated as a least squares problem. Based on this equivalence relationship, we propose several CCA extensions, including sparse CCA using 1-norm regularization. Experiments on multi-label data sets