Results 1–10 of 163
FINDING STRUCTURE WITH RANDOMNESS: PROBABILISTIC ALGORITHMS FOR CONSTRUCTING APPROXIMATE MATRIX DECOMPOSITIONS
"... Lowrank matrix approximations, such as the truncated singular value decomposition and the rankrevealing QR decomposition, play a central role in data analysis and scientific computing. This work surveys and extends recent research which demonstrates that randomization offers a powerful tool for ..."
Cited by 253 (6 self)
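The randomized approach surveyed above admits a very short implementation: sample the range of A with a Gaussian test matrix, orthonormalize the sample, then compute the SVD of the small projected matrix. A minimal numpy sketch of this two-stage scheme (function and parameter names are illustrative, not from the paper):

```python
import numpy as np

def randomized_svd(A, k, oversample=10, seed=0):
    """Sketch of a randomized truncated SVD.

    Stage A: randomized range finder with a Gaussian test matrix.
    Stage B: deterministic SVD of the small projected matrix B = Q^T A.
    """
    rng = np.random.default_rng(seed)
    m, n = A.shape
    Omega = rng.standard_normal((n, k + oversample))   # Gaussian test matrix
    Q, _ = np.linalg.qr(A @ Omega)                     # orthonormal basis for the sampled range
    B = Q.T @ A                                        # small (k+p) x n matrix
    U_hat, s, Vt = np.linalg.svd(B, full_matrices=False)
    U = Q @ U_hat                                      # lift back to the original space
    return U[:, :k], s[:k], Vt[:k, :]

# Usage: a matrix of exact rank 5 is recovered to machine precision.
rng = np.random.default_rng(1)
A = rng.standard_normal((100, 5)) @ rng.standard_normal((5, 80))
U, s, Vt = randomized_svd(A, k=5)
err = np.linalg.norm(A - (U * s) @ Vt) / np.linalg.norm(A)
```

Because the test matrix oversamples the 5-dimensional range, the basis Q captures it exactly and the relative error is at the level of roundoff.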
Solving rank-deficient and ill-posed problems using UTV and QR factorizations
 SIAM J. Matrix Anal. Appl.
"... The algorithm of Mathias and Stewart [A block QR algorithm and the singular value decomposition, Linear Algebra and Its Applications, 182:91100, 1993] is examined as a tool for constructing regularized solutions to rankdeficient and illposed linear equations. The algorithm is based on a sequence ..."
Cited by 4 (3 self)
A Perturbation Analysis for R in the QR Factorization
 In preparation
, 1995
"... We present new normwise and componentwise perturbation analyses for the R factor of the QR factorization A = Q1R of an m \Theta n matrix A with full column rank. The analyses more accurately reflect the sensitivity of the problem than previous normwise and componentwise results. The new condition nu ..."
Cited by 8 (6 self)
Using perturbed QR factorizations to solve linear least-squares problems
 SIAM J. Matrix Anal. Appl
, 2009
"... Introduction This talk will show that the R factor from the QR factorization of a perturbation A of a matrix A is an effective leastsquares preconditioner for A. More specifically, we will show that the R factor of the perturbation is an effective preconditioner if the perturbation can be expresse ..."
Cited by 8 (8 self)
They allow us to solve what-if scenarios. They allow us to solve numerically rank-deficient least-squares problems without a rank-revealing factorization. Some of the results presented were already known experimentally (for example in [?]), but apparently without an analysis of eigenvalues.
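The claim in the abstract above can be illustrated with a small numpy experiment: if R comes from the QR factorization of a slightly perturbed copy of A, then A R⁻¹ is nearly orthonormal, which is exactly what makes R an effective least-squares preconditioner for iterative solvers. The perturbation size below is an arbitrary illustrative choice, not a value from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 200, 20
A = rng.standard_normal((m, n))

# Hypothetical small perturbation of A; its R factor is the preconditioner.
A_pert = A + 1e-6 * rng.standard_normal((m, n))
_, R = np.linalg.qr(A_pert)           # reduced QR: R is n x n upper triangular

# A @ inv(R) is close to the orthonormal Q factor of A_pert, so its
# condition number is close to 1 and iterative least-squares methods
# applied to the preconditioned system converge in very few steps.
M = A @ np.linalg.inv(R)
cond_A = np.linalg.cond(A)
cond_M = np.linalg.cond(M)
```

In practice one would apply R⁻¹ by triangular solves rather than forming the inverse; the explicit inverse here just keeps the demonstration short.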
Efficient Rank-one Residue Approximation Method for Graph Regularized Nonnegative Matrix Factorization
"... Abstract. Nonnegative matrix factorization (NMF) aims to decompose a given data matrix X into the product of two lowerrank nonnegative factor matrices UV T. Graph regularized NMF (GNMF) is a recently proposed NMF method that preserves the geometric structure of X during such decomposition. Althoug ..."
Online Learning in the Embedded Manifold of Low-rank Matrices
"... When learning models that are represented in matrix forms, enforcing a lowrank constraint can dramatically improve the memory and run time complexity, while providing a natural regularization of the model. However, naive approaches to minimizing functions over the set of lowrank matrices are eithe ..."
Cited by 14 (0 self)
are either prohibitively time consuming (repeated singular value decomposition of the matrix) or numerically unstable (optimizing a factored representation of the low-rank matrix). We build on recent advances in optimization over manifolds, and describe an iterative online learning procedure, consisting of a
Finding structure with randomness: Stochastic algorithms for constructing approximate matrix decompositions
, 2009
"... Lowrank matrix approximations, such as the truncated singular value decomposition and the rankrevealing QR decomposition, play a central role in data analysis and scientific computing. This work surveys recent research which demonstrates that randomization offers a powerful tool for performing l ..."
Cited by 62 (4 self)
Limited-Memory Fast Gradient Descent Method for Graph Regularized Nonnegative Matrix Factorization
"... Graph regularized nonnegative matrix factorization (GNMF) decomposes a nonnegative data matrix X[Rmn to the product of two lowerrank nonnegative factor matrices, i.e., W[Rmr and H[Rrn (rvminfm,ng) and aims to preserve the local geometric structure of the dataset by minimizing squared Euclidean d ..."
Cited by 1 (1 self)
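The factorization objective common to the GNMF papers above can be sketched with the classical multiplicative updates of Lee and Seung for plain NMF; the graph-regularization term is omitted here for brevity, so this is a simplified illustration, not the papers' methods:

```python
import numpy as np

def nmf(X, r, iters=500, seed=0, eps=1e-10):
    """Minimal sketch of NMF via multiplicative updates.

    Minimizes ||X - W H||_F^2 over nonnegative W (m x r) and H (r x n).
    eps guards the denominators against division by zero.
    """
    rng = np.random.default_rng(seed)
    m, n = X.shape
    W = rng.random((m, r))
    H = rng.random((r, n))
    for _ in range(iters):
        # Each update multiplies by a nonnegative ratio, so W, H stay >= 0.
        H *= (W.T @ X) / (W.T @ W @ H + eps)
        W *= (X @ H.T) / (W @ H @ H.T + eps)
    return W, H

# Usage: factor an exactly rank-4 nonnegative matrix with r = 4.
rng = np.random.default_rng(1)
X = rng.random((30, 4)) @ rng.random((4, 25))
W, H = nmf(X, r=4)
rel_err = np.linalg.norm(X - W @ H) / np.linalg.norm(X)
```

The multiplicative form is what keeps the iterates nonnegative without any projection step; GNMF modifies the H update with graph Laplacian terms.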
Online Learning in the Manifold of Low-Rank Matrices
"... When learning models that are represented in matrix forms, enforcing a lowrank constraint can dramatically improve the memory and run time complexity, while providing a natural regularization of the model. However, naive approaches for minimizing functions over the set of lowrank matrices are eith ..."
Cited by 9 (0 self)
are either prohibitively time consuming (repeated singular value decomposition of the matrix) or numerically unstable (optimizing a factored representation of the low-rank matrix). We build on recent advances in optimization over manifolds, and describe an iterative online learning procedure, consisting of a
Elastic-Net Regularization of Singular Values for Robust Subspace Learning
"... Learning a lowdimensional structure plays an important role in computer vision. Recently, a new family of methods, such as l1 minimization and robust principal component analysis, has been proposed for lowrank matrix approximation problems and shown to be robust against outliers and missing da ..."
data. But these methods often require heavy computational load and can fail to find a solution when highly corrupted data are presented. In this paper, an elastic-net regularization based low-rank matrix factorization method for subspace learning is proposed. The proposed method finds a robust
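An elastic-net penalty on singular values can be illustrated as a proximal step: soft-threshold each singular value (the l1 part) and then shrink it (the l2 part). The numpy sketch below is a generic illustration of that operator; the names lam1/lam2 are chosen here and are not taken from the paper:

```python
import numpy as np

def elastic_net_sv_shrink(A, lam1, lam2):
    """Apply the elastic-net proximal operator to the singular values of A.

    Each singular value s becomes max(s - lam1, 0) / (1 + lam2):
    lam1 soft-thresholds (promoting low rank), lam2 shrinks (l2 damping).
    """
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    s_shrunk = np.maximum(s - lam1, 0.0) / (1.0 + lam2)
    return (U * s_shrunk) @ Vt   # broadcast-scale columns of U by s_shrunk

# Usage: small singular values are zeroed, so the result is low rank.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 30))
L = elastic_net_sv_shrink(A, lam1=2.0, lam2=0.5)
rank_L = np.linalg.matrix_rank(L)
```

Soft-thresholding alone recovers the familiar singular value thresholding operator; the extra 1/(1 + lam2) factor is what the elastic-net's quadratic term contributes.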