Results 1 – 10 of 1,245,655
On low rank perturbations of matrices
2008
Abstract: The article is devoted to different aspects of the question: "What can be done with a matrix by a low rank perturbation?" It is proved that one can change a geometrically simple spectrum drastically by a rank-1 perturbation, but the situation is quite different if one restricts oneself to normal matrices.
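The rank-1 claim can be illustrated numerically (a small sketch, not the paper's construction): adding a single rank-1 term to a nilpotent Jordan block, which has a geometrically simple spectrum {0}, produces a companion matrix, so the spectrum can be moved to any prescribed set.

```python
import numpy as np

# A single 4x4 Jordan block J_4(0): geometrically simple, spectrum {0}.
n = 4
A = np.diag(np.ones(n - 1), 1)

# Target spectrum {1, 2, 3, 4}; np.poly returns the monic characteristic
# polynomial coefficients [1, a3, a2, a1, a0] with these roots.
coeffs = np.poly([1.0, 2.0, 3.0, 4.0])

# Rank-1 perturbation: only the last row is nonzero, so rank(B) = 1.
# A + B is then the companion matrix of the target polynomial.
B = np.zeros((n, n))
B[-1, :] = -coeffs[::-1][:-1]   # last row = (-a0, -a1, -a2, -a3)

eigs = np.sort(np.linalg.eigvals(A + B).real)
print(eigs)   # ≈ [1. 2. 3. 4.]
```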
Low rank perturbation . . .
Linear Algebra and Its Applications 430 (2009) 579–586
LOW RANK PERTURBATION OF JORDAN STRUCTURE
2003
Cited by 9 (2 self)
Abstract: Let A be a matrix and λ0 be one of its eigenvalues having g elementary Jordan blocks in the Jordan canonical form of A. We show that for most matrices B satisfying rank(B) ≤ g, the Jordan blocks of A + B with eigenvalue λ0 are just the g − rank(B) smallest Jordan blocks of A with eigenvalue λ0. The set of matrices for which this behavior does not happen is explicitly characterized through a scalar determinantal equation involving B and some of the λ0-eigenvectors of A. Thus, except for a set of zero Lebesgue measure, a low rank perturbation A + B of A destroys for each of its eigenvalues exactly …
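The statement can be checked on a tiny example (an illustrative sketch; the matrix and perturbation below are ad hoc choices, not from the paper). For A = J2(0) ⊕ J1(0), the eigenvalue 0 has g = 2 Jordan blocks, so a generic rank-1 B should leave only the g − 1 = 1 smallest block, i.e. 0 survives in A + B with algebraic multiplicity 1:

```python
import sympy as sp

# A = J_2(0) ⊕ J_1(0): eigenvalue 0 with g = 2 Jordan blocks (sizes 2 and 1).
A = sp.Matrix([[0, 1, 0],
               [0, 0, 0],
               [0, 0, 0]])

# A generic rank-1 perturbation: the all-ones matrix [1,1,1]^T [1,1,1].
B = sp.ones(3, 3)

print(A.eigenvals())        # eigenvalue 0 with algebraic multiplicity 3
print((A + B).eigenvals())  # eigenvalue 0 now has multiplicity g - rank(B) = 1
```

The two other eigenvalues of A + B are the roots of t² − 3t − 1, well away from 0; only the smallest Jordan block at 0 survives, as the paper predicts.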
Approximating the Covariance Matrix with Low-rank Perturbations
Cited by 2 (2 self)
Abstract: Covariance matrices capture correlations that are invaluable in modeling real-life datasets. Using all d² elements of the covariance (in d dimensions) is costly and could result in overfitting, and the simple diagonal approximation can be over-restrictive. We present an algorithm that improves upon the diagonal matrix by allowing a low-rank perturbation. The efficiency is comparable to the diagonal approximation, yet one can capture correlations among the dimensions. We show that this method outperforms the diagonal when training GMMs on both synthetic and real-world data.
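The general idea can be sketched as follows (a minimal illustration; the variable names, rank, and sizes are assumptions, not the authors' algorithm): model the covariance as Σ = D + WWᵀ with a diagonal D and a d×k factor W, so storage and matrix–vector products cost O(dk) instead of O(d²):

```python
import numpy as np

rng = np.random.default_rng(0)
d, k = 100, 3   # ambient dimension and (small) perturbation rank

# Low-rank-plus-diagonal covariance model: Sigma = diag(D) + W @ W.T
D = rng.uniform(0.5, 2.0, size=d)      # diagonal part (per-dimension variances)
W = rng.standard_normal((d, k))        # low-rank factor capturing correlations

x = rng.standard_normal(d)

# Matrix-vector product without ever forming the d x d matrix: O(dk) work.
y_implicit = D * x + W @ (W.T @ x)

# Reference: the explicit d x d covariance.
Sigma = np.diag(D) + W @ W.T
y_explicit = Sigma @ x

print(np.allclose(y_implicit, y_explicit))  # True
```

For likelihood evaluation inside a GMM, Σ⁻¹ and log det Σ would similarly be computed from D and W via the Woodbury identity and the matrix determinant lemma, keeping the per-step cost linear in d.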
Approximating the Covariance Matrix of GMMs with Low-rank Perturbations
Cited by 2 (0 self)
Abstract: Covariance matrices capture correlations that are invaluable in modeling real-life datasets. Using all d² elements of the covariance (in d dimensions) is costly and could result in overfitting, and the simple diagonal approximation can be over-restrictive. In this work, we present a new model, the Low-Rank Gaussian Mixture Model (LRGMM), for modeling data which can be extended to identifying partitions or overlapping clusters. The curse of dimensionality that arises in calculating the covariance matrices of the GMM is countered by using low-rank perturbed diagonal matrices. The efficiency …
HIGHER RANK NUMERICAL RANGES AND LOW RANK PERTURBATIONS OF QUANTUM CHANNELS
Abstract: For a positive integer k, the rank-k numerical range Λk(A) of an operator A acting on a Hilbert space H of dimension at least k is the set of scalars λ such that PAP = λP for some rank-k orthogonal projection P. In this paper, a close connection between low rank perturbation of an operator …
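The definition can be made concrete on a Hermitian example (an illustrative check, not taken from the paper): for A = diag(1, 2, 3, 4) and k = 2, the scalar λ = 2.5 lies in Λ2(A), witnessed by a rank-2 projection built from unit vectors supported on the eigenvalue pairs {1, 4} and {2, 3}:

```python
import numpy as np

A = np.diag([1.0, 2.0, 3.0, 4.0])
lam = 2.5

# Unit vector in span{e1, e4} with Rayleigh quotient lam:
# c^2 * 1 + s^2 * 4 = 2.5  =>  s^2 = 0.5
s1 = np.sqrt((lam - 1.0) / 3.0)
x1 = np.array([np.sqrt(1 - s1**2), 0.0, 0.0, s1])

# Unit vector in span{e2, e3} with Rayleigh quotient lam:
# c^2 * 2 + s^2 * 3 = 2.5  =>  s^2 = 0.5
s2 = np.sqrt(lam - 2.0)
x2 = np.array([0.0, np.sqrt(1 - s2**2), s2, 0.0])

# Rank-2 orthogonal projection onto span{x1, x2}; x1 ⟂ x2 by construction,
# and the disjoint supports also kill the cross terms x1^T A x2.
P = np.outer(x1, x1) + np.outer(x2, x2)

print(np.allclose(P @ A @ P, lam * P))  # True: PAP = λP, so λ ∈ Λ2(A)
```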
The singular values and vectors of low rank perturbations of large rectangular random matrices
J. Multivariate Anal.
Cited by 25 (6 self)
Abstract: In this paper, we consider the singular values and singular vectors of finite, low rank perturbations of large rectangular random matrices. Specifically, we prove almost sure convergence of the extreme singular values and appropriate projections of the corresponding singular vectors of the …
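The phenomenon is easy to observe numerically (a rough illustration with ad hoc parameters, not the paper's theorem statement): for an n×n matrix with i.i.d. N(0, 1/n) entries, the singular values fill [0, 2], and a rank-1 perturbation of strength θ > 1 pushes one singular value out of the bulk, to roughly θ + 1/θ in this square case:

```python
import numpy as np

rng = np.random.default_rng(42)
n, theta = 1000, 3.0

X = rng.standard_normal((n, n)) / np.sqrt(n)   # bulk singular values fill [0, 2]

u = np.zeros(n); u[0] = 1.0                    # fixed unit vectors for the spike
v = np.zeros(n); v[0] = 1.0
P = theta * np.outer(u, v)                     # rank-1 perturbation of strength theta

s_bulk = np.linalg.svd(X, compute_uv=False)[0]      # largest s.v. of X alone
s_spiked = np.linalg.svd(X + P, compute_uv=False)[0]  # largest s.v. after the spike

print(s_bulk)     # ≈ 2, the edge of the bulk
print(s_spiked)   # ≈ theta + 1/theta ≈ 3.33, an outlier detached from the bulk
```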
Higher rank numerical ranges and low rank perturbation of quantum channels
J. Math. Anal. Appl.
Cited by 17 (7 self)
Abstract: For a positive integer k, the rank-k numerical range Λk(A) of an operator A acting on a Hilbert space H of dimension at least k is the set of scalars λ such that PAP = λP for some rank-k orthogonal projection P. In this paper, a close connection between low rank perturbation of an operator …