Results 1–10 of 31
A universality result for the smallest eigenvalues of certain sample covariance matrices, 2009
Fluctuations of the extreme eigenvalues of finite rank deformations of random matrices. Electron. J. Probab., 2011. Cited by 44 (5 self).

Abstract. Consider a deterministic self-adjoint matrix Xn with spectral measure converging to a compactly supported probability measure, the largest and smallest eigenvalues converging to the edges of the limiting measure. We perturb this matrix by adding a random finite-rank matrix with delocalised eigenvectors and study the extreme eigenvalues of the deformed model. We give necessary conditions on the deterministic matrix Xn so that the eigenvalues converging out of the bulk exhibit Gaussian fluctuations, whereas the eigenvalues sticking to the edges are very close to the eigenvalues of the non-perturbed model and fluctuate on the same scale. We generalize these results to the case when Xn is random and obtain similar behavior when we deform some classical models such as Wigner or Wishart matrices with rather general entries, or the so-called matrix models.
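The sticking/separation dichotomy this abstract describes is easy to observe numerically. The following is a minimal sketch, not taken from the paper — the matrix sizes, the perturbation strengths, and the function name are illustrative. A rank-one additive perturbation θvv⊤ of a normalized Wigner matrix, with a delocalized unit vector v, produces an outlier eigenvalue near θ + 1/θ only when θ > 1; a weaker perturbation leaves the top eigenvalue stuck at the bulk edge 2.

```python
import numpy as np

def deformed_wigner_top_eig(n, theta, rng):
    """Largest eigenvalue of X/sqrt(n) + theta * v v^T, where X is a
    GOE-like Wigner matrix and v is a delocalized unit vector."""
    A = rng.standard_normal((n, n))
    X = (A + A.T) / np.sqrt(2.0)          # symmetric Gaussian matrix
    W = X / np.sqrt(n)                    # bulk spectrum fills [-2, 2]
    v = np.ones(n) / np.sqrt(n)           # maximally delocalized eigenvector
    M = W + theta * np.outer(v, v)        # rank-one additive deformation
    return np.linalg.eigvalsh(M)[-1]      # eigvalsh sorts ascending

rng = np.random.default_rng(0)
n = 1000
# Sub-critical strength (theta < 1): top eigenvalue sticks near the edge 2.
lam_weak = deformed_wigner_top_eig(n, 0.5, rng)
# Super-critical strength (theta > 1): an outlier separates, near theta + 1/theta.
lam_strong = deformed_wigner_top_eig(n, 3.0, rng)
print(lam_weak, lam_strong)
```

With θ = 3 the outlier sits near 3 + 1/3 ≈ 3.33, well outside the bulk, and its fluctuations are on the larger n^(−1/2) scale rather than the n^(−2/3) edge scale — consistent with the Gaussian-versus-edge dichotomy the abstract states.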
Limiting laws of coherence of random matrices with applications to testing covariance structure and construction of compressed sensing matrices. Ann. Stat., 2011. Cited by 30 (10 self).

Abstract. Testing covariance structure is of significant interest in many areas of statistical analysis, and the construction of compressed sensing matrices is an important problem in signal processing. Motivated by these applications, we study in this paper the limiting laws of the coherence of an n × p random matrix in the high-dimensional setting where p can be much larger than n. Both the law of large numbers and the limiting distribution are derived. We then consider testing the bandedness of the covariance matrix of a high-dimensional Gaussian distribution, which includes testing for independence as a special case. The limiting laws of the coherence of the data matrix play a critical role in the construction of the test. We also apply the asymptotic results to the construction of compressed sensing matrices.
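The coherence statistic in question — the largest absolute sample correlation between two distinct columns — is straightforward to compute. Below is a hedged sketch (the function name and the dimensions are illustrative, not from the paper) that also checks the law-of-large-numbers normalization: under independence, L·√(n/log p) is close to 2 when p is large.

```python
import numpy as np

def coherence(X):
    """Coherence of an n x p data matrix: the largest absolute sample
    correlation between two distinct columns."""
    Xc = X - X.mean(axis=0)               # center each column
    Xc /= np.linalg.norm(Xc, axis=0)      # normalize each column
    R = Xc.T @ Xc                         # sample correlation matrix
    np.fill_diagonal(R, 0.0)              # ignore the trivial diagonal
    return np.max(np.abs(R))

rng = np.random.default_rng(1)
n, p = 200, 2000                          # high-dimensional regime: p >> n
L = coherence(rng.standard_normal((n, p)))
# Law of large numbers: L * sqrt(n / log p) converges to 2.
print(L, L * np.sqrt(n / np.log(p)))
```

The test-of-bandedness application in the abstract rejects when this maximal correlation exceeds the threshold implied by the limiting distribution; the simulation above only illustrates the null (independent entries) behavior.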
The largest eigenvalues of sample covariance matrices for a spiked population: diagonal case, 2008
Limits of spiked random matrices, 2013. Cited by 17 (2 self).

Abstract. Given a large, high-dimensional sample from a spiked population, the top sample covariance eigenvalue is known to exhibit a phase transition. We show that the largest eigenvalues have asymptotic distributions near the phase transition in the rank-one spiked real Wishart setting and its general β analogue, proving a conjecture of Baik, Ben Arous and Péché (2005). We also treat shifted-mean Gaussian orthogonal and β ensembles. Such results are entirely new in the real case; in the complex case we strengthen existing results by providing optimal scaling assumptions. One obtains the known limiting random Schrödinger operator on the half-line, but the boundary condition now depends on the perturbation. We derive several characterizations of the limit laws in which β appears as a parameter, including a simple linear boundary value problem. This PDE description recovers known explicit formulas at β = 2, 4, yielding in particular a new and simple proof of the Painlevé representations for these …
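The phase transition referred to here is the BBP transition for spiked sample covariance matrices. The sketch below is a numerical illustration under stated assumptions (the sizes, spike values, and function name are mine, not the paper's): with aspect ratio c = p/n, a population spike ℓ produces an outlier near ℓ(1 + c/(ℓ−1)) when ℓ > 1 + √c, while a sub-critical spike leaves the top sample eigenvalue at the Marchenko–Pastur edge (1 + √c)².

```python
import numpy as np

def top_sample_eig(n, p, spike, rng):
    """Largest eigenvalue of the sample covariance of n observations from
    N(0, Sigma) with Sigma = diag(spike, 1, ..., 1)."""
    X = rng.standard_normal((n, p))
    X[:, 0] *= np.sqrt(spike)             # plant a single spiked direction
    S = X.T @ X / n                       # p x p sample covariance
    return np.linalg.eigvalsh(S)[-1]

rng = np.random.default_rng(2)
n, p = 2000, 500
c = p / n                                 # aspect ratio, here 0.25
edge = (1 + np.sqrt(c)) ** 2              # Marchenko-Pastur right edge, 2.25
# Sub-critical spike (< 1 + sqrt(c) = 1.5): top eigenvalue sticks to the edge.
lam_sub = top_sample_eig(n, p, 1.2, rng)
# Super-critical spike: an outlier emerges near spike * (1 + c / (spike - 1)).
spike = 3.0
lam_sup = top_sample_eig(n, p, spike, rng)
print(lam_sub, edge, lam_sup, spike * (1 + c / (spike - 1)))
```

The paper's contribution concerns the distributions *at* the transition (spike near 1 + √c), a regime this crude simulation does not resolve; the sketch only shows the two limiting phases on either side.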
Factor Modeling for High-Dimensional Time Series: Inference for the Number of Factors. Cited by 16 (2 self).

Abstract. This paper deals with factor modeling for high-dimensional time series from a dimension-reduction viewpoint. Under stationary settings, the inference is simple in the sense that both the number of factors and the factor loadings are estimated via an eigenanalysis of a non-negative definite matrix, and it is therefore applicable when the dimension of the time series is of the order of a few thousand. Asymptotic properties of the proposed method are investigated under two settings: (i) the sample size goes to infinity while the dimension of the time series is fixed; and (ii) both the sample size and the dimension of the time series go to infinity together. In particular, our estimators for zero eigenvalues enjoy faster convergence (or slower divergence) rates, hence making the estimation of the number of factors easier. When the sample size and the dimension of the time series go to infinity together, the estimators for the eigenvalues are no longer consistent; however, our estimator for the number of factors, which is based on the ratios of the estimated eigenvalues, still works well. Furthermore, this estimation exhibits the so-called 'blessing of dimensionality' property, in the sense that the performance of the estimation may improve as the dimension of the time series increases. A two-step procedure is investigated when the factors are of different degrees of strength. Numerical illustration with both simulated and real data is also reported.

Key words and phrases: autocovariance matrices, blessing of dimensionality, eigenanalysis, fast convergence rates, multivariate time series, ratio-based estimator, strength of factors, white noise.
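The eigenanalysis-plus-ratio idea in this abstract can be sketched compactly. What follows is a schematic reimplementation in the spirit of the abstract, not the authors' code: the lag window k0, the toy data-generating process, and all names are illustrative assumptions. A non-negative definite matrix is accumulated from lagged sample autocovariances (white noise contributes only vanishing terms, factors persist), and the number of factors is read off where the ratio of consecutive ordered eigenvalues drops.

```python
import numpy as np

def estimate_num_factors(Y, k0=2):
    """Ratio-based estimate of the number of factors: eigenanalysis of a
    non-negative definite matrix built from lagged sample autocovariances."""
    n, p = Y.shape
    Yc = Y - Y.mean(axis=0)
    M = np.zeros((p, p))
    for k in range(1, k0 + 1):
        S = Yc[k:].T @ Yc[:-k] / n        # lag-k sample autocovariance
        M += S @ S.T                      # non-negative definite accumulation
    lam = np.sort(np.linalg.eigvalsh(M))[::-1]   # descending eigenvalues
    ratios = lam[1:p // 2] / lam[:p // 2 - 1]    # lambda_{i+1} / lambda_i
    return 1 + int(np.argmin(ratios))     # sharpest drop marks the factor count

# Hypothetical toy example: 3 AR(1) factors driving a 20-dimensional series.
rng = np.random.default_rng(3)
n, p, r = 500, 20, 3
A = rng.standard_normal((p, r))           # factor loadings
x = np.zeros((n, r))
for t in range(1, n):
    x[t] = 0.7 * x[t - 1] + rng.standard_normal(r)
Y = x @ A.T + rng.standard_normal((n, p)) # factors plus white noise
print(estimate_num_factors(Y))
```

With strong factors, the first r eigenvalues of M are orders of magnitude above the noise eigenvalues, so the ratio λ_{r+1}/λ_r is by far the smallest — which is exactly why the ratio estimator survives even when the individual eigenvalue estimates are inconsistent.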
On the distribution of the ratio of the largest eigenvalue to the trace of a Wishart matrix. In preparation, 2010. Cited by 13 (2 self).

Abstract. The ratio of the largest eigenvalue to the trace of a p × p random Wishart matrix with n degrees of freedom and identity covariance matrix plays an important role in various hypothesis testing problems, both in statistics and in signal processing. In this paper we derive an approximate explicit expression for the distribution of this ratio by considering the joint limit as both p, n → ∞ with p/n → c. Our analysis reveals that even though the ratio asymptotically follows a Tracy–Widom (TW) distribution in this limit, one of the leading error terms depends on the second derivative of the TW distribution and is non-negligible for practical values of p, in particular for determining tail probabilities. We thus propose to explicitly include this term in the approximate distribution for the ratio. We illustrate empirically, using simulations, that adding this term to the TW distribution yields a quite accurate approximation to the empirical distribution of the ratio, even for small values of p and n.
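The statistic itself is cheap to simulate, which is how one would check any proposed approximation to its distribution. This is a minimal Monte Carlo sketch, not the paper's derivation — the sizes and the function name are illustrative. Under the null, the largest eigenvalue concentrates at n(1 + √(p/n))² while the trace concentrates at np, so the ratio concentrates near (1 + √(p/n))²/p; the TW fluctuations the abstract analyzes live around that centering.

```python
import numpy as np

def largest_eig_to_trace(n, p, rng):
    """Ratio of the largest eigenvalue to the trace of a p x p white
    Wishart matrix with n degrees of freedom."""
    X = rng.standard_normal((n, p))
    W = X.T @ X                           # Wishart(n, I_p)
    lam = np.linalg.eigvalsh(W)
    return lam[-1] / lam.sum()            # trace = sum of eigenvalues

rng = np.random.default_rng(4)
n, p = 400, 100                           # aspect ratio c = p/n = 0.25
ratios = np.array([largest_eig_to_trace(n, p, rng) for _ in range(200)])
# First-order centering of the ratio: (1 + sqrt(p/n))^2 / p.
print(ratios.mean(), (1 + np.sqrt(p / n)) ** 2 / p)
```

Note that dividing by the trace (rather than by the known noise variance) is what makes the statistic pivotal in practice, and also what introduces the correction terms the abstract discusses.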
Tracy–Widom law for the extreme eigenvalues of sample correlation matrices