Results 1–10 of 12
Nonasymptotic theory of random matrices: extreme singular values
Proceedings of the International Congress of Mathematicians, 2010
Random matrices: The distribution of the smallest singular values
2009
Abstract
Cited by 20 (2 self)
Let ξ be a real-valued random variable with mean zero and variance 1. Let Mn(ξ) denote the n × n random matrix whose entries are iid copies of ξ, and let σn(Mn(ξ)) denote the least singular value of Mn(ξ). The quantity σn(Mn(ξ))² is thus the least eigenvalue of the Wishart matrix MnMn*. We show that (under a finite moment assumption) the probability distribution of nσn(Mn(ξ))² is universal in the sense that it does not depend on the distribution of ξ. In particular, it converges to the same limiting distribution as in the special case when ξ is real Gaussian. (The limiting distribution was computed explicitly in this case by Edelman.) We also prove a similar result for complex-valued random variables of mean zero, with real and imaginary parts having variance 1/2 and covariance zero. Similar results are also obtained for the joint distribution of the bottom k singular values of Mn(ξ) for any fixed k (or even for k growing as a small power of n) and for rectangular matrices. Our approach is motivated by the general idea of "property testing" from combinatorics and theoretical computer science. This seems to be a new approach to the study of spectra of random matrices, and it combines tools from various areas of mathematics.
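The universality claim lends itself to a quick numerical check. The sketch below is our illustration, not the paper's method; the sizes (n = 80, 200 trials) and the choice of Rademacher (±1) entries as the non-Gaussian comparison are arbitrary. It compares the empirical distribution of nσn(Mn(ξ))² for Gaussian versus Rademacher entries; the two medians should come out close, as universality predicts.

```python
import numpy as np

# Monte Carlo sketch: compare n * sigma_min(M_n)^2 for Gaussian vs.
# Rademacher (+/-1) entries.  Universality predicts the two empirical
# distributions look alike once n is moderately large.
rng = np.random.default_rng(0)
n, trials = 80, 200

def least_sv_sq_scaled(sampler):
    vals = []
    for _ in range(trials):
        M = sampler((n, n))
        s = np.linalg.svd(M, compute_uv=False)   # singular values, descending
        vals.append(n * s[-1] ** 2)              # n * (least singular value)^2
    return np.array(vals)

gauss = least_sv_sq_scaled(lambda shape: rng.standard_normal(shape))
rade = least_sv_sq_scaled(lambda shape: rng.choice([-1.0, 1.0], size=shape))

med_gauss = np.median(gauss)
med_rade = np.median(rade)
```

For the real Gaussian case Edelman's limiting density gives a median of nσn² near 0.3, so both sample medians should land in that neighborhood.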
Intersection Bounds: Estimation and Inference
2010
Abstract
Cited by 8 (2 self)
We develop a practical and novel method for inference on intersection bounds, namely bounds defined by either the infimum or supremum of a parametric or nonparametric function, or equivalently, the value of a linear programming problem with a potentially infinite constraint set. Our approach is especially convenient in models comprising a continuum of inequalities that are separable in parameters, and also applies to models with inequalities that are nonseparable in parameters. Since analog estimators for intersection bounds can be severely biased in finite samples, routinely underestimating the length of the identified set, we also offer a (downward/upward) median unbiased estimator of these (upper/lower) bounds as a natural byproduct of our inferential procedure. Furthermore, our method appears to be the first and currently only method for inference in nonparametric models with a continuum of inequalities. We develop asymptotic theory for our method based on the strong approximation of a sequence of studentized empirical processes by a sequence of Gaussian or other pivotal processes. We provide conditions for the use of nonparametric kernel and series estimators, including a novel result that establishes strong approximation for general series estimators, which may be of independent interest. We illustrate the usefulness of our method with Monte Carlo experiments and an empirical example.
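The finite-sample bias of analog estimators that the abstract mentions is easy to see in a toy simulation. The sketch below is our illustration, not the paper's procedure; the grid size, noise level, and the flat target function are all arbitrary choices. The plug-in estimator min_t θ̂(t) of a lower-envelope bound is biased downward because the minimum picks up the most negative noise draw.

```python
import numpy as np

# Toy illustration of plug-in bias: the true function is theta(t) = 1 on a
# grid of 50 points, so the intersection bound inf_t theta(t) equals 1.
# Estimating each theta(t) with independent noise and taking the minimum
# systematically undershoots the true bound.
rng = np.random.default_rng(1)
grid_size, reps, noise_sd = 50, 500, 0.5
true_bound = 1.0

plug_in = np.empty(reps)
for r in range(reps):
    theta_hat = true_bound + noise_sd * rng.standard_normal(grid_size)
    plug_in[r] = theta_hat.min()      # analog estimator of inf_t theta(t)

bias = plug_in.mean() - true_bound    # markedly negative
```

The average plug-in estimate sits well below the true bound of 1, which is the phenomenon the paper's median-unbiased estimator is designed to correct.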
Circular law and arc law for truncation of random unitary matrix
 Journal of Mathematical Physics
Abstract
Cited by 3 (3 self)
Let V be the m × m upper-left corner of an n × n Haar-invariant unitary matrix, and let λ1, …, λm be the eigenvalues of V. We prove that the empirical distribution of a normalization of λ1, …, λm converges to the circular law, that is, the uniform distribution on {z ∈ C : |z| ≤ 1}, as m → ∞ with m/n → 0. We also prove that the empirical distribution of λ1, …, λm converges to the arc law, that is, the uniform distribution on {z ∈ C : |z| = 1}, as m/n → 1. These results explain two observations by Życzkowski and Sommers (2000).
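The m/n → 0 regime can be sampled directly. The sketch below is our illustration, with arbitrary sizes n = 400 and m = 20: draw a Haar-invariant unitary matrix via the QR decomposition of a complex Ginibre matrix (with the standard phase correction), truncate to the m × m upper-left corner, and rescale the eigenvalues by √(n/m). The rescaled spectrum should fill the unit disk roughly uniformly, as the circular law predicts.

```python
import numpy as np

# Sample Haar unitaries, truncate, and rescale the corner's eigenvalues.
rng = np.random.default_rng(2)
n, m, reps = 400, 20, 5

scaled = []
for _ in range(reps):
    Z = (rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))) / np.sqrt(2)
    Q, R = np.linalg.qr(Z)
    Q = Q * (np.diag(R) / np.abs(np.diag(R)))   # phase fix -> Haar measure
    V = Q[:m, :m]                               # m x m upper-left corner
    scaled.append(np.sqrt(n / m) * np.linalg.eigvals(V))
scaled = np.concatenate(scaled)

inside = float(np.mean(np.abs(scaled) <= 1.3))  # fraction in/near the unit disk
```

The QR phase correction (scaling each column of Q by the phase of the corresponding diagonal entry of R) is the standard way to make the QR output exactly Haar-distributed.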
Zeta Functions and Chaos
2009
Abstract
The zeta functions of Riemann, Selberg and Ruelle are briefly introduced, along with some others. The Ihara zeta function of a finite graph is our main topic. We consider two determinant formulas for the Ihara zeta function, the Riemann hypothesis, and connections with random matrix theory and quantum chaos.
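One of the determinant formulas for the Ihara zeta function can be exercised on a small example. The sketch below is our illustration (the graph K4 and the test point u = 0.1 are our choices): it evaluates the three-term Bass-type formula 1/ζ(u) = (1 − u²)^(|E|−|V|) det(I − Au + (D − I)u²) on the complete graph K4 and checks it against the closed form that follows from the adjacency eigenvalues {3, −1, −1, −1}.

```python
import numpy as np

# Bass determinant formula for the Ihara zeta function of K4.
A = np.ones((4, 4)) - np.eye(4)       # adjacency matrix of K4
D = 3 * np.eye(4)                     # K4 is 3-regular
E, V = 6, 4                           # edge and vertex counts

def ihara_zeta_inv(u):
    """1/zeta(u) via the determinant formula."""
    return (1 - u**2) ** (E - V) * np.linalg.det(
        np.eye(4) - A * u + (D - np.eye(4)) * u**2)

def closed_form(u):
    """Same quantity factored through the spectrum of A: {3, -1, -1, -1}."""
    return (1 - u**2) ** 2 * (1 - u) * (1 - 2 * u) * (1 + u + 2 * u**2) ** 3

u = 0.1
val = ihara_zeta_inv(u)
```

Since D − I = 2I here, the determinant is det((1 + 2u²)I − uA), which factors over the adjacency eigenvalues as (1 − u)(1 − 2u)(1 + u + 2u²)³, matching the closed form term by term.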
Additive Combinatorics with a View Towards Computer Science and Cryptography: An Exposition
2011
Abstract
Recently, additive combinatorics has blossomed into a vibrant area of the mathematical sciences. But it seems to be a difficult area to define – perhaps because it blends ideas and techniques from several seemingly unrelated contexts. One might say that additive combinatorics is a branch of mathematics concerned with the study of additive structures in sets equipped with a group structure – possibly together with other structure that interacts with the group structure. This newly emerging field has seen tremendous advances over the last few years, and has recently become a focus of attention among both mathematicians and computer scientists. It has been enriched by its formidable links to combinatorics, number theory, harmonic analysis, ergodic theory, and other branches; all of these deeply cross-fertilize each other, holding great promise for all of them. There is a considerable number of remarkable problems, results, and novel applications in this thriving area. In this exposition, we attempt to provide an illuminating overview of some conspicuous breakthroughs in the field, together with a number of seminal applications to sundry parts of mathematics and other disciplines, with emphasis on computer science and cryptography.
Invertibility of symmetric random matrices
2011
Abstract
We study n × n symmetric random matrices H, possibly discrete, with iid above-diagonal entries. We show that H is singular with probability at most exp(−n^c), and that ‖H^−1‖ = O(√n). Furthermore, the spectrum of H is delocalized on the optimal scale o(n^−1/2). These results improve upon a polynomial singularity bound due to Costello, Tao and Vu, and they generalize, up to constant factors, results of Tao and Vu, and Erdős, Schlein
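Both claims are easy to probe numerically. The sketch below is our empirical illustration, not a proof, with arbitrary sizes (n = 100, 50 trials): symmetric Rademacher matrices are, in practice, never singular, and the least singular value lives on the scale n^−1/2, which is exactly the ‖H^−1‖ = O(√n) statement.

```python
import numpy as np

# Symmetric random sign matrices: iid +/-1 above-diagonal entries,
# symmetrized; measure sqrt(n) * sigma_min, which should be order one.
rng = np.random.default_rng(3)
n, trials = 100, 50

scaled_min_sv = []
for _ in range(trials):
    U = rng.choice([-1.0, 1.0], size=(n, n))
    H = np.triu(U) + np.triu(U, 1).T          # symmetric, iid above diagonal
    s_min = np.linalg.svd(H, compute_uv=False)[-1]
    scaled_min_sv.append(np.sqrt(n) * s_min)

scaled_min_sv = np.array(scaled_min_sv)
all_invertible = bool(np.all(scaled_min_sv > 0))
med = np.median(scaled_min_sv)                # typically of order 1
```

The semicircle law puts the mean eigenvalue spacing near zero at about π/√n for this normalization, so √n·σ_min of order one is what delocalization on scale o(n^−1/2) would lead one to expect.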
Here
2010
Abstract
Given a finite positive measure μ on the real line, with infinitely many points in its support, we can define orthonormal polynomials {p_n}, n = 0, 1, 2, …, satisfying, for all m, n ≥ 0, ∫ p_n p_m dμ = δ_mn.
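A concrete instance of this definition (our worked example, not from the paper): take dμ = dx on [−1, 1], a finite positive measure with infinite support, and build p_0, …, p_4 by Gram–Schmidt on the monomials, integrating with Gauss–Legendre quadrature. The Gram matrix of pairwise integrals ∫ p_n p_m dμ should come out as the identity; the p_n recovered this way are the normalized Legendre polynomials.

```python
import numpy as np

# Orthonormal polynomials for d(mu) = dx on [-1, 1] via Gram-Schmidt.
deg = 5
x, w = np.polynomial.legendre.leggauss(2 * deg)   # exact for these degrees

# rows: values of the monomials 1, x, ..., x^(deg-1) at the quadrature nodes
mono = np.vstack([x**k for k in range(deg)])

polys = []
for k in range(deg):
    v = mono[k].copy()
    for p in polys:                        # subtract projections onto earlier p_j
        v -= (w * p * v).sum() * p
    v /= np.sqrt((w * v * v).sum())        # normalize: integral of p_k^2 dmu = 1
    polys.append(v)

# Gram matrix of pairwise integrals: should be the 5 x 5 identity.
gram = np.array([[(w * p * q).sum() for q in polys] for p in polys])
```

Working with values at quadrature nodes keeps the integrals exact here, since Gauss–Legendre with 10 nodes integrates polynomials up to degree 19 and the products p_n p_m have degree at most 8.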