Results 1–10 of 73
Logarithmic concave measures with application to ...
 Acta Scientiarum Mathematicarum, 32 (1971), pp. 301–316.
Concentration of the Spectral Measure for Large Matrices
, 2000
Abstract

Cited by 65 (11 self)
We derive concentration inequalities for functions of the empirical measure of eigenvalues for large, random, self-adjoint matrices, with not necessarily Gaussian entries. The results presented apply in particular to non-Gaussian Wigner and Wishart matrices. We also provide concentration bounds for non-commutative functionals of random matrices.

1 Introduction and statement of results. Consider a random N × N Hermitian matrix X with i.i.d. complex entries (except for the symmetry constraint) satisfying a moment condition. It has been well known since Wigner [28] that the spectral measure of N^{-1/2} X converges to the semicircle law. This observation has been generalized to a large class of matrices, e.g. sample covariance matrices of the form XRX* where R is a deterministic diagonal matrix ([19]), band matrices (see [5, 16, 20]), etc. For the Wigner case, this convergence has been supplemented by Central Limit Theorems; see [15] for the case of Gaussian entries and [17], [22] for the gen...
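The semicircle convergence recalled in this abstract is easy to check numerically. The sketch below (numpy; GUE-type Gaussian entries chosen purely for illustration, though the result holds more generally) builds N^{-1/2} X and compares its empirical spectral measure with the semicircle density.

```python
import numpy as np

def wigner_spectrum(n, rng):
    """Eigenvalues of a scaled n x n Hermitian Wigner matrix N^{-1/2} X."""
    # Complex Gaussian entries; Hermitize so off-diagonal entries have unit variance.
    a = (rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))) / np.sqrt(2)
    x = (a + a.conj().T) / np.sqrt(2)
    return np.linalg.eigvalsh(x) / np.sqrt(n)

rng = np.random.default_rng(0)
eigs = wigner_spectrum(500, rng)

# Compare the empirical measure with the semicircle density on [-2, 2]:
# rho(t) = sqrt(4 - t^2) / (2 pi).
hist, edges = np.histogram(eigs, bins=40, range=(-2.2, 2.2), density=True)
mid = (edges[:-1] + edges[1:]) / 2
semicircle = np.sqrt(np.clip(4 - mid**2, 0, None)) / (2 * np.pi)
print(np.max(np.abs(hist - semicircle)))  # small for large n
```

For n = 500 the spectrum already lies essentially inside [-2, 2] and the histogram tracks the semicircle closely; the concentration results surveyed above quantify how unlikely large deviations of such spectral functionals are.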
Algebraic factor analysis: tetrads, pentads and beyond
Abstract

Cited by 28 (12 self)
Factor analysis refers to a statistical model in which observed variables are conditionally independent given fewer hidden variables, known as factors, and all the random variables follow a multivariate normal distribution. The parameter space of a factor analysis model is a subset of the cone of positive definite matrices. This parameter space is studied from the perspective of computational algebraic geometry. Gröbner bases and resultants are applied to compute the ideal of all polynomial functions that vanish on the parameter space. These polynomials, known as model invariants, arise from rank conditions on a symmetric matrix under elimination of the diagonal entries of the matrix. Besides revealing the geometry of the factor analysis model, the model invariants also furnish useful statistics for testing goodness-of-fit.
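The rank conditions behind the tetrads can be illustrated numerically: in a one-factor model the covariance is Σ = λλᵀ + Ψ with Ψ diagonal, so every off-diagonal entry is σ_ij = λ_i λ_j and the tetrad σ₁₂σ₃₄ − σ₁₃σ₂₄ vanishes identically. The loadings and error variances below are illustrative choices, not values from the paper.

```python
import numpy as np

# One-factor model: Sigma = lam lam^T + Psi, with Psi diagonal.
# Off the diagonal, sigma_ij = lam_i lam_j, so the tetrad polynomial
# sigma_12 sigma_34 - sigma_13 sigma_24 is a model invariant: it
# vanishes for every choice of loadings and error variances.
lam = np.array([0.9, -0.4, 0.7, 0.3])   # hypothetical factor loadings
psi = np.diag([0.2, 0.5, 0.3, 0.8])     # hypothetical error variances
sigma = np.outer(lam, lam) + psi

tetrad = sigma[0, 1] * sigma[2, 3] - sigma[0, 2] * sigma[1, 3]
print(tetrad)  # vanishes up to floating-point error
```

The diagonal entries of Σ are not constrained this way (they also contain Ψ), which is why the paper's invariants come from rank conditions after eliminating the diagonal.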
High dimensional statistical inference and random matrices
 In: Proceedings of the International Congress of Mathematicians
, 2006
Abstract

Cited by 25 (1 self)
Multivariate statistical analysis is concerned with observations on several variables which are thought to possess some degree of interdependence. Driven by problems in genetics and the social sciences, it first flowered in the earlier half of the last century. Subsequently, random matrix theory (RMT) developed, initially within physics, and more recently widely in mathematics. While some of the central objects of study in RMT are identical to those of multivariate statistics, statistical theory was slow to exploit the connection. However, with vast data collection ever more common, data sets now often have as many or more variables than the number of individuals observed. In such contexts, the techniques and results of RMT have much to offer multivariate statistics. The paper reviews some of the progress to date.
Orthogonal polynomial ensembles in probability theory
 Prob. Surv.
, 2005
Abstract

Cited by 24 (1 self)
We survey a number of models from physics, statistical mechanics, probability theory and combinatorics, which are each described in terms of an orthogonal polynomial ensemble. The most prominent example is apparently the Hermite ensemble, the eigenvalue distribution of the Gaussian Unitary Ensemble (GUE), and other well-known ensembles in random matrix theory like the Laguerre ensemble for the spectrum of Wishart matrices. In recent years, a number of further interesting models were found to lead to orthogonal polynomial ensembles, among which the corner growth model, directed last passage percolation, the PNG droplet, non-colliding random processes, the length of the longest increasing subsequence of a random permutation, and others. Much attention has been paid to universal classes of asymptotic behaviors of these models in the limit of large particle numbers, in particular the spacings between the particles and the fluctuation behavior of the largest particle. Computer simulations suggest that the connections go even farther
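The particle interaction in an orthogonal polynomial ensemble enters through the squared Vandermonde factor ∏_{i<j}|x_i − x_j|², which makes nearby eigenvalues repel. A small numpy simulation of the Hermite/GUE ensemble shows this repulsion in the bulk nearest-neighbour spacings (the 0.1 cutoff below is an arbitrary illustrative threshold, not from the survey):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 400
# GUE: Hermitian matrix with Gaussian entries; its eigenvalues follow
# the Hermite orthogonal polynomial ensemble mentioned above.
a = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
h = (a + a.conj().T) / 2
eigs = np.sort(np.linalg.eigvalsh(h))

# Nearest-neighbour spacings in the bulk, normalized to unit mean.
bulk = eigs[n // 4: 3 * n // 4]
s = np.diff(bulk)
s /= s.mean()
# Repulsion: very small normalized spacings are rare; for independent
# (Poissonian) points about 9.5% of spacings would fall below 0.1.
print(np.mean(s < 0.1))
```

For the GUE the spacing density vanishes quadratically at zero, so essentially no spacings fall below the cutoff, in contrast to the Poisson benchmark noted in the comment.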
Developments in random matrix theory
 J. Phys. A: Math. Gen.
, 2000
Abstract

Cited by 17 (0 self)
In this preface to the Journal of Physics A, Special Edition on Random Matrix Theory, we give a review of the main historical developments of random matrix theory. A short summary of the papers that appear in this special edition is also given.
How to generate random matrices from the classical compact groups
 Notices of the AMS
, 2007
Abstract

Cited by 17 (0 self)
... found applications in a variety of areas of physics, pure and applied mathematics, probability, statistics, and engineering. A few examples—far from being exhaustive—include: analytic number theory, combinatorics, graph theory, multivariate statistics, nuclear physics, quantum chaos, quantum information, statistical mechanics, structural dynamics, and wireless telecommunications. The reasons for the ever-growing success of RMT are mainly two. Firstly, in the limit of large matrix dimension the statistical correlations of the spectra of a family, or ensemble, of matrices are independent of the probability distribution that defines the ensemble, but depend only on the invariant properties of such a distribution. As a consequence, random matrices turn out to be very accurate models for a large number of mathematical and physical problems. Secondly, RMT techniques allow analytical computations to an extent that is often impossible to achieve in the contexts that they are modelling. This predictive ability of RMT is particularly powerful whenever in the original problem there are no natural parameters to average over. Although the advantage of using RMT lies in the possibility of computing explicit mathematical and physical quantities analytically, it is sometimes necessary to resort to numerical simulations. The purpose of this article is twofold. Firstly, we provide the reader with a simple method for generating random matrices from the classical compact groups that most mathematicians—not necessarily familiar with computer programming—should ...
Francesco Mezzadri is a Lecturer in Applied Mathematics
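The kind of generation procedure described in this abstract can be sketched in numpy for the unitary group: draw a complex Ginibre matrix, take its QR decomposition, and correct the arbitrary phases coming from the diagonal of R so that the result is Haar-distributed on U(n). This is a minimal illustration of the standard QR-based recipe, not a reproduction of the article's code.

```python
import numpy as np

def haar_unitary(n, rng):
    """Draw an n x n matrix approximately Haar-distributed on U(n).

    Plain QR of a Ginibre matrix is NOT Haar-distributed: numerical QR
    fixes an arbitrary phase convention for each column. Rescaling each
    column of Q by the phase of the corresponding diagonal entry of R
    removes that bias.
    """
    z = (rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))) / np.sqrt(2)
    q, r = np.linalg.qr(z)
    d = np.diagonal(r)
    return q * (d / np.abs(d))   # multiply column j by the phase of r_jj

rng = np.random.default_rng(2)
u = haar_unitary(4, rng)
print(np.allclose(u.conj().T @ u, np.eye(4)))  # True: u is unitary
```

The phase correction is what makes the distribution invariant under left and right multiplication by fixed unitaries, which is the defining property of Haar measure on the compact group.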