Results 1–10 of 38
Gaussian processes: inequalities, small ball probabilities and applications. Stochastic processes: theory and methods
 Handbook of Statistics
, 2001
"... Edited by C.R. Rao and D. Shanbhag. 1 ..."
Approximation, Metric Entropy and Small Ball Estimates for Gaussian Measures
 Ann. Probab
, 1999
"... A precise link proved by J. Kuelbs and W. V. Li relates the small ball behavior of a Gaussian measure on a Banach space E with the metric entropy behavior of K , the unit ball of the RKHS of in E. We remove the main regularity assumption imposed on the unknown function in the link. This enables t ..."
Abstract

Cited by 42 (20 self)
A precise link proved by J. Kuelbs and W. V. Li relates the small ball behavior of a Gaussian measure on a Banach space E with the metric entropy behavior of K, the unit ball of the RKHS of µ in E. We remove the main regularity assumption imposed on the unknown function in the link. This enables the application of tools and results from functional analysis to small ball problems and leads to small ball estimates of general algebraic type as well as to new estimates for concrete Gaussian processes. Moreover, we show that the small ball behavior of a Gaussian process is also tightly connected with the speed of approximation by "finite rank" processes. Abbreviated title: Metric Entropy and Small Ball Estimates. Keywords: Gaussian process, small deviation, metric entropy, approximation number. AMS 1991 Subject Classifications: Primary: 60G15; Secondary: 60F99, 47D50, 47G10. Supported in part by NSF. 1 Introduction. Let µ denote a centered Gaussian measure on a real separable B...
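For orientation, the link referred to in this abstract can be stated in its polynomial-rate form roughly as follows (this formulation is supplied from general knowledge of the Kuelbs-Li correspondence, not quoted from the truncated abstract): with φ(ε) the small ball function of µ and H(ε) the metric entropy of the RKHS unit ball K,

```latex
% phi = small ball function of mu, H = metric entropy of the RKHS unit ball K
\varphi(\varepsilon) = -\log \mu\{x \in E : \|x\|_E \le \varepsilon\},
\qquad
H(\varepsilon) = \log N(\varepsilon, K),
% polynomial-rate form of the small ball / metric entropy link:
\varphi(\varepsilon) \asymp \varepsilon^{-\alpha}
\quad\Longleftrightarrow\quad
H(\varepsilon) \asymp \varepsilon^{-2\alpha/(2+\alpha)},
\qquad \alpha > 0.
```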
Concentration Inequalities Using the Entropy Method
, 2002
"... We investigate a new methodology... The main purpose of this paper is to point out the simplicity and the generality of the approach. We show how the new method can recover many of Talagrand's revolutionary inequalities and provide new applications in a variety of problems including Rademacher ..."
Abstract

Cited by 36 (3 self)
We investigate a new methodology... The main purpose of this paper is to point out the simplicity and the generality of the approach. We show how the new method can recover many of Talagrand's revolutionary inequalities and provide new applications in a variety of problems including Rademacher averages, Rademacher chaos, the number of certain small subgraphs in a random graph, and the minimum of the empirical risk in some statistical estimation problems.
Concentration inequalities
 Advanced Lectures in Machine Learning
, 2004
"... Abstract. Concentration inequalities deal with deviations of functions of independent random variables from their expectation. In the last decade new tools have been introduced making it possible to establish simple and powerful inequalities. These inequalities are at the heart of the mathematical a ..."
Abstract

Cited by 35 (1 self)
Abstract. Concentration inequalities deal with deviations of functions of independent random variables from their expectation. In the last decade new tools have been introduced, making it possible to establish simple and powerful inequalities. These inequalities are at the heart of the mathematical analysis of various problems in machine learning and made it possible to derive new efficient algorithms. This text attempts to summarize some of the basic tools.
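As a concrete instance of the deviations these lectures study, here is a minimal Monte Carlo check (an illustration of ours, not taken from the text) comparing the empirical deviation probability of a Bernoulli sample mean against the classical Hoeffding bound 2·exp(−2nt²):

```python
import numpy as np

def hoeffding_bound(n, t):
    """Hoeffding: for X_1..X_n i.i.d. in [0, 1] with mean m,
    P(|mean(X) - m| >= t) <= 2 * exp(-2 * n * t^2)."""
    return 2.0 * np.exp(-2.0 * n * t ** 2)

def empirical_deviation(n, t, trials=2000, seed=0):
    """Monte Carlo estimate of P(|S_n/n - 1/2| >= t) for fair coin flips."""
    rng = np.random.default_rng(seed)
    means = rng.integers(0, 2, size=(trials, n)).mean(axis=1)
    return float(np.mean(np.abs(means - 0.5) >= t))

if __name__ == "__main__":
    n, t = 100, 0.1
    # The empirical frequency sits well below the (loose) Hoeffding bound.
    print(empirical_deviation(n, t), "<=", hoeffding_bound(n, t))
```

The bound is deliberately distribution-free, which is why the observed frequency is far smaller than it; sharper inequalities of the kind surveyed in this text close part of that gap.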
Concentration and Deviation Inequalities in Infinite Dimensions via Covariance Representations
, 2002
"... Concentration and deviation inequalities are obtained for functionals on Wiener space, Poisson space or more generally for normal martingales and binomial processes. The method used here is based on covariance identities obtained via the chaotic representation property, and provides an alternative ..."
Abstract

Cited by 23 (11 self)
Concentration and deviation inequalities are obtained for functionals on Wiener space, Poisson space or, more generally, for normal martingales and binomial processes. The method used here is based on covariance identities obtained via the chaotic representation property, and provides an alternative to the use of logarithmic Sobolev inequalities. It allows one to recover known concentration and deviation inequalities on the Wiener and Poisson space (including the ones given by sharp logarithmic Sobolev inequalities), and extends results available in the discrete case, i.e. on the infinite cube {−1, 1}∞.
Differential equations driven by Gaussian signals
, 2007
"... We consider multidimensional Gaussian processes and give a new condition on the covariance, simple and sharp, for the existence of Lévy area(s). Gaussian rough paths are constructed with a variety of weak and strong approximation results. Together with a new RKHS embedding, we obtain a powerful ye ..."
Abstract

Cited by 20 (7 self)
We consider multidimensional Gaussian processes and give a new condition on the covariance, simple and sharp, for the existence of Lévy area(s). Gaussian rough paths are constructed with a variety of weak and strong approximation results. Together with a new RKHS embedding, we obtain a powerful yet conceptually simple framework in which to analyse differential equations driven by Gaussian signals in the rough paths sense.
A Sharp Concentration Inequality With Applications
, 1999
"... We derive a new general concentrationofmeasure inequality. The concentration inequality applies, among others, to configuration functions as defined by Talagrand and also to combinatorial entropies such as the logarithm of the number of increasing subsequences in a random permutation and to Va ..."
Abstract

Cited by 13 (1 self)
We derive a new general concentration-of-measure inequality. The concentration inequality applies, among others, to configuration functions as defined by Talagrand and also to combinatorial entropies such as the logarithm of the number of increasing subsequences in a random permutation and to Vapnik-Chervonenkis (VC) entropies. The results find direct applications in statistical learning theory, substantiating the possibility of using the empirical VC entropy in penalization techniques.
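The combinatorial entropy mentioned in this abstract, the logarithm of the number of increasing subsequences of a permutation, can be computed exactly for small inputs by dynamic programming. A short sketch (function names are ours, not from the paper):

```python
import math

def count_increasing_subsequences(perm):
    """Number of non-empty increasing subsequences of `perm`:
    ending[i] counts those whose last element is perm[i]."""
    ending = [0] * len(perm)
    for i, v in enumerate(perm):
        ending[i] = 1 + sum(ending[j] for j in range(i) if perm[j] < v)
    return sum(ending)

def combinatorial_entropy(perm):
    """The log-count, i.e. the quantity the concentration result applies to."""
    return math.log(count_increasing_subsequences(perm))
```

For example, the identity permutation [1, 2, 3] has 2³ − 1 = 7 non-empty increasing subsequences, while the reversed permutation [3, 2, 1] has only the 3 singletons.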
Instance optimal decoding by thresholding in compressed sensing
"... Compressed Sensing seeks to capture a discrete signal x ∈ IR N with a small number n of linear measurements. The information captured about x from such measurements is given by the vector y = Φx ∈ IR n where Φ is an n × N matrix. The best matrices, from the viewpoint of capturing sparse or compressi ..."
Abstract

Cited by 12 (1 self)
Compressed Sensing seeks to capture a discrete signal x ∈ R^N with a small number n of linear measurements. The information captured about x from such measurements is given by the vector y = Φx ∈ R^n, where Φ is an n × N matrix. The best matrices, from the viewpoint of capturing sparse or compressible signals, are generated by random processes, e.g. their entries are given by i.i.d. Bernoulli or Gaussian random variables. The information y holds about x is extracted by a decoder ∆ mapping R^n into R^N. Typical decoders are based on ℓ1-minimization and greedy pursuit. The present paper studies the performance of decoders based on thresholding. For quite general random families of matrices Φ, decoders ∆ are constructed which are instance-optimal in probability, by which we mean the following. If x is any vector in R^N, then with high probability applying ∆ to y = Φx gives a vector x̄ := ∆(y) such that ‖x − x̄‖ ≤ C0 σ_k(x)_{ℓ2} for all k ≤ an/log N, provided a is sufficiently small (depending on the probability of failure). Here σ_k(x)_{ℓ2} is the error that results when x is approximated by the k-sparse vector which equals x in its k largest coordinates and is otherwise zero. It is also shown that results of this type continue to hold even if the measurement vector y is corrupted by additive noise: y = Φx + e, where e is some noise vector. In this case σ_k(x)_{ℓ2} is replaced by σ_k(x)_{ℓ2} + ‖e‖_{ℓ2}.
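The quantities in this abstract can be made concrete with a toy one-step thresholding decoder: correlate the columns of Φ with y, keep the k strongest, and solve least squares on that support. This is a simplification sketched under our own assumptions, not the paper's actual decoders:

```python
import numpy as np

def sigma_k(x, k):
    """Best k-term approximation error in l2: norm of x minus the
    k-sparse vector keeping its k largest-magnitude coordinates."""
    idx = np.argsort(np.abs(x))[::-1][:k]
    xk = np.zeros_like(x)
    xk[idx] = x[idx]
    return float(np.linalg.norm(x - xk))

def threshold_decode(Phi, y, k):
    """Illustrative thresholding decoder: pick the k columns of Phi most
    correlated with y, then least-squares on that support."""
    support = np.argsort(np.abs(Phi.T @ y))[::-1][:k]
    xhat = np.zeros(Phi.shape[1])
    xhat[support] = np.linalg.lstsq(Phi[:, support], y, rcond=None)[0]
    return xhat
```

As a sanity check, with Φ the identity matrix the decoder returns exactly the best k-term approximation of x, so ‖x − x̂‖ equals σ_k(x)_{ℓ2}; for random Gaussian Φ one can only hope for the instance-optimality-in-probability bound described above.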
A Gaussian Correlation Inequality And Its Applications To Small Ball Probabilities
, 1999
"... : We present a Gaussian correlation inequality which is closely related to a result of Schechtman, Schlumprecht and Zinn (1998) on the wellknown Gaussian correlation conjecture. The usefulness of the inequality is demonstrated by several important applications to the estimates of small ball probabi ..."
Abstract

Cited by 11 (6 self)
We present a Gaussian correlation inequality which is closely related to a result of Schechtman, Schlumprecht and Zinn (1998) on the well-known Gaussian correlation conjecture. The usefulness of the inequality is demonstrated by several important applications to the estimates of small ball probability. 1 Introduction. The well-known Gaussian correlation conjecture states that for any two symmetric convex sets A and B in a separable Banach space E and for any centered Gaussian measure µ on E, µ(A ∩ B) ≥ µ(A)µ(B). (1.1) Various equivalent formulations, early history and recent progress on the conjecture can be found in Schechtman, Schlumprecht and Zinn (1998). A special case of the conjecture, when one of the symmetric convex sets is a slab of the form {x ∈ E : |f∗(x)| ≤ 1} for some linear functional f∗ in the dual of E, was proved by Khatri (1967) and Sidak (1968) independently. The Khatri-Sidak result has many applications in probability and statistics (see Tong (1980)). Re...