Results 1 – 10 of 80,573
Concentration for self-bounding functions and an inequality of Talagrand
Random Structures and Algorithms, 2006
Abstract
Cited by 12 (0 self)
We see that the entropy method yields strong concentration results for general self-bounding functions of independent random variables. These give an improvement of a concentration result of Talagrand much used in discrete mathematics.
On concentration of self-bounding functions
, 2009
Abstract
Cited by 17 (0 self)
We prove some new concentration inequalities for self-bounding functions using the entropy method. As an application, we recover Talagrand’s convex distance inequality. The new Bernstein-like inequalities for self-bounding functions are derived thanks to a careful analysis of the so-called Herbst argument.
Generalization of an Inequality by Talagrand, and Links with the Logarithmic Sobolev Inequality
J. Funct. Anal., 2000
Abstract
Cited by 248 (13 self)
We show that transport inequalities, similar to the one derived by Talagrand [30] for the Gaussian measure, are implied by logarithmic Sobolev inequalities. Conversely, Talagrand's inequality implies a logarithmic Sobolev inequality if the density of the measure is approximately log-concave.
Concentration of Measure and Isoperimetric Inequalities in Product Spaces
, 1995
Abstract
Cited by 383 (4 self)
The concentration of measure phenomenon in product spaces roughly states that, if a set A in a product Ω^N of probability spaces has measure at least one half, "most" of the points of Ω^N are "close" to A. We proceed to a systematic exploration of this phenomenon. The meaning of the word "most" is made rigorous by isoperimetric-type inequalities that bound the measure of the exceptional sets. The meaning of the word "close" is defined in three main ways, each of them giving rise to related, but different inequalities. The inequalities are all proved ...
Computing Inequality: Have Computers Changed the Labor Market?
Quarterly Journal of Economics, 1998
Abstract
Cited by 473 (18 self)
This paper examines the effect of skill-biased technological change, as measured by computerization, on the recent widening of U.S. educational wage differentials. An analysis of aggregate changes in the relative supplies and wages of workers by education from 1940 to 1996 indicates strong and persistent growth in relative demand favoring college graduates. Rapid skill upgrading within detailed industries accounts for most of the growth in the relative demand for college workers, particularly since 1970. Analyses of four data sets indicate that the rate of skill upgrading has been greater in more computer-intensive industries.
Nearly Tight Bounds on ℓ1 Approximation of Self-Bounding Functions
Abstract
Cited by 1 (0 self)
We study the complexity of learning and approximation of self-bounding functions over the uniform distribution on the Boolean hypercube {0, 1}^n. Informally, a function f: {0, 1}^n → R is self-bounding if for every x ∈ {0, 1}^n, f(x) upper bounds the sum of all the n marginal decreases in the value of the function at x. Self-bounding functions include such well-known classes of functions as submodular and fractionally-subadditive (XOS) functions. They were introduced by Boucheron et al. in the context of concentration of measure inequalities [BLM00]. Our main result is a nearly tight ℓ1-approximation ...
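The self-bounding definition quoted in this abstract can be checked mechanically for small n. The sketch below is my own illustration (not from the paper, which treats learning and approximation, not verification): it brute-forces the condition that every per-coordinate marginal decrease lies in [0, 1] and that the decreases sum to at most f(x), using count-of-ones as an example function that satisfies it.

```python
# Brute-force check of the self-bounding property on {0,1}^n, following
# the definition in the abstract above: for every x, each coordinate-wise
# marginal decrease is in [0, 1] and the decreases sum to at most f(x).
# The example functions are hypothetical illustrations, not from the paper.
from itertools import product

def is_self_bounding(f, n):
    for x in product((0, 1), repeat=n):
        fx = f(x)
        decreases = []
        for i in range(n):
            flipped = x[:i] + (1 - x[i],) + x[i + 1:]
            # marginal decrease at coordinate i: drop from changing x_i
            d = fx - min(fx, f(flipped))
            if not (0 <= d <= 1):
                return False
            decreases.append(d)
        if sum(decreases) > fx:
            return False
    return True

if __name__ == "__main__":
    ones = lambda x: sum(x)        # count of ones: self-bounding
    scaled = lambda x: 3 * sum(x)  # per-coordinate drops of 3: not self-bounding
    print(is_self_bounding(ones, 4))    # True
    print(is_self_bounding(scaled, 4))  # False
```

Submodular and XOS functions with range increments bounded by 1 pass the same test, which is what connects this class to the concentration results of Boucheron et al. cited above.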
Convex Analysis
, 1970
Abstract
Cited by 5350 (67 self)
In this book we aim to present, in a unified framework, a broad spectrum of mathematical theory that has grown in connection with the study of problems of optimization, equilibrium, control, and stability of linear and nonlinear systems. The title Variational Analysis reflects this breadth. For a long time, ‘variational’ problems have been identified mostly with the ‘calculus of variations’. In that venerable subject, built around the minimization of integral functionals, constraints were relatively simple and much of the focus was on infinite-dimensional function spaces. A major theme ...
Inducing Features of Random Fields
IEEE Transactions on Pattern Analysis and Machine Intelligence, 1997
Abstract
Cited by 664 (14 self)
We present a technique for constructing random fields from a set of training samples. The learning paradigm builds increasingly complex fields by allowing potential functions, or features, that are supported by increasingly large subgraphs. Each feature has a weight that is trained by minimizing the ...
Singularity Detection and Processing with Wavelets
IEEE Transactions on Information Theory, 1992
Abstract
Cited by 590 (13 self)
Most of a signal's information is often found in irregular structures and transient phenomena. We review the mathematical characterization of singularities with Lipschitz exponents. The main theorems that estimate local Lipschitz exponents of functions, from the evolution across scales of their wavelet ...