Results 1-10 of 133
The Brunn-Minkowski inequality
Bull. Amer. Math. Soc. (N.S.), 2002
Cited by 178 (9 self)
In 1978, Osserman [124] wrote an extensive survey on the isoperimetric inequality. The Brunn-Minkowski inequality can be proved in a page, yet quickly yields the classical isoperimetric inequality for important classes of subsets of R^n, and deserves to be better known. This guide explains the relationship between the Brunn-Minkowski inequality and other inequalities in geometry and analysis, and some applications.
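A minimal numeric illustration (mine, not from the survey): for axis-aligned boxes the Minkowski sum A + B is again a box with side lengths a_i + b_i, so the inequality vol(A+B)^(1/n) >= vol(A)^(1/n) + vol(B)^(1/n) can be checked directly.

```python
def box_volume(sides):
    """Volume of an axis-aligned box given its side lengths."""
    v = 1.0
    for s in sides:
        v *= s
    return v

def brunn_minkowski_gap(a_sides, b_sides):
    """For boxes A, B with these side lengths, A + B is the box with
    side lengths a_i + b_i.  Return
        vol(A+B)^(1/n) - (vol(A)^(1/n) + vol(B)^(1/n)),
    which Brunn-Minkowski says is >= 0, with equality for homothetic bodies."""
    n = len(a_sides)
    sum_sides = [a + b for a, b in zip(a_sides, b_sides)]
    lhs = box_volume(sum_sides) ** (1.0 / n)
    rhs = box_volume(a_sides) ** (1.0 / n) + box_volume(b_sides) ** (1.0 / n)
    return lhs - rhs

print(brunn_minkowski_gap([1, 4], [4, 1]))  # strictly positive: boxes differ in shape
print(brunn_minkowski_gap([2, 2], [3, 3]))  # homothetic boxes: equality, gap is 0
```

For boxes the inequality reduces to an AM-GM-type fact about products of sums, which is why this special case is a one-line sanity check.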
Solving convex programs by random walks
Journal of the ACM, 2002
Cited by 72 (11 self)
Minimizing a convex function over a convex set in n-dimensional space is a basic, general problem with many interesting special cases. Here, we present a simple new algorithm for convex optimization based on sampling by a random walk. It extends naturally to minimizing quasiconvex functions and to other generalizations.
Hit-and-Run Mixes Fast
Math. Prog., 1998
Cited by 68 (7 self)
It is shown that the "hit-and-run" algorithm for sampling from a convex body K (introduced by R.L. Smith) mixes in time O*(n^2 R^2 / r^2), where R and r are the radii of the circumscribed and inscribed balls of K. Thus after appropriate preprocessing, hit-and-run produces an approximately uniformly distributed sample point in time O*(n^3), which matches the best known bound for other sampling algorithms. We show that the bound is best possible in terms of R, r and n.
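A sketch of one hit-and-run step, specialized to the Euclidean ball so the chord through the current point has a closed form (for a general convex body one would locate the chord endpoints with a membership-oracle line search). The function names are mine, not from the paper.

```python
import math
import random

def hit_and_run_step_ball(x, radius, rng):
    """One hit-and-run step inside the ball |x| <= radius:
    pick a uniform random direction u, intersect the line {x + t*u}
    with the ball, and move to a uniform point on that chord.
    |x + t*u|^2 = radius^2 is the quadratic t^2 + 2*b*t + c = 0 with
    b = <x, u> and c = |x|^2 - radius^2, so t = -b +/- sqrt(b^2 - c)."""
    n = len(x)
    # Uniform random direction: normalize a standard Gaussian vector.
    u = [rng.gauss(0.0, 1.0) for _ in range(n)]
    norm = math.sqrt(sum(ui * ui for ui in u))
    u = [ui / norm for ui in u]
    b = sum(xi * ui for xi, ui in zip(x, u))          # <x, u>
    c = sum(xi * xi for xi in x) - radius * radius    # |x|^2 - R^2, <= 0 inside
    disc = math.sqrt(b * b - c)
    t = rng.uniform(-b - disc, -b + disc)             # uniform point on the chord
    return [xi + t * ui for xi, ui in zip(x, u)]

rng = random.Random(0)
x = [0.0] * 5                      # start at the center of the unit ball in R^5
for _ in range(1000):
    x = hit_and_run_step_ball(x, 1.0, rng)
print(math.sqrt(sum(xi * xi for xi in x)))  # the walk never leaves the ball
```

Each step is cheap; the content of the paper is the bound on how many such steps are needed before the walk's distribution is close to uniform.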
Hit-and-Run from a Corner
STOC'04, 2004
Cited by 65 (8 self)
We show that the hit-and-run random walk mixes rapidly starting from any interior point of a convex body. This is the first random walk known to have this property. In contrast, the ball walk can take exponentially many steps from some starting points.
Concentration of mass on convex bodies
Geom. Funct. Anal., 2006
Cited by 61 (4 self)
We establish a sharp concentration of mass inequality for isotropic convex bodies: there exists an absolute constant c > 0 such that if K is an isotropic convex body in R^n, then a sharp tail bound holds for every t ≥ 1, where L_K denotes the isotropic constant.
Faster Mixing via Average Conductance
1999
Cited by 57 (3 self)
The notion of conductance introduced by Jerrum and Sinclair [JS] has been widely used to prove rapid mixing of Markov chains. Here we introduce a variant: instead of measuring the conductance of the worst subset of states, we show that it is enough to bound a certain weighted average conductance (where the average is taken over subsets of states of different sizes). In the case of convex bodies, we show that this average conductance is better than the known bounds for the worst case; this saves a factor of O(n) which is incurred in all previous proofs as a "penalty" for a "bad start" (i.e., because the starting distribution may be arbitrary).
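A toy illustration (my own example, not from the paper) of the quantity being bounded: for a chain with transition matrix P and stationary distribution pi, the conductance of a set S is the stationary flow out of S divided by min(pi(S), pi(S^c)), and the classical Jerrum-Sinclair bound uses the worst S.

```python
from itertools import combinations

def conductance(P, pi, S):
    """Phi(S) = sum_{i in S, j not in S} pi[i]*P[i][j] / min(pi(S), pi(S^c))."""
    n = len(P)
    Sc = [j for j in range(n) if j not in S]
    flow = sum(pi[i] * P[i][j] for i in S for j in Sc)
    pS = sum(pi[i] for i in S)
    return flow / min(pS, 1.0 - pS)

def worst_conductance(P, pi):
    """Minimize Phi(S) over all nonempty proper subsets.
    Exhaustive search -- exponential in n, toy sizes only."""
    n = len(P)
    best = float("inf")
    for k in range(1, n):
        for S in combinations(range(n), k):
            best = min(best, conductance(P, pi, list(S)))
    return best

# Lazy random walk on a 4-cycle; the stationary distribution is uniform.
P = [[0.5 if i == j else 0.25 if abs(i - j) in (1, 3) else 0.0
      for j in range(4)] for i in range(4)]
pi = [0.25] * 4
print(worst_conductance(P, pi))  # worst cut: two adjacent vertices, Phi = 0.25
```

The paper's point is that sets of different sizes can be charged differently: bounding a weighted average of Phi over set sizes, rather than the single minimum computed here, yields faster mixing bounds.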
A Spectral Algorithm for Learning Mixtures of Distributions
Journal of Computer and System Sciences, 2002
Cited by 57 (4 self)
We show that a simple spectral algorithm for learning a mixture of k spherical Gaussians in R^n works remarkably well: it succeeds in identifying the Gaussians assuming essentially the minimum possible separation between their centers that keeps them unique (solving an open problem of [1]). The sample complexity and running time are polynomial in both n and k. The algorithm also works for the more general problem of learning a mixture of "weakly isotropic" distributions (e.g., a mixture of uniform distributions on cubes).
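A minimal sketch of the spectral idea for k = 2 well-separated spherical Gaussians: project the data onto its top principal direction and cluster in the projection. This is my simplification for illustration, not the paper's algorithm (which handles general k and much smaller separations).

```python
import numpy as np

rng = np.random.default_rng(0)
n, dim, sep = 400, 20, 10.0

# Mixture of two spherical (identity-covariance) Gaussians with
# well-separated centers at +sep*e1 and -sep*e1.
center = np.zeros(dim)
center[0] = sep
labels = rng.integers(0, 2, size=n)
X = rng.standard_normal((n, dim)) + np.where(labels[:, None] == 1, center, -center)

# Spectral step: project onto the top singular vector of the centered data.
# The between-center direction dominates the variance, so it is recovered.
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
proj = Xc @ Vt[0]

# Cluster by the sign of the 1-D projection (adequate at this separation).
pred = (proj > 0).astype(int)
acc = max(np.mean(pred == labels), np.mean(pred != labels))  # labels defined up to swap
print(acc)
```

At this separation the projection splits the two components cleanly; the technical content of the paper is showing that the spectral projection still works when the separation is near the information-theoretic minimum.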
An Improved Worst-Case to Average-Case Connection for Lattice Problems (extended abstract)
In FOCS, 1997
Cited by 55 (10 self)
We improve a connection between the worst-case complexity and the average-case complexity of some well-known lattice problems. This fascinating connection was first discovered by Ajtai [1] in 1996. We improve the exponent of this connection from 8 to 3.5 + ε.
On the role of convexity in isoperimetry, spectral gap and concentration
Invent. Math.
Cited by 45 (12 self)
We show that for convex domains in Euclidean space, Cheeger's isoperimetric inequality, the spectral gap of the Neumann Laplacian, exponential concentration of Lipschitz functions, and the a priori weakest requirement that Lipschitz functions have arbitrarily slow uniform tail decay are all quantitatively equivalent (to within universal constants, independent of the dimension). This substantially extends previous results of Maz'ya, Cheeger, Gromov-Milman, Buser and Ledoux. As an application, we conclude a sharp quantitative stability result for the spectral gap of convex domains under convex perturbations which preserve volume (up to constants) and under maps which are "on-average" Lipschitz. We also provide a new characterization (up to constants) of the spectral gap of a convex domain, as one over the square of the average distance from the "worst" subset having half the measure of the domain. In addition, we easily recover and extend many previously known lower bounds on the spectral gap of convex domains, due to Payne-Weinberger, Li-Yau, Kannan-Lovász-Simonovits, Bobkov and Sodin. The proof involves estimates on the diffusion semigroup following Bakry-Ledoux and a result from Riemannian geometry on the concavity of the isoperimetric profile. Our results extend to the more general setting of Riemannian manifolds with density which satisfy the CD(0, ∞) curvature-dimension condition of Bakry-Émery.