Random Walks and an O*(n^5) Volume Algorithm for Convex Bodies
, 1996
"... Given a high dimensional convex body K ` IR n by a separation oracle, we can approximate its volume with relative error ", using O (n 5 ) oracle calls. Our algorithm also brings the body into isotropic position. As all previous randomized volume algorithms, we use "rounding" followed by a mul ..."
Abstract

Cited by 75 (8 self)
 Add to MetaCart
Given a high-dimensional convex body K ⊆ R^n by a separation oracle, we can approximate its volume with relative error ε using O*(n^5) oracle calls. Our algorithm also brings the body into isotropic position. Like all previous randomized volume algorithms, we use "rounding" followed by a multiphase Monte Carlo (product estimator) technique. Both parts rely on sampling (generating random points in K), which is done by random walk. Our algorithm introduces three new ideas: (i) the use of the isotropic position (or at least an approximation of it) for rounding, (ii) the separation of global obstructions (diameter) and local obstructions (boundary problems) for fast mixing, and (iii) a stepwise interlacing of rounding and sampling. 1. Introduction. For a variety of geometric objects, classical results characterize various geometric parameters. Many of these results are useful even in practical situations: they can easily be transformed into efficient algorithms. Some other theorem...
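As a rough illustration of the multiphase Monte Carlo (product estimator) technique the abstract mentions, here is a minimal two-dimensional sketch. It assumes a membership oracle for K rather than the paper's separation oracle, and uses naive rejection sampling where the real algorithm uses a random walk; all function names are illustrative, not from the paper:

```python
import math
import random

def sample_K_cap_ball(in_K, r, rng):
    """Uniform point in K ∩ B(0, r) by rejection from the bounding square
    (a stand-in for the random-walk sampler of the actual algorithm)."""
    while True:
        x, y = rng.uniform(-r, r), rng.uniform(-r, r)
        if x * x + y * y <= r * r and in_K((x, y)):
            return (x, y)

def product_estimator(in_K, r0, m, samples, rng):
    """Estimate vol(K) as vol(B_0) / prod_i [vol(K∩B_{i-1}) / vol(K∩B_i)],
    with radii r_i = r0 * 2^(i/2) (dimension n = 2).  Assumes B_0 ⊆ K ⊆ B_m."""
    radii = [r0 * 2 ** (i / 2.0) for i in range(m + 1)]
    est = math.pi * r0 * r0                      # vol(K ∩ B_0) = vol(B_0)
    for i in range(1, m + 1):
        hits = 0
        for _ in range(samples):
            x, y = sample_K_cap_ball(in_K, radii[i], rng)
            if x * x + y * y <= radii[i - 1] ** 2:
                hits += 1
        est /= hits / samples                    # one phase of the product
    return est

# Example: K = [-1, 1]^2, true volume 4; one phase suffices since
# B_1 has radius sqrt(2) and already contains K.
rng = random.Random(1)
square = lambda p: max(abs(p[0]), abs(p[1])) <= 1.0
vol = product_estimator(square, 1.0, 1, 4000, rng)
```

Each phase estimates one ratio of consecutive intersection volumes, so the variance of the final product stays controlled even though vol(K)/vol(B_0) itself can be exponentially large.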
The Brunn-Minkowski inequality
 Bull. Amer. Math. Soc. (N.S.)
, 2002
"... Abstract. In 1978, Osserman [124] wrote an extensive survey on the isoperimetric inequality. The BrunnMinkowski inequality can be proved in a page, yet quickly yields the classical isoperimetric inequality for important classes of subsets of R n, and deserves to be better known. This guide explains ..."
Abstract

Cited by 74 (5 self)
 Add to MetaCart
Abstract. In 1978, Osserman [124] wrote an extensive survey on the isoperimetric inequality. The Brunn-Minkowski inequality can be proved in a page, yet quickly yields the classical isoperimetric inequality for important classes of subsets of R^n, and deserves to be better known. This guide explains the relationship between the Brunn-Minkowski inequality and other inequalities in geometry and analysis, and some applications.
An Improved Worst-Case to Average-Case Connection for Lattice Problems (extended abstract)
 In FOCS
, 1997
"... We improve a connection of the worstcase complexity and the averagecase complexity of some wellknown lattice problems. This fascinating connection was first discovered by Ajtai [1] in 1996. We improve the exponent of this connection from 8 to 3:5 + ffl. Department of Computer Science, State Unive ..."
Abstract

Cited by 54 (10 self)
 Add to MetaCart
We improve a connection between the worst-case complexity and the average-case complexity of some well-known lattice problems. This fascinating connection was first discovered by Ajtai [1] in 1996. We improve the exponent of this connection from 8 to 3.5 + ε. Department of Computer Science, State University of New York at Buffalo, Buffalo, NY 14260. Research supported in part by NSF grants CCR-9319393 and CCR-9634665, and an Alfred P. Sloan Fellowship. Email: cai@cs.buffalo.edu. † Department of Computer Science, State University of New York at Buffalo, Buffalo, NY 14260. Research supported in part by NSF grants CCR-9319393 and CCR-9634665. Email: apn@cs.buffalo.edu. 1 Introduction. A lattice L is a discrete additive subgroup of R^n. There are many fascinating problems concerning lattices, both from a structural and from an algorithmic point of view [12, 20, 11, 13]. The study of lattice problems can be traced back to Gauss, Dirichlet and Hermite, among others [8, 6, 14]. The subje...
Solving convex programs by random walks
 Journal of the ACM
, 2002
"... Minimizing a convex function over a convex set in ndimensional space is a basic, general problem with many interesting special cases. Here, we present a simple new algorithm for convex optimization based on sampling by a random walk. It extends naturally to minimizing quasiconvex functions and to ..."
Abstract

Cited by 50 (12 self)
 Add to MetaCart
Minimizing a convex function over a convex set in n-dimensional space is a basic, general problem with many interesting special cases. Here, we present a simple new algorithm for convex optimization based on sampling by a random walk. It extends naturally to minimizing quasiconvex functions and to other generalizations.
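The sampling-based approach can be caricatured in a few lines: repeatedly draw random points from the current feasible set, use their average as a surrogate centroid, and cut away the halfspace in which the objective cannot improve. The sketch below substitutes naive rejection sampling for the paper's random walk, and every name in it is illustrative rather than from the paper:

```python
import random

def minimize_by_sampling(f, grad, in_set, sample, rounds, rng):
    """Cutting-plane sketch: approximate the centroid of the feasible set
    by averaging random samples, then keep only the halfspace on which the
    objective can still improve."""
    cuts = []                       # list of (point, gradient) halfspace cuts
    def feasible(x):
        if not in_set(x):
            return False
        return all(sum((xi - ci) * gi for xi, ci, gi in zip(x, c, g)) <= 0
                   for c, g in cuts)
    best = None
    for _ in range(rounds):
        pts = [sample(feasible, rng) for _ in range(32)]
        dim = len(pts[0])
        z = [sum(p[i] for p in pts) / len(pts) for i in range(dim)]
        if best is None or f(z) < f(best):
            best = z
        cuts.append((z, grad(z)))   # discard {x : (x - z) . grad(z) > 0}
    return best

def box_sample(feasible, rng):
    """Rejection sampler over [-1, 1]^2 (stand-in for a random-walk sampler)."""
    while True:
        x = [rng.uniform(-1.0, 1.0) for _ in range(2)]
        if feasible(x):
            return x

# Minimize f(x) = |x|^2 over the box [-1, 1]^2; the minimum is at the origin.
rng = random.Random(0)
f = lambda x: x[0] ** 2 + x[1] ** 2
grad = lambda x: [2 * x[0], 2 * x[1]]
box = lambda x: max(abs(x[0]), abs(x[1])) <= 1.0
best = minimize_by_sampling(f, grad, box, box_sample, 5, rng)
```

Since each cut passes through an approximate centroid, a constant fraction of the feasible volume is removed per round, which is the engine behind cutting-plane convergence.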
Faster Mixing via Average Conductance
"... The notion of conductance introduced by Jerrum and Sinclair [JS] has been widely used to prove rapid mixing of Markov chains. Here we introduce a variant of this  instead of measuring the conductance of the worst subset of states, we show that it is enough to bound a certain weighted average conduc ..."
Abstract

Cited by 44 (3 self)
 Add to MetaCart
The notion of conductance introduced by Jerrum and Sinclair [JS] has been widely used to prove rapid mixing of Markov chains. Here we introduce a variant of this notion: instead of measuring the conductance of the worst subset of states, we show that it is enough to bound a certain weighted average conductance (where the average is taken over subsets of states of different sizes). In the case of convex bodies, we show that this average conductance is better than the known bounds for the worst case; this helps us save a factor of O(n) which is incurred in all proofs as a "penalty" for a "bad start" (i.e., because the starting distribution may be arbitrary).
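For intuition, the worst-case conductance that this paper replaces with a weighted average can be computed by brute force for a small chain (a sketch, not tied to the paper's notation):

```python
import itertools

def conductance(P, pi):
    """Worst-case conductance: min over proper subsets S of
    Q(S, S^c) / min(pi(S), pi(S^c)), where Q(S, S^c) is the
    stationary probability flow leaving S in one step."""
    n = len(pi)
    phi = float("inf")
    for r in range(1, n):
        for S in itertools.combinations(range(n), r):
            s = set(S)
            p_S = sum(pi[i] for i in s)
            flow = sum(pi[i] * P[i][j]
                       for i in s for j in range(n) if j not in s)
            phi = min(phi, flow / min(p_S, 1.0 - p_S))
    return phi

# Simple random walk on a 4-cycle (uniform stationary distribution);
# the worst cut is a pair of adjacent states, giving conductance 1/2.
P = [[0.0, 0.5, 0.0, 0.5],
     [0.5, 0.0, 0.5, 0.0],
     [0.0, 0.5, 0.0, 0.5],
     [0.5, 0.0, 0.5, 0.0]]
pi = [0.25] * 4
phi = conductance(P, pi)
```

The paper's point is that bounding this single minimum is often pessimistic: small sets near a "bad start" can have poor conductance even when most cuts are good, and averaging over set sizes recovers the lost factor of O(n).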
A Spectral Algorithm for Learning Mixtures of Distributions
 Journal of Computer and System Sciences
, 2002
"... We show that a simple spectral algorithm for learning a mixture of k spherical Gaussians in R works remarkably well  it succeeds in identifying the Gaussians assuming essentially the minimum possible separation between their centers that keeps them unique (solving an open problem of [1]). The ..."
Abstract

Cited by 43 (5 self)
 Add to MetaCart
We show that a simple spectral algorithm for learning a mixture of k spherical Gaussians in R^n works remarkably well: it succeeds in identifying the Gaussians assuming essentially the minimum possible separation between their centers that keeps them unique (solving an open problem of [1]). The sample complexity and running time are polynomial in both n and k. The algorithm also works for the more general problem of learning a mixture of "weakly isotropic" distributions (e.g., a mixture of uniform distributions on cubes).
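For k = 2, the spectral step amounts to projecting the centered samples onto the top principal direction and splitting at the projected mean. A plain-Python 2-D sketch under those assumptions, with power iteration standing in for the SVD and all names illustrative:

```python
import math
import random

def top_direction(X, iters=200):
    """Top eigenvector of the 2x2 sample covariance via power iteration
    (a stand-in for the top singular vector of the sample matrix)."""
    n = len(X)
    mx = sum(x for x, _ in X) / n
    my = sum(y for _, y in X) / n
    c = [(x - mx, y - my) for x, y in X]
    sxx = sum(a * a for a, _ in c) / n
    sxy = sum(a * b for a, b in c) / n
    syy = sum(b * b for _, b in c) / n
    v = (1.0 / math.sqrt(2.0), 1.0 / math.sqrt(2.0))
    for _ in range(iters):
        w = (sxx * v[0] + sxy * v[1], sxy * v[0] + syy * v[1])
        norm = math.hypot(w[0], w[1])
        v = (w[0] / norm, w[1] / norm)
    return v, (mx, my)

def spectral_split(X):
    """Project onto the top direction and split at the projected mean."""
    v, (mx, my) = top_direction(X)
    return [((x - mx) * v[0] + (y - my) * v[1]) > 0 for x, y in X]

# Two well-separated spherical Gaussians centered at (-3, 0) and (3, 0).
rng = random.Random(0)
X = ([(rng.gauss(-3, 1), rng.gauss(0, 1)) for _ in range(100)] +
     [(rng.gauss(3, 1), rng.gauss(0, 1)) for _ in range(100)])
labels = spectral_split(X)
```

The projection is what buys the separation guarantee: the top singular subspace nearly contains the span of the centers, so projecting suppresses noise in the remaining n - k directions without shrinking the distance between centers.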
Hit-and-Run from a Corner
"... We show that the hitandrun random walk mixes rapidly starting from any interior point of a convex body. This is the first random walk known to have this property. In contrast, the ball walk can take exponentially many steps from some starting points. ..."
Abstract

Cited by 40 (8 self)
 Add to MetaCart
We show that the hit-and-run random walk mixes rapidly starting from any interior point of a convex body. This is the first random walk known to have this property. In contrast, the ball walk can take exponentially many steps from some starting points.
Hit-and-Run Mixes Fast
 Math. Prog.
, 1998
"... It is shown that the "hitandrun" algorithm for sampling from a convex body K (introduced by R.L. Smith) mixes in time O # (n 2 R 2 /r 2 ), where R and r are the radii of the inscribed and circumscribed balls of K. Thus after appropriate preprocessing, hitandrun produces an approximately un ..."
Abstract

Cited by 36 (7 self)
 Add to MetaCart
It is shown that the "hit-and-run" algorithm for sampling from a convex body K (introduced by R. L. Smith) mixes in time O*(n^2 R^2 / r^2), where r and R are the radii of the inscribed and circumscribed balls of K, respectively. Thus after appropriate preprocessing, hit-and-run produces an approximately uniformly distributed sample point in time O*(n^3), which matches the best known bound for other sampling algorithms. We show that the bound is best possible in terms of R, r and n. 1 Introduction. There are many computational tasks that require sampling from a convex body K in a high-dimensional space R^n (i.e., generating an approximately uniformly distributed random point in K). The generic method to do so is to define an ergodic random walk on the points of K whose stationary distribution is uniform, and to follow this random walk for an appropriately large number of steps; the point obtained this way will be approximately stationary, i.e., approximately uniform. The crucial issue is t...
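One step of hit-and-run is easy to sketch from a membership oracle alone: pick a uniformly random direction, find the chord of the body through the current point along that direction (here by doubling and bisection on the oracle), and move to a uniformly random point of the chord. A minimal version, with illustrative names:

```python
import math
import random

def hit_and_run(in_body, x, steps, rng):
    """Hit-and-run walk driven only by a membership oracle `in_body`."""
    n = len(x)
    for _ in range(steps):
        # Uniform random direction on the unit sphere.
        d = [rng.gauss(0.0, 1.0) for _ in range(n)]
        norm = math.sqrt(sum(di * di for di in d))
        d = [di / norm for di in d]
        def extent(sign):
            # Largest t with x + sign*t*d inside, by doubling then bisection.
            lo, hi = 0.0, 1.0
            while in_body([xi + sign * hi * di for xi, di in zip(x, d)]):
                lo, hi = hi, 2.0 * hi
            for _ in range(40):
                mid = 0.5 * (lo + hi)
                if in_body([xi + sign * mid * di for xi, di in zip(x, d)]):
                    lo = mid
                else:
                    hi = mid
            return lo
        # Move to a uniformly random point on the chord through x.
        t = rng.uniform(-extent(-1.0), extent(+1.0))
        x = [xi + t * di for xi, di in zip(x, d)]
    return x

# Sample from the unit ball in R^3 starting near the boundary.
rng = random.Random(0)
ball = lambda p: sum(v * v for v in p) <= 1.0
x = hit_and_run(ball, [0.9, 0.0, 0.0], 200, rng)
```

Because the step moves along a full chord rather than within a small ball, hit-and-run can escape the neighborhood of a corner quickly, which is the intuition behind the mixing bounds in these two papers.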
A spectral algorithm for learning mixture models
 J. Comput. Syst. Sci.
, 2004
"... Abstract We show that a simple spectral algorithm for learning a mixture of k spherical Gaussians in R n works remarkably well it succeeds in identifying the Gaussians assuming essentially the minimum possible separation between their centers that keeps them unique (solving an open problem of [1]) ..."
Abstract

Cited by 24 (5 self)
 Add to MetaCart
Abstract. We show that a simple spectral algorithm for learning a mixture of k spherical Gaussians in R^n works remarkably well: it succeeds in identifying the Gaussians assuming essentially the minimum possible separation between their centers that keeps them unique (solving an open problem of [1]). The sample complexity and running time are polynomial in both n and k. The algorithm can be applied to the more general problem of learning a mixture of "weakly isotropic" distributions (e.g., a mixture of uniform distributions on cubes). 1 Introduction. Learning a mixture of distributions is a classical problem in statistics and learning theory (see [10, 14]); more recently, it has also been proposed as a model for clustering. In the basic version of the problem we are given random samples from a mixture of k distributions, F_1, ..., F_k. Each sample is drawn independently with probability w_i from the i'th distribution. The numbers w...