Results 1–10 of 72
On the conditioning of random subdictionaries
 Appl. Comput. Harmon. Anal.
Cited by 100 (8 self)
An important problem in the theory of sparse approximation is to identify well-conditioned subsets of vectors from a general dictionary. In most cases, current results do not apply unless the number of vectors is smaller than the square root of the ambient dimension, so these bounds are too weak for many applications. This paper shatters the square-root bottleneck by focusing on random subdictionaries instead of arbitrary subdictionaries. It provides explicit bounds on the extreme singular values of random subdictionaries that hold with overwhelming probability. The results are phrased in terms of the coherence and spectral norm of the dictionary, which capture information about its global geometry. The proofs rely on standard tools from the area of Banach space probability. As an application, the paper shows that the conditioning of a subdictionary is the major obstacle to the uniqueness of sparse representations and the success of ℓ1 minimization techniques for signal recovery. Indeed, if a fixed subdictionary is well conditioned and its cardinality is slightly smaller than the ambient dimension, then a random signal formed from this subdictionary almost surely has no other representation that is equally sparse. Moreover, with overwhelming probability, the maximally sparse representation can be identified via ℓ1 minimization. Note that the results in this paper are not directly comparable with recent work on subdictionaries of random dictionaries.
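As a toy illustration of the quantities the abstract mentions (not the paper's construction or bounds), the sketch below draws a random Gaussian dictionary with unit-norm columns, picks a random subdictionary, and reports the dictionary's coherence together with the subdictionary's extreme singular values; all sizes and the seed are arbitrary choices:

```python
# Toy sketch: coherence of a random dictionary and the extreme singular
# values of a random subdictionary. Sizes and seed are illustrative.
import numpy as np

rng = np.random.default_rng(0)
d, N, k = 64, 256, 16          # ambient dimension, dictionary size, subset size

# Random dictionary with unit-norm columns (the "atoms").
Phi = rng.standard_normal((d, N))
Phi /= np.linalg.norm(Phi, axis=0)

# Coherence: largest off-diagonal inner product of the Gram matrix.
G = Phi.T @ Phi
coherence = np.abs(G - np.eye(N)).max()

# Random subdictionary and its extreme singular values.
idx = rng.choice(N, size=k, replace=False)
s = np.linalg.svd(Phi[:, idx], compute_uv=False)
s_max, s_min = s[0], s[-1]

print(f"coherence={coherence:.3f}  s_max={s_max:.3f}  s_min={s_min:.3f}")
```

A well-conditioned subdictionary is one where s_min and s_max are both close to 1; the paper's point is that random index sets idx achieve this far beyond the square-root regime.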
Quantum Computation and Lattice Problems
 Proc. 43rd Symposium on Foundations of Computer Science
, 2002
Cited by 73 (4 self)
We present the first explicit connection between quantum computation and lattice problems. Namely, we show a solution to the unique-SVP under the assumption that there exists...
Solving convex programs by random walks
 Journal of the ACM
, 2002
Cited by 68 (12 self)
Minimizing a convex function over a convex set in n-dimensional space is a basic, general problem with many interesting special cases. Here, we present a simple new algorithm for convex optimization based on sampling by a random walk. It extends naturally to minimizing quasi-convex functions and to other generalizations.
Hit-and-Run from a Corner
Cited by 67 (8 self)
We show that the hit-and-run random walk mixes rapidly starting from any interior point of a convex body. This is the first random walk known to have this property. In contrast, the ball walk can take exponentially many steps from some starting points.
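A minimal sketch of one hit-and-run step, using the unit ball as the convex body so the chord endpoints have a closed form; the body, start point, and step count are our choices for illustration, not from the paper:

```python
# Toy hit-and-run walk inside the unit ball ||x|| <= 1.
import numpy as np

rng = np.random.default_rng(1)

def hit_and_run_step(x, rng):
    """One hit-and-run step: random direction, then a uniform point on the chord."""
    u = rng.standard_normal(x.shape)
    u /= np.linalg.norm(u)
    # Solve ||x + t*u||^2 = 1: t^2 + 2bt + c = 0 with b = x.u, c = x.x - 1.
    b = x @ u
    c = x @ x - 1.0
    disc = np.sqrt(max(b * b - c, 0.0))
    t_minus, t_plus = -b - disc, -b + disc    # chord endpoints, t_minus < 0 < t_plus
    t = rng.uniform(t_minus, t_plus)
    return x + t * u

x = np.zeros(3)                    # start at the center; any interior point works
for _ in range(1000):
    x = hit_and_run_step(x, rng)
print("final point:", x)
```

Every step stays inside the body, and the paper's result is that this chain mixes rapidly even when the start point is near a corner, where the ball walk can stall.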
The geometry of log-concave functions and an O∗(n³) sampling algorithm
Cited by 57 (17 self)
The class of log-concave functions in Rⁿ is a common generalization of Gaussians and of indicator functions of convex sets. Motivated by the problem of sampling from a log-concave density function, we study their geometry and introduce a technique for “smoothing” them out. This leads to an efficient sampling algorithm (by a random walk) with no assumptions on the local smoothness of the density function. After appropriate preprocessing, the algorithm produces a point from approximately the right distribution in time O∗(n⁴), and in amortized time O∗(n³) if many sample points are needed (where the asterisk indicates that dependence on the error parameter and factors of log n are not shown).
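To make the sampling task concrete, here is a generic random-walk Metropolis chain (not the smoothing-based algorithm of the paper) targeting a one-dimensional log-concave density, the Laplace density exp(−|x|)/2; the proposal scale, chain length, and burn-in are arbitrary:

```python
# Toy sketch: random-walk Metropolis sampling of a 1-D log-concave density.
import numpy as np

rng = np.random.default_rng(2)

def log_density(x):
    return -abs(x)                  # log of the Laplace density, up to a constant

x, samples = 0.0, []
for _ in range(20000):
    prop = x + rng.normal(scale=1.0)
    # Accept with probability min(1, f(prop)/f(x)); log-concavity of f is
    # exactly what makes walks like this (and hit-and-run) mix well.
    if np.log(rng.uniform()) < log_density(prop) - log_density(x):
        x = prop
    samples.append(x)

samples = np.array(samples[5000:])  # discard burn-in
print("sample mean:", samples.mean(), " sample var:", samples.var())
```

The Laplace density has mean 0 and variance 2, so the empirical moments should land near those values once the chain has mixed.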
Simulatable auditing
 In PODS
, 2005
Cited by 48 (6 self)
Given a data set consisting of private information about individuals, we consider the online query auditing problem: given a sequence of queries that have already been posed about the data, their corresponding answers – where each answer is either the true answer or “denied” (in the event that revealing the answer compromises privacy) – and a new query, deny the answer if privacy may be breached or give the true answer otherwise. A related problem is the offline auditing problem, where one is given a sequence of queries and all of their true answers, and the goal is to determine if a privacy breach has already occurred. We uncover the fundamental issue that solutions to the offline auditing problem cannot be directly used to solve the online auditing problem, since query denials may leak information. Consequently, we introduce a new model called simulatable auditing, where query denials provably do not leak information. We demonstrate that max queries may be audited in this simulatable paradigm under the classical definition of privacy, where a breach occurs if a sensitive value is fully compromised. We also introduce a probabilistic notion of (partial) compromise. Our privacy definition requires that the a priori probability that a sensitive value lies within some small interval is not that different from the posterior probability (given the query answers). We demonstrate that sum queries can be audited in a simulatable fashion under this privacy definition.
Geometric random walks: a survey
 Combinatorial and Computational Geometry
, 2005
Cited by 42 (5 self)
The developing theory of geometric random walks is outlined here. Three aspects are discussed: general methods for estimating convergence (the “mixing” rate); isoperimetric inequalities in Rⁿ and their intimate connection to random walks; and algorithms for fundamental problems (volume computation and convex optimization) that are based on sampling by random walks.
Efficient algorithms for universal portfolios
 Proceedings of the 41st Annual Symposium on the Foundations of Computer Science
, 2000
Cited by 40 (8 self)
A constant rebalanced portfolio is an investment strategy that keeps the same distribution of wealth among a set of stocks from day to day. There has been much work on Cover's Universal algorithm, which is competitive with the best constant rebalanced portfolio determined in hindsight [3, 9, 2, 8, 16, 4, 5, 6]. While this algorithm has good performance guarantees, all known implementations are exponential in the number of stocks, restricting the number of stocks used in experiments [9, 4, 2, 5, 6]. We present an efficient implementation of the Universal algorithm that is based on non-uniform random walks that are rapidly mixing [1, 14, 7]. This same implementation also works for non-financial applications of the Universal algorithm, such as data compression [6] and language modeling [11].
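A small sketch of the object the Universal algorithm competes against: the wealth of a constant rebalanced portfolio over a sequence of daily price relatives. The data and weights below are made up for illustration and are not from the paper:

```python
# Toy sketch: wealth achieved by a constant rebalanced portfolio (CRP).
import numpy as np

def crp_wealth(weights, price_relatives):
    """Final wealth of a CRP with fixed `weights`, starting from wealth 1.
    `price_relatives` has one row per day and one column per stock; each day,
    wealth is multiplied by the portfolio-weighted price relative."""
    w = np.asarray(weights, dtype=float)
    return float(np.prod(np.asarray(price_relatives) @ w))

# Two stocks that alternate doubling and halving: each alone ends flat,
# but the uniform CRP multiplies wealth by 1.25 every single day.
x = np.array([[2.0, 0.5],
              [0.5, 2.0]] * 3)       # six trading days
print(crp_wealth([0.5, 0.5], x))     # 1.25**6 = 3.814697265625
print(crp_wealth([1.0, 0.0], x))     # holding one stock alone: 1.0
```

The hindsight-optimal CRP maximizes this quantity over the weight simplex; Cover's algorithm tracks it online, and the paper's contribution is computing that average efficiently via rapidly mixing random walks.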
Fast algorithms for logconcave functions: sampling, rounding, integration and optimization
 Proceedings of the 47th Annual IEEE Symposium on Foundations of Computer Science
, 2006
Cited by 38 (12 self)
We prove that the hit-and-run random walk is rapidly mixing for an arbitrary log-concave distribution starting from any point in the support. This extends the work of [26], where this was shown for an important special case, and settles the main conjecture formulated there. From this result, we derive asymptotically faster algorithms in the general oracle model for sampling, rounding, integration and maximization of log-concave functions, improving or generalizing the main results of [24, 25, 1] and [16], respectively. The algorithms for integration and optimization both use sampling and are surprisingly similar.