Results 11 – 20 of 1,649
Convex Relaxations of Bregman Divergence Clustering
"... Although many convex relaxations of clustering have been proposed in the past decade, current formulations remain restricted to spherical Gaussian or discriminative models and are susceptible to imbalanced clusters. To address these shortcomings, we propose a new class of convex relaxations that can ..."
Cited by 1 (1 self)
Comparison of convex relaxations of quadrilinear terms
"... In this paper we compare four different ways to compute a convex linear relaxation of a quadrilinear monomial on a box, analyzing their relative tightness. We computationally compare the quality of the relaxations, and we provide a general theorem on pairwise comparison of relaxation strength, which ..."
Cited by 2 (1 self)
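Relaxations of quadrilinear terms like the ones compared above are typically built by composing convex envelopes of bilinear terms. A minimal numpy sketch (function name my own, not from the paper) of the standard McCormick under- and over-estimators for w = x*y on a box, checked numerically to bound the true product:

```python
import numpy as np

def mccormick_bounds(x, y, xL, xU, yL, yU):
    """McCormick envelope for the bilinear term w = x*y on [xL, xU] x [yL, yU].
    Returns pointwise convex under- and concave over-estimators."""
    under = np.maximum(xL * y + yL * x - xL * yL,
                       xU * y + yU * x - xU * yU)
    over = np.minimum(xU * y + yL * x - xU * yL,
                      xL * y + yU * x - xL * yU)
    return under, over

# Validity check on a grid over the box [-1, 2] x [0, 3]:
# e.g. x*y - (xL*y + yL*x - xL*yL) = (x - xL)(y - yL) >= 0 on the box.
xs, ys = np.meshgrid(np.linspace(-1, 2, 50), np.linspace(0, 3, 50))
under, over = mccormick_bounds(xs, ys, -1.0, 2.0, 0.0, 3.0)
assert np.all(under <= xs * ys + 1e-12)
assert np.all(xs * ys <= over + 1e-12)
```

The quadrilinear case the paper studies amounts to choosing how to associate the four factors into pairs (e.g. ((x1*x2)*x3)*x4 vs (x1*x2)*(x3*x4)) and nesting such envelopes; the different associations give relaxations of different tightness.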
Analysis of multistage convex relaxation for sparse regularization
 Journal of Machine Learning Research
"... We consider learning formulations with nonconvex objective functions that often occur in practical applications. There are two approaches to this problem: • Heuristic methods such as gradient descent that only find a local minimum. A drawback of this approach is the lack of theoretical guarantee showing that the local minimum gives a good solution. • Convex relaxation such as L1-regularization that solves the problem under some conditions. However, it often leads to a suboptimal solution in reality. This paper tries to remedy the above gap between theory and practice. In particular, we present a ..."
Cited by 62 (7 self)
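The L1-relaxation route mentioned in this abstract is commonly solved with proximal methods built on soft-thresholding. A minimal sketch (function names and problem sizes my own, not from the paper) of ISTA for L1-regularized least squares, min_x 0.5||Ax − y||² + λ||x||₁:

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t*||.||_1: shrinks each entry toward zero by t."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, y, lam, steps=500):
    """Iterative soft-thresholding for min_x 0.5*||Ax - y||^2 + lam*||x||_1."""
    L = np.linalg.norm(A, 2) ** 2      # Lipschitz constant of the smooth gradient
    x = np.zeros(A.shape[1])
    for _ in range(steps):
        # Gradient step on the smooth part, then prox of the L1 term.
        x = soft_threshold(x - A.T @ (A @ x - y) / L, lam / L)
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((30, 60)) / np.sqrt(30)
x_true = np.zeros(60)
x_true[[3, 17, 42]] = [2.0, -1.5, 1.0]    # a 3-sparse ground truth
x_hat = ista(A, A @ x_true, lam=0.01)
```

With step size 1/L the objective decreases monotonically from the zero initialization, which is the kind of convergence guarantee the nonconvex heuristics in the abstract lack; the bias toward smaller coefficients that λ introduces is the suboptimality the paper's multistage scheme targets.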
A. Convex Relaxation for the Laplace Prior
"... In Section 2.2.2 of the paper, the Laplace distribution proportion prior energy E_p(r_i) = (µ/σ_i) |r_i − r̄_i| = (µ/σ_i) |a_i/(1 − a_n) − r̄_i|  (1) is introduced and the following is stated: Proposition 1. The convex relaxation of (1) on the domain a_i, a_n ≥ 0 and a_i + a_n ≤ 1 is given by E_1(a_i, a_n) := µ/σ ..."
Sparse reconstruction by convex relaxation: Fourier and Gaussian measurements
 CISS 2006 (40th Annual Conference on Information Sciences and Systems)
, 2006
"... Abstract — This paper proves the best known guarantees for exact reconstruction of a sparse signal f from few nonadaptive universal linear measurements. We consider Fourier measurements (random sample of frequencies of f) and random Gaussian measurements. The method for reconstruction that has recently gained momentum in sparse approximation theory is to relax this highly nonconvex problem to a convex problem, and then solve it as a linear program. What the best guarantees are for the reconstruction problem to be equivalent to its convex relaxation is an open question. Recent work shows ..."
Cited by 108 (7 self)
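The relaxation this abstract describes, replacing the nonconvex sparsity objective with L1-minimization (basis pursuit), really is a linear program after splitting x into positive and negative parts. A minimal sketch using scipy (measurement sizes and function name my own, not from the paper):

```python
import numpy as np
from scipy.optimize import linprog

def basis_pursuit(A, y):
    """min ||x||_1  s.t.  Ax = y, solved as an LP via x = xp - xn, xp, xn >= 0."""
    m, n = A.shape
    c = np.ones(2 * n)                 # objective: sum(xp) + sum(xn) = ||x||_1
    A_eq = np.hstack([A, -A])          # encodes A @ (xp - xn) = y
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=(0, None))
    assert res.success
    xp, xn = res.x[:n], res.x[n:]
    return xp - xn

rng = np.random.default_rng(1)
A = rng.standard_normal((25, 50))      # 25 random Gaussian measurements, as in the paper's setting
f = np.zeros(50)
f[[4, 11, 33]] = [1.0, -2.0, 0.5]      # a 3-sparse signal
f_hat = basis_pursuit(A, A @ f)
```

By construction the solution is feasible and its L1 norm is at most that of the true sparse f; the paper's question is when this relaxation is guaranteed to return f itself.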
Entropy Minimization for Convex Relaxation Approaches
"... Despite their enormous success in solving hard combinatorial problems, convex relaxation approaches often suffer from the fact that the computed solutions are far from binary and that subsequent heuristic binarization may substantially degrade the quality of computed solutions. In this paper, we ..."
An Analysis of Convex Relaxations for MAP Estimation
"... The problem of obtaining the maximum a posteriori estimate of a general discrete random field (i.e. a random field defined using a finite and discrete set of labels) is known to be NP-hard. However, due to its central importance in many applications, several approximate algorithms have been proposed in the literature. In this paper, we present an analysis of three such algorithms based on convex relaxations: (i) LP-S: the linear programming (LP) relaxation proposed by Schlesinger [25] for a special case and independently in [4, 17, 31] for the general case; (ii) QP-RL: the quadratic ..."
Cited by 3 (1 self)
Computational and Statistical Tradeoffs via Convex Relaxation
, 2012
"... In modern data analysis, one is frequently faced with statistical inference problems involving massive datasets. Processing such large datasets is usually viewed as a substantial computational challenge. However, if data are a statistician’s main resource then access to more data should be viewed as an asset rather than as a burden. In this paper we describe a computational framework based on convex relaxation to reduce the computational complexity of an inference procedure when one has access to increasingly larger datasets. Convex relaxation techniques have been widely used in theoretical computer ..."
Cited by 45 (1 self)
Complexity Analyses of Discretized Successive Convex Relaxation Methods
, 1999
"... We investigate the computational complexity of discretized successive convex relaxation methods in the way of upper bounding the number of major iterations required, in the worst case. Kojima and Takeda [2] earlier analyzed the computational complexity of semi-infinite successive convex relaxation methods ..."
Cited by 4 (1 self)
Multistage convex relaxation for learning with sparse regularization
 In: Proceedings of the 22nd Annual Conference on Neural Information Processing Systems. 8–13 December, 2008; Vancouver, British Columbia
, 2008
"... We study learning formulations with nonconvex regularization that are natural for sparse linear models. There are two approaches to this problem: • Heuristic methods such as gradient descent that only find a local minimum. A drawback of this approach is the lack of theoretical guarantee showing that the local minimum gives a good solution. • Convex relaxation such as L1-regularization that solves the problem under some conditions. However, it often leads to suboptimal sparsity in reality. This paper tries to remedy the above gap between theory and practice. In particular, we investigate a multi ..."
Cited by 18 (2 self)