Results 11–20 of 93,140
Convex Relaxations of Bregman Divergence Clustering
"... Although many convex relaxations of clustering have been proposed in the past decade, current formulations remain restricted to spherical Gaussian or discriminative models and are susceptible to imbalanced clusters. To address these shortcomings, we propose a new class of convex relaxations that can ..."
Abstract

Cited by 1 (1 self)
 Add to MetaCart
Although many convex relaxations of clustering have been proposed in the past decade, current formulations remain restricted to spherical Gaussian or discriminative models and are susceptible to imbalanced clusters. To address these shortcomings, we propose a new class of convex relaxations
Comparison of convex relaxations of quadrilinear terms
"... In this paper we compare four different ways to compute a convex linear relaxation of a quadrilinear monomial on a box, analyzing their relative tightness. We computationally compare the quality of the relaxations, and we provide a general theorem on pairwisecomparison of relaxation strength, which ..."
Abstract

Cited by 3 (2 self)
 Add to MetaCart
In this paper we compare four different ways to compute a convex linear relaxation of a quadrilinear monomial on a box, analyzing their relative tightness. We computationally compare the quality of the relaxations, and we provide a general theorem on pairwisecomparison of relaxation strength
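Relaxations of quadrilinear terms are typically built by composing envelopes of lower-degree products; the standard building block is the McCormick envelope of a bilinear term w = x·y on a box. A minimal sketch of that building block (illustrative only, not one of the four relaxations compared in the paper):

```python
import numpy as np

def mccormick_bounds(x, y, xL, xU, yL, yU):
    """McCormick envelope of w = x*y on the box [xL, xU] x [yL, yU].

    Returns (lower, upper) bounds on x*y implied by the four McCormick
    inequalities; both are valid for any (x, y) inside the box.
    """
    lower = max(xL * y + x * yL - xL * yL,
                xU * y + x * yU - xU * yU)
    upper = min(xU * y + x * yL - xU * yL,
                xL * y + x * yU - xL * yU)
    return lower, upper

# The envelope sandwiches the true product everywhere on the box.
xL, xU, yL, yU = -1.0, 2.0, 0.5, 3.0
for x in np.linspace(xL, xU, 7):
    for y in np.linspace(yL, yU, 7):
        lo, hi = mccormick_bounds(x, y, xL, xU, yL, yU)
        assert lo <= x * y + 1e-9 and x * y <= hi + 1e-9
```

Chaining such envelopes pairwise (e.g. ((x1·x2)·x3)·x4) is one way to relax a quadrilinear monomial; the paper's point is that different groupings and envelope choices differ in tightness.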
Analysis of multistage convex relaxation for sparse regularization
 Journal of Machine Learning Research
"... We consider learning formulations with nonconvex objective functions that often occur in practical applications. There are two approaches to this problem: • Heuristic methods such as gradient descent that only find a local minimum. A drawback of this approach is the lack of theoretical guarantee sh ..."
Abstract

Cited by 60 (7 self)
 Add to MetaCart
showing that the local minimum gives a good solution. • Convex relaxation such as L1regularization that solves the problem under some conditions. However it often leads to a suboptimal solution in reality. This paper tries to remedy the above gap between theory and practice. In particular, we present a
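The L1-regularization route mentioned in this excerpt amounts to solving the Lasso problem min_x ½‖Ax − b‖² + λ‖x‖₁, whose key ingredient is the soft-thresholding operator (the proximal map of the L1 norm). A minimal sketch using iterative soft thresholding (ISTA); the data and sizes are made-up toy values, not the paper's setup:

```python
import numpy as np

def soft_threshold(z, t):
    # Proximal operator of t * ||.||_1: shrinks each coordinate toward zero.
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def ista(A, b, lam, n_iter=200):
    """Iterative soft thresholding for min_x 0.5*||Ax - b||^2 + lam*||x||_1."""
    L = np.linalg.norm(A, 2) ** 2      # Lipschitz constant of the smooth part
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = soft_threshold(x - (A.T @ (A @ x - b)) / L, lam / L)
    return x

rng = np.random.default_rng(1)
A = rng.standard_normal((30, 10))
x_true = np.zeros(10)
x_true[[2, 7]] = [1.0, -1.5]          # sparse ground truth
b = A @ x_true
x_hat = ista(A, b, lam=0.1)           # recovers a slightly shrunken x_true
```

With small λ the minimizer stays close to the sparse ground truth; larger λ shrinks coefficients further, which is the suboptimality-in-practice issue the paper's multistage scheme targets.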
A. Convex Relaxation for the Laplace Prior
"... In Section 2.2.2 of the paper, the Laplace distribution proportion prior energy Ep(ri) = µ σi ri − r̄i  = µ σi ∣∣ ∣ ai 1 − an − r̄i ∣∣ ∣ (1) is introduced and the following is stated: Proposition 1. The convex relaxation of (1) on the domain ai, an ≥ 0 and ai + an ≤ 1 is given by E1(ai, an):= µ σ ..."
Abstract
 Add to MetaCart
In Section 2.2.2 of the paper, the Laplace distribution proportion prior energy Ep(ri) = µ σi ri − r̄i  = µ σi ∣∣ ∣ ai 1 − an − r̄i ∣∣ ∣ (1) is introduced and the following is stated: Proposition 1. The convex relaxation of (1) on the domain ai, an ≥ 0 and ai + an ≤ 1 is given by E1(ai, an):= µ
Sparse reconstruction by convex relaxation: Fourier and Gaussian measurements
 CISS 2006 (40th Annual Conference on Information Sciences and Systems), 2006
"... Abstract — This paper proves best known guarantees for exact reconstruction of a sparse signal f from few nonadaptive universal linear measurements. We consider Fourier measurements (random sample of frequencies of f) and random Gaussian measurements. The method for reconstruction that has recently ..."
Abstract

Cited by 116 (8 self)
 Add to MetaCart
recently gained momentum in the Sparse Approximation Theory is to relax this highly nonconvex problem to a convex problem, and then solve it as a linear program. What are best guarantees for the reconstruction problem to be equivalent to its convex relaxation is an open question. Recent work shows
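The relaxation described here is basis pursuit: replace the combinatorial sparsity objective ‖x‖₀ with ‖x‖₁ and solve min ‖x‖₁ subject to Ax = b, which becomes a linear program after splitting x into nonnegative parts. A minimal sketch with random Gaussian measurements (toy sizes, not the regimes analyzed in the paper):

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n, m = 20, 10                        # ambient dimension, number of measurements
A = rng.standard_normal((m, n))      # random Gaussian measurement matrix
x_true = np.zeros(n)
x_true[[3, 11]] = [1.5, -2.0]        # 2-sparse signal
b = A @ x_true

# Basis pursuit as an LP: write x = u - v with u, v >= 0, then
#   min  sum(u) + sum(v)   s.t.   [A, -A] @ [u; v] = b.
c = np.ones(2 * n)
res = linprog(c, A_eq=np.hstack([A, -A]), b_eq=b,
              bounds=[(0, None)] * (2 * n))
x_hat = res.x[:n] - res.x[n:]        # recovered signal
```

The split into u − v works because at an optimum u and v are never both positive in the same coordinate, so sum(u) + sum(v) equals ‖x‖₁.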
New Convex Relaxations for the Maximum Cut . . .
, 2001
"... It is well known that many of the optimization problems which arise in applications are “hard”, which usually means that they are NPhard. Hence much research has been devoted to finding “good” relaxations for these hard problems. Usually a “good” relaxation is one which can be solved (either exac ..."
Abstract
 Add to MetaCart
exactly or within a prescribed numerical tolerance) in polynomialtime. Nesterov and Nemirovskii showed that by this criterion, many convex optimization problems are good relaxations. This thesis presents new convex relaxations for two such hard problems, namely the MaximumCut (MaxCut) problem
Entropy Minimization for Convex Relaxation Approaches
"... Despite their enormous success in solving hard combinatorial problems, convex relaxation approaches often suffer from the fact that the computed solutions are far from binary and that subsequent heuristic binarization may substantially degrade the quality of computed solutions. In this paper, we ..."
Abstract
 Add to MetaCart
Despite their enormous success in solving hard combinatorial problems, convex relaxation approaches often suffer from the fact that the computed solutions are far from binary and that subsequent heuristic binarization may substantially degrade the quality of computed solutions. In this paper, we
An Analysis of Convex Relaxations for MAP Estimation
"... The problem of obtaining the maximum a posteriori estimate of a general discrete random field (i.e. a random field defined using a finite and discrete set of labels) is known to be NPhard. However, due to its central importance in many applications, several approximate algorithms have been proposed ..."
Abstract

Cited by 2 (1 self)
 Add to MetaCart
proposed in the literature. In this paper, we present an analysis of three such algorithms based on convex relaxations: (i) LPS: the linear programming (LP) relaxation proposed by Schlesinger [25] for a special case and independently in [4, 17, 31] for the general case; (ii) QPRL: the quadratic
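Schlesinger-style LP relaxations of MAP estimation optimize an energy over node and edge pseudomarginals subject to normalization and marginalization constraints; on tree-structured fields this relaxation is tight. A minimal sketch for a two-node, two-label field solved with a generic LP solver (the potentials are made-up illustrative numbers, not from the paper):

```python
import numpy as np
from scipy.optimize import linprog

# Two nodes, two labels. Unary potentials theta_i(x_i) and a pairwise
# potential theta_01(x_0, x_1) that rewards the two nodes agreeing.
theta0 = np.array([0.0, 0.5])
theta1 = np.array([1.0, 0.0])
theta01 = np.array([[0.0, 2.0],
                    [2.0, 0.0]])

# Variables: mu0(0), mu0(1), mu1(0), mu1(1),
#            mu01(0,0), mu01(0,1), mu01(1,0), mu01(1,1)
c = np.concatenate([theta0, theta1, theta01.ravel()])
A_eq = np.array([
    [1, 1, 0, 0, 0, 0, 0, 0],    # mu0 sums to 1
    [0, 0, 1, 1, 0, 0, 0, 0],    # mu1 sums to 1
    [-1, 0, 0, 0, 1, 1, 0, 0],   # sum_x1 mu01(0, x1) = mu0(0)
    [0, -1, 0, 0, 0, 0, 1, 1],   # sum_x1 mu01(1, x1) = mu0(1)
    [0, 0, -1, 0, 1, 0, 1, 0],   # sum_x0 mu01(x0, 0) = mu1(0)
    [0, 0, 0, -1, 0, 1, 0, 1],   # sum_x0 mu01(x0, 1) = mu1(1)
], dtype=float)
b_eq = np.array([1.0, 1.0, 0.0, 0.0, 0.0, 0.0])

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=[(0, 1)] * 8)
# On this tree-structured instance the relaxation is tight: the optimum
# is integral and selects the labeling (1, 1) with energy 0.5.
```

On graphs with cycles the same LP can return fractional pseudomarginals, which is exactly where the comparative analysis of relaxations becomes interesting.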
Computational and Statistical Tradeoffs via Convex Relaxation
, 2012
"... In modern data analysis, one is frequently faced with statistical inference problems involving massive datasets. Processing such large datasets is usually viewed as a substantial computational challenge. However, if data are a statistician’s main resource then access to more data should be viewed as ..."
Abstract

Cited by 40 (1 self)
 Add to MetaCart
as an asset rather than as a burden. In this paper we describe a computational framework based on convex relaxation to reduce the computational complexity of an inference procedure when one has access to increasingly larger datasets. Convex relaxation techniques have been widely used in theoretical computer
Just Relax: Convex Programming Methods for Identifying Sparse Signals in Noise
, 2006
"... This paper studies a difficult and fundamental problem that arises throughout electrical engineering, applied mathematics, and statistics. Suppose that one forms a short linear combination of elementary signals drawn from a large, fixed collection. Given an observation of the linear combination that ..."
Abstract

Cited by 496 (2 self)
 Add to MetaCart
. This paper studies a method called convex relaxation, which attempts to recover the ideal sparse signal by solving a convex program. This approach is powerful because the optimization can be completed in polynomial time with standard scientific software. The paper provides general conditions which ensure