Results 1–4 of 4
Sparse Signal Estimation by Maximally Sparse Convex Optimization
IEEE Transactions on Signal Processing, 2014
"... This paper addresses the problem of sparsity penalized least squares for applications in sparse signal processing, e.g. sparse deconvolution. This paper aims to induce sparsity more strongly than L1 norm regularization, while avoiding nonconvex optimization. For this purpose, this paper describes ..."
Abstract

Cited by 8 (4 self)
This paper addresses the problem of sparsity-penalized least squares for applications in sparse signal processing, e.g., sparse deconvolution. It aims to induce sparsity more strongly than L1 norm regularization, while avoiding nonconvex optimization. To this end, the paper describes the design and use of nonconvex penalty functions (regularizers) constrained so as to ensure the convexity of the total cost function, F, to be minimized. The method is based on parametric penalty functions whose parameters are constrained to ensure convexity of F. It is shown that optimal parameters can be obtained by semidefinite programming (SDP). This maximally sparse convex (MSC) approach yields maximally nonconvex sparsity-inducing penalty functions constrained such that the total cost function, F, is convex. It is demonstrated that iterative MSC (IMSC) can yield solutions substantially more sparse than the standard convex sparsity-inducing approach, i.e., L1 norm minimization.
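The core construction, a nonconvex penalty whose parameter is capped so that the total cost stays convex, can be illustrated in the scalar case. The sketch below is an assumption-laden simplification, not the paper's SDP machinery: it uses the logarithmic penalty phi(x; a) = log(1 + a|x|)/a, one parametric family of the kind considered in this literature, for which phi''(x; a) >= -a, so the scalar cost F(x) = 0.5*(y - x)^2 + lam*phi(x; a) is convex exactly when a <= 1/lam.

```python
import numpy as np

def log_penalty(x, a):
    """Nonconvex logarithmic penalty phi(x; a) = log(1 + a|x|) / a, for a > 0."""
    return np.log1p(a * np.abs(x)) / a

def scalar_cost(x, y, lam, a):
    """Scalar penalized least squares: F(x) = 0.5*(y - x)^2 + lam*phi(x; a)."""
    return 0.5 * (y - x) ** 2 + lam * log_penalty(x, a)

def is_convex_on_grid(f, grid):
    """Discrete convexity check: all second differences are non-negative."""
    return bool(np.all(np.diff(f(grid), 2) >= -1e-10))

lam = 1.0
grid = np.linspace(-5.0, 5.0, 2001)
# phi''(x; a) >= -a, so F''(x) >= 1 - lam*a: F is convex iff a <= 1/lam.
convex_at_bound = is_convex_on_grid(lambda x: scalar_cost(x, 2.0, lam, 1.0 / lam), grid)
convex_beyond   = is_convex_on_grid(lambda x: scalar_cost(x, 2.0, lam, 4.0 / lam), grid)
```

At the bound a = 1/lam the penalty is as nonconvex as it can be while F remains convex; pushing a beyond the bound breaks convexity, which the grid check detects.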
Group-Sparse Signal Denoising: Non-Convex Regularization, Convex Optimization
, 2014
"... Abstract—Convex optimization with sparsitypromoting convex regularization is a standard approach for estimating sparse signals in noise. In order to promote sparsity more strongly than convex regularization, it is also standard practice to employ nonconvex optimization. In this paper, we take a t ..."
Abstract

Cited by 6 (3 self)
Abstract—Convex optimization with sparsity-promoting convex regularization is a standard approach for estimating sparse signals in noise. In order to promote sparsity more strongly than convex regularization, it is also standard practice to employ nonconvex optimization. In this paper, we take a third approach. We utilize a nonconvex regularization term chosen such that the total cost function (consisting of data consistency and regularization terms) is convex. Therefore, sparsity is more strongly promoted than in the standard convex formulation, but without sacrificing the attractive aspects of convex optimization (unique minimum, robust algorithms, etc.). We use this idea to improve the recently developed ‘overlapping group shrinkage’ (OGS) algorithm for the denoising of group-sparse signals. The algorithm is applied to the problem of speech enhancement with favorable results in terms of both SNR and perceptual quality. Index Terms—group sparse model; convex optimization; nonconvex optimization; sparse optimization; translation-invariant denoising; denoising; speech enhancement
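For context, the convex baseline that the paper improves on can be sketched compactly. The snippet below is the plain group-norm version of overlapping group shrinkage via a majorization-minimization iteration (the paper's contribution, replacing the group norm with a constrained nonconvex penalty, is not reproduced here), and the group size and parameter values are illustrative assumptions:

```python
import numpy as np

def ogs_denoise(y, lam, K=3, n_iter=100, eps=1e-12):
    """Overlapping group shrinkage, convex group-norm version, via MM.

    Minimizes 0.5*||y - x||^2 + lam * sum_i ||x_{i..i+K-1}||_2 over all
    overlapping K-point groups (zero-padded at the boundaries).
    """
    y = np.asarray(y, dtype=float)
    x = y.copy()
    ones = np.ones(K)
    for _ in range(n_iter):
        # e[m]: energy of the K-point group ending at sample m.
        e = np.convolve(x ** 2, ones, mode="full")
        # r[i]: sum of inverse norms of the K groups containing sample i.
        r = np.convolve(1.0 / np.sqrt(e + eps), ones, mode="valid")
        x = y / (1.0 + lam * r)
    return x

# Isolated small spike vs. a genuine 3-sample group of large values.
y = np.zeros(60)
y[20:23] = 4.0      # group-sparse feature: survives shrinkage
y[5] = 0.3          # isolated spike: attenuated towards zero
x = ogs_denoise(y, lam=0.5)
```

Because every sample is attenuated according to the norms of all groups containing it, an isolated spike (whose groups all have small norm) is driven towards zero, while samples inside a strong group are only mildly shrunk.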
Convex 1D total variation denoising with nonconvex regularization
IEEE Signal Processing Letters, 2015
"... Abstract—Total variation (TV) denoising is an effective noise suppression method when the derivative of the underlying signal is known to be sparse. TV denoising is defined in terms of a convex optimization problem involving a quadratic data fidelity term and a convex regularization term. A nonconv ..."
Abstract

Cited by 2 (1 self)
Abstract—Total variation (TV) denoising is an effective noise suppression method when the derivative of the underlying signal is known to be sparse. TV denoising is defined in terms of a convex optimization problem involving a quadratic data fidelity term and a convex regularization term. A nonconvex regularizer can promote sparsity more strongly, but generally leads to a nonconvex optimization problem with suboptimal local minima. This letter proposes the use of a nonconvex regularizer constrained so that the total objective function to be minimized maintains its convexity. Conditions on the nonconvex regularizer are given that ensure the total TV denoising objective function is convex. An efficient algorithm is given for the resulting problem.
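The convex problem that this letter generalizes can be sketched directly. The snippet below solves standard 1-D TV denoising with the convex L1 regularizer by majorization-minimization; it is a sketch under stated assumptions (dense linear solve, illustrative lambda), not the letter's algorithm, which handles the constrained nonconvex penalty and exploits fast banded solvers:

```python
import numpy as np

def tv_denoise_mm(y, lam, n_iter=200, eps=1e-8):
    """1-D TV denoising: minimize 0.5*||y - x||^2 + lam*||Dx||_1 via MM.

    Each iteration majorizes |Dx| by a weighted quadratic and solves a
    linear system. Dense solve for clarity; a banded solver would be O(N).
    """
    N = len(y)
    D = np.diff(np.eye(N), axis=0)            # (N-1) x N first-difference matrix
    x = np.asarray(y, dtype=float).copy()
    for _ in range(n_iter):
        w = 1.0 / (np.abs(D @ x) + eps)       # MM weights ~ 1/|Dx|
        A = np.eye(N) + lam * D.T @ (w[:, None] * D)
        x = np.linalg.solve(A, y)
    return x

# Step signal with deterministic oscillatory "noise":
# TV denoising flattens the oscillation but keeps the jump.
n = np.cos(1.7 * np.arange(40))
y = np.concatenate([np.zeros(20), np.ones(20)]) + 0.1 * n
x = tv_denoise_mm(y, lam=1.0)
```

The characteristic behavior, and the motivation for the nonconvex regularizer, is visible here: the flat regions are recovered well, but the convex L1 penalty also shrinks the height of the genuine jump, a bias the constrained nonconvex penalty is designed to reduce.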
Simultaneous polynomial approximation and total variation denoising
In Proc. IEEE Int. Conf. Acoust., Speech, Signal Processing (ICASSP), 2013
"... This paper addresses the problem of smoothing data with additive step discontinuities. The problem formulation is based on least square polynomial approximation and total variation denoising. In earlier work, an ADMM algorithm was proposed to minimize a suitably defined sparsitypromoting cost func ..."
Abstract

Cited by 1 (1 self)
This paper addresses the problem of smoothing data with additive step discontinuities. The problem formulation is based on least-squares polynomial approximation and total variation denoising. In earlier work, an ADMM algorithm was proposed to minimize a suitably defined sparsity-promoting cost function. In this paper, an algorithm is derived using the majorization-minimization optimization procedure. The new algorithm converges faster and, unlike the ADMM algorithm, has no parameters that need to be set. The proposed algorithm is formulated so as to utilize fast solvers for banded systems for high computational efficiency. This paper also gives optimality conditions so that the optimality of a result produced by the numerical algorithm can be readily validated.
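The model itself, data approximated by a low-order polynomial plus a piecewise-constant (step) component, can be sketched with simple tools. The snippet below is a hedged illustration of that model only: it uses block coordinate descent (a polynomial fit step alternating with an MM inner loop for the TV step) and dense solves, whereas the paper derives a single MM algorithm with fast banded solvers; all parameter values are illustrative.

```python
import numpy as np

def poly_tv_smooth(y, deg=2, lam=1.0, n_outer=20, n_inner=30, eps=1e-8):
    """Fit y ~ p + x: polynomial trend p plus piecewise-constant component x.

    Model: minimize 0.5*||y - p - x||^2 + lam*||Dx||_1 over polynomials p
    of degree <= deg and step signals x. Sketch via block coordinate descent.
    """
    N = len(y)
    t = np.linspace(-1.0, 1.0, N)
    D = np.diff(np.eye(N), axis=0)             # first-difference matrix
    p = np.polyval(np.polyfit(t, y, deg), t)   # initial polynomial fit
    x = y - p                                  # initial step component
    for _ in range(n_outer):
        r = y - p
        for _ in range(n_inner):               # MM iterations for the TV step
            w = 1.0 / (np.abs(D @ x) + eps)
            A = np.eye(N) + lam * D.T @ (w[:, None] * D)
            x = np.linalg.solve(A, r)
        p = np.polyval(np.polyfit(t, y - x, deg), t)
    return p, x

# Quadratic trend plus an additive step discontinuity.
t = np.linspace(-1.0, 1.0, 50)
y = t ** 2 + 2.0 * (t > 0)
p, x = poly_tv_smooth(y)
```

The joint fit separates the smooth trend from the step: the combined approximation p + x fits the data much better than a polynomial alone, and the step lands in the TV-regularized component x.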