Results 1–8 of 8
Group-Sparse Signal Denoising: Non-Convex Regularization, Convex Optimization
, 2014
Cited by 6 (3 self)
Abstract—Convex optimization with sparsity-promoting convex regularization is a standard approach for estimating sparse signals in noise. In order to promote sparsity more strongly than convex regularization, it is also standard practice to employ nonconvex optimization. In this paper, we take a third approach. We utilize a nonconvex regularization term chosen such that the total cost function (consisting of data consistency and regularization terms) is convex. Therefore, sparsity is more strongly promoted than in the standard convex formulation, but without sacrificing the attractive aspects of convex optimization (unique minimum, robust algorithms, etc.). We use this idea to improve the recently developed ‘overlapping group shrinkage’ (OGS) algorithm for the denoising of group-sparse signals. The algorithm is applied to the problem of speech enhancement with favorable results in terms of both SNR and perceptual quality. Index Terms—group sparse model; convex optimization; nonconvex optimization; sparse optimization; translation-invariant denoising; denoising; speech enhancement
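The OGS-style update can be sketched as a majorization-minimization (MM) iteration in which each sample is rescaled by the norms of all overlapping groups containing it. The sketch below is a minimal illustration only; the function name `ogs_denoise`, the group size `K`, the iteration count, and the edge handling are our assumptions, not the paper's exact algorithm:

```python
import numpy as np

def ogs_denoise(y, lam, K=3, n_iter=50, eps=1e-10):
    """Minimal MM-style sketch of overlapping group shrinkage (OGS).

    Approximately minimizes
        0.5 * ||y - x||^2 + lam * sum_i ||x_{i..i+K-1}||_2
    by repeatedly rescaling each sample by the norms of the
    overlapping length-K groups that contain it.
    """
    x = np.asarray(y, dtype=float).copy()
    n = len(x)
    for _ in range(n_iter):
        # Euclidean norm of each length-K group (groups start at 0 .. n-K)
        sq = x * x
        group_norms = np.sqrt(np.convolve(sq, np.ones(K), mode='valid'))
        inv = 1.0 / (group_norms + eps)  # eps guards against zero groups
        # r[i] = sum of 1/||group|| over all groups containing sample i;
        # full convolution with ones(K) accumulates exactly those groups
        r = np.convolve(inv, np.ones(K), mode='full')[:n]
        x = y / (1.0 + lam * r)  # denominator >= 1, so |x_i| <= |y_i|
    return x
```

Because the denominator is always at least 1, every sample is shrunk toward zero, with samples in low-energy groups shrunk most.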
A Primal-Dual Proximal Algorithm for Sparse Template-Based Adaptive Filtering: Application to Seismic Multiple Removal
Cited by 3 (1 self)
Abstract—Unveiling meaningful geophysical information from seismic data requires dealing with both random and structured “noises”. As their amplitude may be greater than signals of interest (primaries), additional prior information is especially important in performing efficient signal separation. We address here the problem of multiple reflections, caused by wavefield bouncing between layers. Since only approximate models of these phenomena are available, we propose a flexible framework for time-varying adaptive filtering of seismic signals, using sparse representations, based on inaccurate templates. We recast the joint estimation of adaptive filters and primaries in a new convex variational formulation. This approach allows us to incorporate plausible knowledge about noise statistics, data sparsity and slow filter variation in parsimony-promoting wavelet frames. The designed primal-dual algorithm solves a constrained minimization problem that alleviates standard regularization issues in finding hyperparameters. The approach demonstrates good performance in low signal-to-noise ratio conditions, on both simulated and real field seismic data. Index Terms—Convex optimization, Parallel algorithms, Wavelet transforms, Adaptive filters, Geophysical signal processing, Signal restoration, Sparsity, Signal separation.
Convex 1D total variation denoising with nonconvex regularization
 IEEE Signal Processing Letters
, 2015
Cited by 2 (1 self)
Abstract—Total variation (TV) denoising is an effective noise suppression method when the derivative of the underlying signal is known to be sparse. TV denoising is defined in terms of a convex optimization problem involving a quadratic data fidelity term and a convex regularization term. A nonconvex regularizer can promote sparsity more strongly, but generally leads to a nonconvex optimization problem with non-optimal local minima. This letter proposes the use of a nonconvex regularizer constrained so that the total objective function to be minimized maintains its convexity. Conditions for a nonconvex regularizer are given that ensure the total TV denoising objective function is convex. An efficient algorithm is given for the resulting problem.
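As background for the letter's nonconvex variant, the standard convex TV denoising problem it builds on (quadratic fidelity plus an ℓ1 penalty on the first difference) can be solved by projected gradient on its dual. The sketch below is our own minimal illustration; the function `tv_denoise`, the step size, and the iteration count are assumptions, not the letter's algorithm:

```python
import numpy as np

def Dt(z):
    # D^T z, where D is the first-difference operator (D x)_i = x[i+1] - x[i]
    return -np.diff(z, prepend=0, append=0)

def tv_denoise(y, lam, n_iter=1000):
    """Convex 1-D TV denoising:
        minimize 0.5 * ||y - x||^2 + lam * sum_i |x[i+1] - x[i]|
    via projected gradient on the dual variable z, with the primal
    recovered as x = y - D^T z and the constraint |z_i| <= lam.
    """
    y = np.asarray(y, dtype=float)
    z = np.zeros(len(y) - 1)
    step = 0.25  # = 1/L, since L = ||D D^T|| <= 4 for first differences
    for _ in range(n_iter):
        x = y - Dt(z)                                   # current primal estimate
        z = np.clip(z + step * np.diff(x), -lam, lam)   # gradient step + projection
    return y - Dt(z)
```

Because the columns of D^T sum to zero, the denoised output preserves the sum (hence the mean) of the input exactly.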
Transient Artifact Reduction Algorithm (TARA) Based on Sparse Optimization
Cited by 1 (0 self)
Abstract—This paper addresses the suppression of transient artifacts in signals, e.g., biomedical time series. To that end, we distinguish two types of artifact signals. We define “Type 1” artifacts as spikes and sharp, brief waves that adhere to a baseline value of zero. We define “Type 2” artifacts as comprising approximate step discontinuities. We model a Type 1 artifact as being sparse and having a sparse time-derivative, and a Type 2 artifact as having a sparse time-derivative. We model the observed time series as the sum of a low-pass signal (e.g., a background trend), an artifact signal of each type, and a white Gaussian stochastic process. To jointly estimate the components of the signal model, we formulate a sparse optimization problem and develop a rapidly converging, computationally efficient iterative algorithm denoted TARA (“transient artifact reduction algorithm”). The effectiveness of the approach is illustrated using near-infrared spectroscopic time-series data. Index Terms—Measurement artifact, artifact rejection, sparse optimization, wavelet, low-pass filter, total variation, lasso, fused lasso.
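The additive signal model described in the abstract can be illustrated by synthesizing each component; all amplitudes, locations, and the noise level below are our own illustrative choices, not values from the paper:

```python
import numpy as np

n = 500
t = np.arange(n)
rng = np.random.default_rng(1)

# Low-pass background trend
trend = np.sin(2 * np.pi * t / n)

# "Type 1" artifact: sparse spikes / brief waves (sparse signal AND sparse
# derivative; returns to a zero baseline)
type1 = np.zeros(n)
type1[[100, 101, 300]] = [2.0, 1.5, -3.0]

# "Type 2" artifact: approximate step discontinuities (sparse derivative only)
type2 = np.zeros(n)
type2[200:] += 1.0
type2[400:] -= 1.5

# Observed series: trend + artifacts + white Gaussian noise
y = trend + type1 + type2 + 0.05 * rng.normal(size=n)
```

TARA's job is the inverse of this synthesis: given only `y`, jointly estimate the trend and the two artifact components.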
Artifact-free Wavelet Denoising: Nonconvex Sparse Regularization, Convex Optimization
Abstract—Algorithms for signal denoising that combine wavelet-domain sparsity and total variation (TV) regularization are relatively free of artifacts, such as pseudo-Gibbs oscillations, normally introduced by pure wavelet thresholding. This paper formulates wavelet-TV (WATV) denoising as a unified problem. To strongly induce wavelet sparsity, the proposed approach uses nonconvex penalty functions. At the same time, in order to draw on the advantages of convex optimization (unique minimum, reliable algorithms, simplified regularization parameter selection), the nonconvex penalties are chosen so as to ensure the convexity of the total objective function. A computationally efficient, fast-converging algorithm is derived.
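The key idea, a nonconvex penalty constrained so that the total objective stays convex, can be checked numerically in the scalar case. The sketch below uses the arctangent penalty as one such nonconvex penalty and tests convexity via discrete second differences; the penalty choice, the `a <= 1/lam` threshold, and all numbers are our assumptions for illustration:

```python
import numpy as np

def atan_penalty(x, a):
    """Arctangent penalty phi(x; a): a nonconvex sparsity penalty whose
    concavity is bounded, with phi''(0+) = -a."""
    u = np.abs(x)
    return (2.0 / (a * np.sqrt(3))) * (
        np.arctan((1 + 2 * a * u) / np.sqrt(3)) - np.pi / 6)

def scalar_cost(x, y, lam, a):
    # Scalar denoising cost: quadratic data fidelity + nonconvex penalty
    return 0.5 * (y - x) ** 2 + lam * atan_penalty(x, a)

# The total cost stays convex when the penalty's concavity is limited
# relative to the quadratic term: roughly a <= 1/lam.
lam = 2.0
x = np.linspace(-5, 5, 2001)
f_cvx = scalar_cost(x, 1.0, lam, a=0.4 / lam)  # a < 1/lam: convex total cost
f_ncv = scalar_cost(x, 1.0, lam, a=5.0 / lam)  # a > 1/lam: convexity lost
d2_cvx = np.diff(f_cvx, 2)   # discrete second differences
d2_ncv = np.diff(f_ncv, 2)
```

The second differences of `f_cvx` stay nonnegative across the whole grid, while `f_ncv` shows negative curvature near the origin, where the penalty is most concave.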
ℓ2-ℓ0 regularization path tracking algorithms
Sparse signal approximation can be formulated as the mixed ℓ2-ℓ0 minimization problem min_x J(x; λ) = ‖y − Ax‖₂² + λ‖x‖₀. We propose two heuristic search algorithms to minimize J for a continuum of λ-values, yielding a sequence of coarse-to-fine approximations. Continuation Single Best Replacement is a bidirectional greedy algorithm adapted from the Single Best Replacement algorithm previously proposed for minimizing J for fixed λ. ℓ0 regularization path track is a more complex algorithm exploiting the fact that the ℓ2-ℓ0 regularization path is piecewise constant with respect to λ. Tracking the ℓ0 regularization path is done in a suboptimal manner by maintaining (i) a list of subsets that are candidates to be solution supports for decreasing λ's and (ii) the list of critical λ-values around which the solution changes. Both algorithms gradually construct the ℓ0 regularization path by performing single replacements, i.e., adding or removing a dictionary atom from a subset. A straightforward adaptation of these algorithms yields suboptimal solutions to min_x ‖y − Ax‖₂² subject to ‖x‖₀ ≤ k for contiguous values of k ≥ 0 and to min_x ‖x‖₀ subject to ‖y − Ax‖₂² ≤ ε for continuous values of ε. Numerical simulations show the effectiveness of the algorithms on a difficult sparse deconvolution problem inducing a highly correlated dictionary A.
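The single-replacement move that both algorithms build on can be sketched as a greedy search for fixed λ: at each step, toggle (add or remove) the one atom that most decreases J, and stop when no toggle helps. This is a minimal, unoptimized illustration; the function `sbr`, its stopping tolerance, and the iteration cap are our assumptions, not the authors' implementation:

```python
import numpy as np

def sbr(y, A, lam, max_iter=50):
    """Greedy single-replacement minimization of
        J(x; lam) = ||y - A x||_2^2 + lam * ||x||_0
    for fixed lam. Each candidate support is scored by a least-squares
    fit restricted to that support."""
    m, n = A.shape
    support = set()

    def cost(S):
        if not S:
            return float(y @ y)
        idx = sorted(S)
        coef, *_ = np.linalg.lstsq(A[:, idx], y, rcond=None)
        r = y - A[:, idx] @ coef
        return float(r @ r) + lam * len(idx)

    best = cost(support)
    for _ in range(max_iter):
        # Try toggling each atom in or out of the support
        moves = [(cost(support ^ {j}), j) for j in range(n)]
        c, j = min(moves)
        if c >= best - 1e-12:
            break  # no single replacement improves J: local minimum
        support ^= {j}
        best = c
    return sorted(support), best
```

Each iteration costs one least-squares solve per candidate atom, which is why the paper's path-tracking machinery (reusing supports across λ-values) matters in practice.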