Results 1–10 of 133
Curvelets: a surprisingly effective nonadaptive representation of objects with edges
In Curve and Surface Fitting: Saint-Malo, 2000
Cited by 290 (23 self)

Abstract: It is widely believed that to efficiently represent an otherwise smooth object with discontinuities along edges, one must use an adaptive representation that in some sense 'tracks' the shape of the discontinuity set. This folk belief — some would say folk theorem — is incorrect. At the very least, the possible quantitative advantage of such adaptation is vastly smaller than commonly believed. We have recently constructed a tight frame of curvelets which provides stable, efficient, and near-optimal representation of otherwise smooth objects having discontinuities along smooth curves. By applying naive thresholding to the curvelet transform of such an object, one can form m-term approximations with a rate of L2 approximation rivaling the rate obtainable by complex adaptive schemes which attempt to 'track' the discontinuity set. In this article we explain the basic issues of efficient m-term approximation, the construction of efficient adaptive representations, the construction of the curvelet frame, and a crude analysis of the performance of curvelet schemes.
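The "naive thresholding" step above rests on a general fact: in any orthonormal basis, keeping the m largest-magnitude coefficients gives the best m-term L2 approximation, and by Parseval the squared error is exactly the energy of the discarded coefficients. A minimal sketch of that principle (the function name and toy coefficient sequence are hypothetical; an actual curvelet transform is not implemented here):

```python
def m_term_error(coeffs, m):
    """Squared L2 error of the best m-term approximation in an
    orthonormal basis: by Parseval, it equals the energy of the
    discarded (smallest-magnitude) coefficients."""
    kept = sorted(coeffs, key=abs, reverse=True)[:m]
    return sum(c * c for c in coeffs) - sum(c * c for c in kept)

# Toy coefficient sequence with |c_k| ~ 1/k decay (hypothetical):
coeffs = [1.0 / k for k in range(1, 101)]
errors = [m_term_error(coeffs, m) for m in (5, 10, 20, 40)]
assert errors == sorted(errors, reverse=True)  # error shrinks as m grows
```

Faster coefficient decay, which curvelets provide for images dominated by smooth edges, translates directly into faster m-term error decay.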
Sparse representation for color image restoration
IEEE Transactions on Image Processing, 2007
Cited by 111 (27 self)

Abstract: Sparse representations of signals have drawn considerable interest in recent years. The assumption that natural signals, such as images, admit a sparse decomposition over a redundant dictionary leads to efficient algorithms for handling such sources of data. In particular, the design of well-adapted dictionaries for images has been a major challenge. The K-SVD has recently been proposed for this task [1] and shown to perform very well for various grayscale image processing tasks. In this paper we address the problem of learning dictionaries for color images and extend the K-SVD-based grayscale image denoising algorithm that appears in [2]. This work puts forward ways of handling non-homogeneous noise and missing information, paving the way to state-of-the-art results in applications such as color image denoising, demosaicing, and inpainting, as demonstrated in this paper.
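The sparse-decomposition assumption behind such dictionary methods can be illustrated with a far simpler greedy scheme than K-SVD: matching pursuit, which repeatedly selects the dictionary atom most correlated with the residual. A minimal sketch, assuming unit-norm atoms (the function name and toy dictionary are hypothetical, not from the paper):

```python
def matching_pursuit(signal, dictionary, n_iter=10):
    """Greedy sparse decomposition of `signal` over a redundant
    `dictionary` (a list of unit-norm atoms, each a list of floats)."""
    residual = list(signal)
    coeffs = [0.0] * len(dictionary)
    for _ in range(n_iter):
        # Pick the atom most correlated with the current residual.
        corr = [sum(r * a for r, a in zip(residual, atom)) for atom in dictionary]
        k = max(range(len(dictionary)), key=lambda i: abs(corr[i]))
        coeffs[k] += corr[k]
        residual = [r - corr[k] * a for r, a in zip(residual, dictionary[k])]
    return coeffs, residual

atoms = [[1.0, 0.0], [0.0, 1.0], [0.7071, 0.7071]]  # redundant: 3 atoms in R^2
coeffs, residual = matching_pursuit([3.0, 0.0], atoms, n_iter=5)
# coeffs[0] captures the whole signal; the residual is (numerically) zero.
```

Dictionary learning methods like K-SVD alternate a sparse-coding step of this flavor with an update of the atoms themselves.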
Why simple shrinkage is still relevant for redundant representations
IEEE Transactions on Information Theory, 2006
Cited by 92 (11 self)

Abstract: Shrinkage is a well-known and appealing denoising technique, introduced originally by Donoho and Johnstone in 1994. The use of shrinkage for denoising is known to be optimal for Gaussian white noise, provided that sparsity of the signal's representation is enforced using a unitary transform. Still, shrinkage is also practiced with non-unitary, and even redundant, representations, typically leading to very satisfactory results. In this correspondence we shed some light on this behavior. The main argument in this work is that such simple shrinkage can be interpreted as the first iteration of an algorithm that solves the basis pursuit denoising (BPDN) problem. While the desired solution of BPDN is hard to obtain in general, we develop a simple iterative procedure for the BPDN minimization that amounts to stepwise shrinkage. We demonstrate how simple shrinkage emerges as the first iteration of this novel algorithm. Furthermore, we show how shrinkage can be iterated, turning into an effective algorithm that minimizes the BPDN objective via simple shrinkage steps, further strengthening the denoising effect.

Index Terms: Basis pursuit, denoising, frame, overcomplete, redundant, sparse representation, shrinkage, thresholding.
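An iteration of the kind described, a gradient step on the quadratic data term followed by coordinate-wise shrinkage, is commonly known as iterative soft-thresholding (ISTA). A minimal sketch under the assumption of a small dense matrix A and a step size no larger than 1/||A||^2 (the names and toy values are hypothetical, not taken from the correspondence):

```python
import math

def soft(x, t):
    """Soft-threshold (shrinkage) operator."""
    return math.copysign(max(abs(x) - t, 0.0), x)

def ista(A, y, lam, step, n_iter=200):
    """Iterative shrinkage for BPDN: min 0.5*||y - Ax||^2 + lam*||x||_1.
    Each iteration takes a gradient step on the quadratic term, then
    applies coordinate-wise shrinkage; `step` must be <= 1/||A||^2."""
    m, n = len(A), len(A[0])
    x = [0.0] * n
    for _ in range(n_iter):
        Ax = [sum(A[i][j] * x[j] for j in range(n)) for i in range(m)]
        r = [y[i] - Ax[i] for i in range(m)]
        g = [sum(A[i][j] * r[i] for i in range(m)) for j in range(n)]  # A^T r
        x = [soft(x[j] + step * g[j], step * lam) for j in range(n)]
    return x

# With a unitary A (here the identity) and unit step, the very first
# iteration already lands on the solution: plain shrinkage of y.
x = ista([[1.0, 0.0], [0.0, 1.0]], [3.0, 0.5], lam=1.0, step=1.0)
# x is approximately [2.0, 0.0]
```

This makes the correspondence's point concrete: for a unitary transform one shrinkage step is exact, while for redundant representations the same step is merely the first iterate of the minimization.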
Platelets: A Multiscale Approach for Recovering Edges and Surfaces in Photon-Limited Medical Imaging
IEEE Transactions on Medical Imaging, 2003
Cited by 79 (19 self)

Abstract: The nonparametric multiscale platelet algorithms presented in this paper, unlike traditional wavelet-based methods, are both well suited to photon-limited medical imaging applications involving Poisson data and capable of better approximating edge contours. This paper introduces platelets, localized functions at various scales, locations, and orientations that produce piecewise-linear image approximations, and a new multiscale image decomposition based on these functions. Platelets are well suited for approximating images consisting of smooth regions separated by smooth boundaries. For smoothness measured in certain Hölder classes, it is shown that the error of m-term platelet approximations can decay significantly faster than that of m-term approximations in terms of sinusoids, wavelets, or wedgelets. This suggests that platelets may outperform existing techniques for image denoising and reconstruction. Fast, platelet-based, maximum penalized likelihood methods for photon-limited image denoising, deblurring, and tomographic reconstruction problems are developed. Because platelet decompositions of Poisson-distributed images are tractable and computationally efficient, existing image reconstruction methods based on expectation-maximization type algorithms can easily be enhanced with platelet techniques. Experimental results suggest that platelet-based methods can outperform standard reconstruction methods currently in use in confocal microscopy, image restoration, and emission tomography.
Backcasting: adaptive sampling for sensor networks
In Proc. Information Processing in Sensor Networks, 2004
Cited by 70 (4 self)

Abstract: Wireless sensor networks provide an attractive approach to spatially monitoring environments. Wireless technology makes these systems relatively flexible, but it also places heavy demands on energy consumption for communications. This raises a fundamental tradeoff: using higher densities of sensors provides more measurements, higher resolution, and better accuracy, but requires more communications and processing. This paper proposes a new approach, called "backcasting," which can significantly reduce communications and energy consumption while maintaining high accuracy. Backcasting operates by first having a small subset of the wireless sensors communicate their information to a fusion center. This provides an initial estimate of the environment being sensed and guides the allocation of additional network resources. Specifically, the fusion center backcasts information based on the initial estimate to the network at large, selectively activating additional sensor nodes in order to achieve a target error level. The key idea is that the initial estimate can detect correlations in the environment, indicating that many sensors may not need to be activated by the fusion center. Thus, adaptive sampling can save energy compared to dense, nonadaptive sampling. This method is theoretically analyzed in the context of field estimation, and it is shown that the energy savings can be quite significant compared to conventional dense sampling.
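The two-phase idea can be caricatured in a few lines: a coarse subset of sensors reports first, and dense sampling is activated only where the coarse estimate shows variation. A toy 1-D sketch (the sensor indexing and the exact change-detection rule are hypothetical simplifications of the paper's estimator):

```python
def backcast_sample(field, coarse_step=8):
    """Two-phase adaptive sampling sketch: a sparse subset of sensors
    reports first; the fusion center then activates dense sampling
    only in intervals where the coarse estimate varies."""
    n = len(field)
    active = set(range(0, n, coarse_step))        # phase 1: coarse subset
    coarse = sorted(active)
    for a, b in zip(coarse, coarse[1:]):
        # Variation detected between adjacent coarse samples: refine.
        # (A real system would compare noisy estimates to a threshold.)
        if field[a] != field[b]:
            active.update(range(a, b + 1))        # phase 2: dense sampling
    return active

field = [0.0] * 40 + [1.0] * 24   # piecewise-constant field, boundary at 40
active = backcast_sample(field)
# Only the interval containing the boundary is densely sampled.
```

On this example, 15 of the 64 sensors are activated rather than all of them, which is the qualitative energy saving the paper quantifies for correlated fields.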
Rate-distortion optimized tree-structured compression algorithms for piecewise smooth images
IEEE Transactions on Image Processing, 2005
Cited by 69 (16 self)

Abstract: This paper presents novel coding algorithms based on tree-structured segmentation, which achieve the correct asymptotic rate-distortion (R-D) behavior for a simple class of signals, known as piecewise polynomials, by using an R-D based prune-and-join scheme. For the one-dimensional (1-D) case, our scheme is based on binary-tree segmentation of the signal. This scheme approximates the signal segments using polynomial models and utilizes an R-D optimal bit allocation strategy among the different signal segments. The scheme further encodes similar neighbors jointly to achieve the correct exponentially decaying R-D behavior, D(R) ∼ c0 2^(−c1 R), thus improving over classic wavelet schemes. We also prove that the computational complexity of the scheme is O(N log N). We then show the extension of this scheme to the two-dimensional (2-D) case using a quadtree. This quadtree coding scheme also achieves an exponentially decaying R-D behavior, for the polygonal image model composed of a white polygon-shaped object against a uniform black background, with low computational cost of O(N log N). Again, the key is an R-D optimized prune-and-join strategy. Finally, we conclude with numerical results, which show that the proposed quadtree coding scheme outperforms JPEG 2000 by about 1 dB for real images, such as Cameraman, at low rates of around 0.15 bpp.
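The prune step of such a scheme can be sketched for the 1-D case with constant (degree-0) fits: each tree node compares the Lagrangian cost D + λR of coding its segment as a single leaf against the total cost of its two children, and keeps the cheaper option. A minimal sketch (the fixed per-leaf rate and the constant model are my simplifying assumptions; the paper uses polynomial models, joint coding of neighbors, and an optimized bit allocation):

```python
def rd_prune(signal, lam, bits_per_leaf=8):
    """R-D optimized binary-tree pruning sketch with constant fits.
    Returns (total Lagrangian cost, list of leaf segments [lo, hi))."""
    def best(lo, hi):
        seg = signal[lo:hi]
        mean = sum(seg) / len(seg)
        dist = sum((s - mean) ** 2 for s in seg)          # distortion D
        leaf_cost = dist + lam * bits_per_leaf            # D + lam * R
        if hi - lo > 1:
            mid = (lo + hi) // 2
            lc, lsegs = best(lo, mid)
            rc, rsegs = best(mid, hi)
            if lc + rc < leaf_cost:                       # split wins
                return lc + rc, lsegs + rsegs
        return leaf_cost, [(lo, hi)]                      # prune to a leaf
    return best(0, len(signal))

# A piecewise-constant signal is pruned down to exactly two leaves:
cost, segs = rd_prune([0.0] * 8 + [5.0] * 8, lam=0.1)
```

Sweeping λ trades rate against distortion: a larger λ penalizes leaves and yields a coarser segmentation.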
Nonsubsampled Contourlet Transform: Filter Design and Applications in Denoising
Cited by 57 (4 self)

Abstract: In this paper we study the nonsubsampled contourlet transform. We address the corresponding filter design problem using the McClellan transformation. We show how zeros can be imposed in the filters so that the iterated structure produces regular basis functions. The proposed design framework yields filters that can be implemented efficiently through a lifting factorization. We apply the constructed transform to image noise removal, where the results obtained are comparable to the state of the art, and superior in some cases.
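Nonsubsampled filter banks keep shift invariance by upsampling the filters (the "algorithme à trous") instead of downsampling the signal. A generic 1-D sketch of that mechanism (illustrative only; it does not reproduce the paper's 2-D McClellan-based design or its lifting factorization):

```python
def upsample_filter(h, level):
    """Insert 2**level - 1 zeros between filter taps ('a trous')."""
    if level == 0:
        return list(h)
    gap = 2 ** level - 1
    out = []
    for i, tap in enumerate(h):
        out.append(tap)
        if i < len(h) - 1:
            out.extend([0.0] * gap)
    return out

def convolve(x, h):
    """Full linear convolution; no subsampling is applied, so the
    resulting decomposition stays shift-invariant."""
    y = [0.0] * (len(x) + len(h) - 1)
    for i, xi in enumerate(x):
        for j, hj in enumerate(h):
            y[i + j] += xi * hj
    return y

# At level j, the same filter acts at a coarser scale without any
# decimation of the signal:
detail = convolve([0.0, 0.0, 1.0, 0.0, 0.0], upsample_filter([1.0, -1.0], 1))
```

The redundancy this introduces (every level keeps the full signal length) is exactly what makes nonsubsampled transforms attractive for denoising.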
Minimax bounds for active learning
In COLT, 2007
Cited by 56 (5 self)

Abstract: This paper aims to shed light on achievable limits in active learning. Using minimax analysis techniques, we study the achievable rates of classification error convergence for broad classes of distributions characterized by decision boundary regularity and noise conditions. The results clearly indicate the conditions under which one can expect significant gains through active learning. Furthermore, we show that the learning rates derived are tight for "boundary fragment" classes in d-dimensional feature spaces when the feature marginal density is bounded from above and below.
Recovering Edges in Ill-Posed Inverse Problems: Optimality of Curvelet Frames
, 2000
Cited by 56 (14 self)

Abstract: We consider a model problem of recovering a function f(x1, x2) from noisy Radon data. The function f to be recovered is assumed smooth apart from a discontinuity along a C^2 curve, i.e. an edge. We use the continuum white noise model, with noise level ε. Traditional linear methods for solving such inverse problems behave poorly in the presence of edges. Qualitatively, the reconstructions are blurred near the edges; quantitatively, in our model they give mean squared errors (MSEs) that tend to zero with noise level ε only as O(ε^(1/2)) as ε → 0. A recent innovation, nonlinear shrinkage in the wavelet domain, visually improves edge sharpness and improves MSE convergence to O(ε^(2/3)). However, as we show here, this rate is not optimal. In fact, essentially optimal performance is obtained by deploying the recently introduced tight frames of curvelets in this setting. Curvelets are smooth, highly anisotropic elements ideally suited for detecting and synthesizing curved edges. To deploy them in the Radon setting, we construct a curvelet-based biorthogonal decomposition ...
Beamlets and Multiscale Image Analysis
In Multiscale and Multiresolution Methods, 2001
Cited by 55 (16 self)

Abstract: We describe a framework for multiscale image analysis in which line segments play a role analogous to the role played by points in wavelet analysis.