Results 1-10 of 49
Sampling—50 years after Shannon
 Proceedings of the IEEE
, 2000
Cited by 207 (22 self)
This paper presents an account of the current state of sampling, 50 years after Shannon’s formulation of the sampling theorem. The emphasis is on regular sampling, where the grid is uniform. This topic has benefited from a strong research revival during the past few years, thanks in part to the mathematical connections that were made with wavelet theory. To introduce the reader to the modern, Hilbert-space formulation, we reinterpret Shannon’s sampling procedure as an orthogonal projection onto the subspace of bandlimited functions. We then extend the standard sampling paradigm for a representation of functions in the more general class of “shift-invariant” function spaces, including splines and wavelets. Practically, this allows for simpler—and possibly more realistic—interpolation models, which can be used in conjunction with a much wider class of (anti-aliasing) prefilters that are not necessarily ideal lowpass. We summarize and discuss the results available for the determination of the approximation error and of the sampling rate when the input of the system is essentially arbitrary; e.g., non-bandlimited. We also review variations of sampling that can be understood from the same unifying perspective. These include wavelets, multiwavelets, Papoulis generalized sampling, finite elements, and frames. Irregular sampling and radial basis functions are briefly mentioned. Keywords—Bandlimited functions, Hilbert spaces, interpolation, least squares approximation, projection operators, sampling.
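The classical procedure this abstract starts from can be sketched directly: reconstruction of a bandlimited signal from uniform samples by a (truncated) Shannon sinc series. Function names and parameters below are illustrative, not taken from the paper.

```python
import numpy as np

def sinc_reconstruct(samples, T, t):
    """Truncated Shannon series: f(t) ~ sum_k f(kT) sinc((t - kT)/T).
    Exact (as the sum grows) only for signals bandlimited below 1/(2T)."""
    k = np.arange(len(samples))
    # np.sinc is the normalized sinc, sin(pi x)/(pi x)
    return float(np.sum(samples * np.sinc((t - k * T) / T)))

T = 1 / 8                                    # 8 Hz sampling of a 1 Hz cosine
k = np.arange(64)
samples = np.cos(2 * np.pi * k * T)
approx = sinc_reconstruct(samples, T, 3.3)   # a point between sample instants
exact = np.cos(2 * np.pi * 3.3)
# approx is close to exact; truncating the series leaves a small error
```

At a sample instant the sinc terms collapse to a single sample, so the formula interpolates the data exactly; between instants the truncation error shrinks as more samples are used.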
A chronology of interpolation: From ancient astronomy to modern signal and image processing
 Proceedings of the IEEE
, 2002
Cited by 61 (0 self)
This paper presents a chronological overview of the developments in interpolation theory, from the earliest times to the present date. It brings out the connections between the results obtained in different ages, thereby putting the techniques currently used in signal and image processing into historical perspective. A summary of the insights and recommendations that follow from relatively recent theoretical as well as experimental studies concludes the presentation. Keywords—Approximation, convolution-based interpolation, history, image processing, polynomial interpolation, signal processing, splines. “It is an extremely useful thing to have knowledge of the true origins of memorable discoveries, especially those that have been found not by accident but by dint of meditation. It is not so much that thereby history may attribute to each man his own discoveries and others should be encouraged to earn like commendation, as that the art of making discoveries should be extended by considering noteworthy examples of it.”
Quantitative evaluation of convolution-based methods for medical image interpolation
 Medical Image Analysis
, 2001
Cited by 35 (2 self)
Abstract—Interpolation is required in a variety of medical image processing applications. Although many interpolation techniques are known from the literature, evaluations of these techniques for the specific task of applying geometrical transformations to medical images are still lacking. In this paper we present such an evaluation. We consider convolution-based interpolation methods and rigid transformations (rotations and translations). A large number of sinc-approximating kernels are evaluated, including piecewise polynomial kernels and a large number of windowed sinc kernels, with spatial supports ranging from two to ten grid intervals. In the evaluation we use images from a wide variety of medical image modalities. The results show that spline interpolation is to be preferred over all other methods, both for its accuracy and its relatively low computational cost. Keywords—Convolution-based interpolation, spline interpolation, piecewise polynomial kernels, windowed sinc kernels, geometrical transformation, medical images, quantitative evaluation.
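One of the piecewise polynomial kernels in the family this paper evaluates, Keys' cubic convolution, fits in a few lines; the resampling helper and the test values below are illustrative.

```python
import numpy as np

def keys_cubic(x, a=-0.5):
    """Keys' cubic convolution kernel (support of four grid intervals);
    a = -0.5 gives third-order accuracy."""
    x = abs(x)
    if x < 1:
        return (a + 2) * x**3 - (a + 3) * x**2 + 1
    if x < 2:
        return a * x**3 - 5 * a * x**2 + 8 * a * x - 4 * a
    return 0.0

def resample(samples, t):
    """Interpolate uniformly spaced samples (unit spacing) at position t."""
    i0 = int(np.floor(t))
    return sum(samples[i] * keys_cubic(t - i)
               for i in range(i0 - 1, i0 + 3) if 0 <= i < len(samples))

vals = [0.0, 1.0, 4.0, 9.0, 16.0, 25.0]   # f(x) = x^2 on a unit grid
print(resample(vals, 2.5))                 # 6.25: quadratics are reproduced
```

With a = -0.5 the kernel reproduces polynomials up to degree two exactly at interior points, which is why the printed value matches x^2 at 2.5.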
Minimum rate sampling and reconstruction of signals with arbitrary frequency support
 IEEE Trans. Inform. Theory
, 1999
Cited by 33 (0 self)
Abstract—We examine the question of reconstruction of signals from periodic nonuniform samples. This involves discarding samples from a uniformly sampled signal in some periodic fashion. We give a characterization of the signals that can be reconstructed at exactly the minimum rate once a nonuniform sampling pattern has been fixed. We give an implicit characterization of the reconstruction system, and a design method by which the ideal reconstruction filters may be approximated. We demonstrate that for certain spectral supports the minimum rate can be approached or achieved using reconstruction schemes of much lower complexity than those arrived at by using spectral slicing, as in earlier work. Previous work on multiband signals has typically either made restrictive assumptions on the sizes and positions of the bands or approached the minimum rate only asymptotically. We show that the class of multiband signals which can be reconstructed exactly is far larger than previously considered. When approaching the minimum rate, this freedom allows us, in certain cases, to use a far less complex reconstruction system. Index Terms—Multiband, nonuniform, reconstruction, sampling.
Quadratic Interpolation for Image Resampling
, 1997
Cited by 30 (3 self)
Nearest-neighbour, linear, and various cubic interpolation functions are frequently used in image resampling. Quadratic functions have been disregarded, largely because they have been thought to introduce phase distortions. This is shown not to be the case, and a family of quadratic functions is derived. The interpolating member of this family has visual quality close to that of the Catmull-Rom cubic, yet requires only sixty percent of the computation time.
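A hedged sketch of the idea: the interpolating member of such a quadratic family is a piecewise polynomial supported on three grid intervals. The closed form below is one common formulation consistent with this kind of derivation; the helper names and test values are illustrative.

```python
def quad_kernel(x):
    """Interpolating member of a quadratic kernel family
    (support of three grid intervals); one common closed form."""
    x = abs(x)
    if x <= 0.5:
        return -2.0 * x * x + 1.0
    if x <= 1.5:
        return x * x - 2.5 * x + 1.5
    return 0.0

def resample_quad(samples, t):
    c = int(t + 0.5)          # nearest sample index (t >= 0 assumed)
    return sum(samples[i] * quad_kernel(t - i)
               for i in range(c - 1, c + 2) if 0 <= i < len(samples))

vals = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]   # f(x) = x on a unit grid
print(resample_quad(vals, 2.25))        # 2.25: linear signals are reproduced
```

Each output value touches only three samples, versus four for a cubic, which is the source of the roughly forty percent saving in computation the abstract mentions.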
Structured compressed sensing: From theory to applications
 IEEE Trans. Signal Process
, 2011
Cited by 17 (6 self)
Abstract—Compressed sensing (CS) is an emerging field that has attracted considerable research interest over the past few years. Previous review articles in CS limit their scope to standard discrete-to-discrete measurement architectures using matrices of randomized nature and signal models based on standard sparsity. In recent years, CS has worked its way into several new application areas. This, in turn, necessitates a fresh look at many of the basics of CS. The random matrix measurement operator must be replaced by more structured sensing architectures that correspond to the characteristics of feasible acquisition hardware. The standard sparsity prior has to be extended to include a much richer class of signals and to encode broader data models, including continuous-time signals. In our overview, the theme is exploiting signal and measurement structure in compressive sensing. The prime focus is bridging theory and practice; that is, to pinpoint the potential of structured CS strategies to emerge from the math to the hardware. Our summary highlights new directions as well as relations to more traditional CS, with the hope of serving both as a review to practitioners wanting to join this emerging field, and as a reference for researchers that attempts to put some of the existing ideas in perspective of practical applications. Index Terms—Approximation algorithms, compressed sensing, compression algorithms, data acquisition, data compression, sampling methods.
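For context, the standard discrete-to-discrete setup that this article generalizes can be sketched as follows: a random Gaussian sensing matrix, a sparse vector, and greedy recovery via orthogonal matching pursuit. Dimensions, seed, and variable names are illustrative, and OMP is one standard recovery algorithm, not one prescribed by the article.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, k = 64, 32, 3                  # ambient dimension, measurements, sparsity
A = rng.standard_normal((m, n)) / np.sqrt(m)   # random Gaussian sensing matrix
x = np.zeros(n)
x[[5, 20, 41]] = [1.5, -2.0, 1.0]              # a k-sparse signal
y = A @ x                                      # compressive measurements

# Orthogonal matching pursuit: greedily add the column most correlated
# with the residual, then refit by least squares on the chosen support.
support, r = [], y.copy()
for _ in range(k):
    support.append(int(np.argmax(np.abs(A.T @ r))))
    coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
    r = y - A[:, support] @ coef

x_hat = np.zeros(n)
x_hat[support] = coef
# with high probability over A, x_hat recovers x; r is the final residual
```

The article's point is precisely that in hardware, the unstructured matrix `A` above must give way to structured operators (e.g., with Toeplitz or modulation structure) matched to the acquisition device.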
Image reconstruction by convolution with symmetrical piecewise nth-order polynomial kernels
 IEEE Transactions on Image Processing
, 1999
Cited by 12 (4 self)
Abstract—The reconstruction of images is an important problem in many applications. From sampling theory it is well known that the sinc function is the ideal interpolation kernel which, however, cannot be used in practice. In order to be able to obtain acceptable reconstructions, both in terms of computational speed and mathematical precision, it is required to design a kernel that is of finite extent and resembles the sinc function as much as possible. In this paper, the applicability of a particular class of sinc-approximating symmetrical piecewise nth-order polynomial kernels in satisfying these requirements is investigated. After the presentation of the general concept, kernels of first, third, fifth and seventh order are derived. An objective, quantitative evaluation of the reconstruction capabilities of these kernels is obtained by analyzing the spatial and spectral behavior using different measures and by using them to translate, rotate and magnify a number of real-life test images. From the experiments it is concluded that while the improvement of cubic convolution over linear interpolation is significant, the use of higher-order polynomials yields only marginal improvement. Keywords—Interpolation, image reconstruction, image resampling, piecewise polynomial kernels, cubic convolution, quintic convolution, septic convolution.
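The headline conclusion (cubic convolution improves significantly on linear interpolation) can be illustrated in miniature. The sketch below compares the first-order (linear) and third-order (Keys, a = -0.5) members of this kernel family on a smooth test signal; the test function and grid are illustrative choices, not the paper's experimental protocol.

```python
import numpy as np

def kernel(x, order):
    """First- and third-order sinc-approximating kernels:
    linear interpolation and Keys' cubic convolution (a = -0.5)."""
    x = abs(x)
    if order == 1:
        return max(0.0, 1.0 - x)
    if x < 1:
        return 1.5 * x**3 - 2.5 * x**2 + 1.0
    if x < 2:
        return -0.5 * x**3 + 2.5 * x**2 - 4.0 * x + 2.0
    return 0.0

def interp(samples, t, order):
    i0 = int(np.floor(t))
    return sum(samples[i] * kernel(t - i, order)
               for i in range(i0 - 1, i0 + 3) if 0 <= i < len(samples))

xs = np.arange(32, dtype=float)
samples = np.sin(0.4 * xs)                 # a smooth, non-bandlimited-looking test
ts = np.linspace(8.0, 24.0, 200)           # interior points only
errs = {order: max(abs(interp(samples, t, order) - np.sin(0.4 * t)) for t in ts)
        for order in (1, 3)}
# errs[3] (cubic) is well below errs[1] (linear) on this signal
```

Repeating the comparison with fifth- or seventh-order kernels would show the diminishing returns the abstract reports.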
A generalized sampling theorem for stable reconstructions in arbitrary bases
 J. Fourier Anal. Appl
Cited by 12 (12 self)
We introduce a generalized framework for sampling and reconstruction in separable Hilbert spaces. Specifically, we establish that it is always possible to stably reconstruct a vector in an arbitrary Riesz basis from sufficiently many of its samples in any other Riesz basis. This framework can be viewed as an extension of the well-known consistent reconstruction technique (Eldar et al.). However, whilst the latter imposes stringent assumptions on the reconstruction basis, and may in practice be unstable, our framework allows for recovery in any (Riesz) basis in a manner that is completely stable. Whilst the classical Shannon Sampling Theorem is a special case of our theorem, this framework allows us to exploit additional information about the approximated vector (or, in this case, function), for example sparsity or regularity, to design a reconstruction basis that is better suited. Examples are presented illustrating this procedure.
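In finite dimensions the mechanics reduce to inverting a cross-Gramian between the two bases: the toy sketch below recovers a vector's coefficients in one orthonormal basis from its samples in another. The bases are arbitrary illustrative choices, and this square, fully sampled case does not capture the paper's actual contribution, which concerns stability from sufficiently many samples in infinite-dimensional settings.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 8
U = np.fft.fft(np.eye(n), norm="ortho")           # sampling basis: DFT vectors (rows)
V, _ = np.linalg.qr(rng.standard_normal((n, n)))  # reconstruction basis (columns)

f = rng.standard_normal(n)
b = U @ f                    # "samples" of f taken against the sampling basis
G = U @ V                    # cross-Gramian linking the two bases
c = np.linalg.solve(G, b)    # reconstruction coefficients; stable when G is
f_rec = V @ c                # well conditioned, as in the framework's setting
```

In the oversampled case `G` becomes rectangular and `solve` is replaced by a least-squares fit, which is where the stability guarantees of the framework come into play.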
Nonideal Sampling and Regularization Theory
, 2008
Cited by 12 (2 self)
Shannon’s sampling theory and its variants provide effective solutions to the problem of reconstructing a signal from its samples in some “shift-invariant” space, which may or may not be bandlimited. In this paper, we present some further justification for this type of representation, while addressing the issue of the specification of the best reconstruction space. We consider a realistic setting where a multidimensional signal is prefiltered prior to sampling, and the samples are corrupted by additive noise. We adopt a variational approach to the reconstruction problem and minimize a data fidelity term subject to a Tikhonov-like (continuous-domain) L2-regularization to obtain the continuous-space solution. We present theoretical justification for the minimization of this cost functional and show that the globally minimal continuous-space solution belongs to a shift-invariant space generated by a function (generalized B-spline) that is generally not bandlimited. When the sampling is ideal, we recover some of the classical smoothing spline estimators. The optimal reconstruction space is characterized by a condition that links the generating function to the regularization operator and implies the existence of a B-spline-like basis. To make the scheme practical, we specify the generating functions corresponding to the most popular families of regularization operators (derivatives, iterated Laplacian), as well as a new, generalized one that leads to a new brand of Matérn splines. We conclude the paper by proposing a stochastic interpretation of the reconstruction algorithm and establishing an equivalence with the minimax and minimum mean square error (MMSE/Wiener) solutions of the generalized sampling problem.
Orthogonal Sampling Formulas: A Unified Approach
 SIAM Rev
, 2000
Cited by 11 (0 self)
Abstract. This paper intends to serve as an educational introduction to sampling theory. Basically, sampling theory deals with the reconstruction of functions (signals) through their values (samples) on an appropriate sequence of points by means of sampling expansions involving these values. In order to obtain such sampling expansions in a unified way, we propose an inductive procedure leading to various orthogonal formulas. This procedure, which we illustrate with a number of examples, closely parallels the theory of orthonormal bases in a Hilbert space. All intermediate steps will be described in detail, so that the presentation is self-contained. The required mathematical background is a basic knowledge of Hilbert space theory. Finally, despite the introductory level, some hints are given on more advanced problems in sampling theory, which we motivate through the examples. Key words. orthonormal bases, sampling expansions, reproducing kernel Hilbert spaces