Results 1–10 of 39
Sampling—50 years after Shannon
 Proceedings of the IEEE
, 2000
Cited by 211 (22 self)
This paper presents an account of the current state of sampling, 50 years after Shannon’s formulation of the sampling theorem. The emphasis is on regular sampling, where the grid is uniform. This topic has benefited from a strong research revival during the past few years, thanks in part to the mathematical connections that were made with wavelet theory. To introduce the reader to the modern, Hilbert-space formulation, we reinterpret Shannon’s sampling procedure as an orthogonal projection onto the subspace of bandlimited functions. We then extend the standard sampling paradigm for a representation of functions in the more general class of “shift-invariant” function spaces, including splines and wavelets. Practically, this allows for simpler—and possibly more realistic—interpolation models, which can be used in conjunction with a much wider class of (antialiasing) prefilters that are not necessarily ideal lowpass. We summarize and discuss the results available for the determination of the approximation error and of the sampling rate when the input of the system is essentially arbitrary; e.g., non-bandlimited. We also review variations of sampling that can be understood from the same unifying perspective. These include wavelets, multiwavelets, Papoulis generalized sampling, finite elements, and frames. Irregular sampling and radial basis functions are briefly mentioned.
Keywords—Bandlimited functions, Hilbert spaces, interpolation, least squares approximation, projection operators, sampling,
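For the ideal bandlimited case, the orthogonal-projection view above reduces to the familiar sinc-series reconstruction. As an illustrative sketch (not code from the paper, and truncated to a finite sample record), the formula f(t) = sum_k f(kT) sinc((t - kT)/T) can be written as:

```python
import math

def sinc(x):
    # normalized sinc: sin(pi x) / (pi x), with sinc(0) = 1
    return 1.0 if x == 0 else math.sin(math.pi * x) / (math.pi * x)

def shannon_reconstruct(samples, T, t):
    # truncated Shannon series: f(t) = sum_k f(kT) * sinc((t - kT) / T)
    return sum(s * sinc((t - k * T) / T) for k, s in enumerate(samples))

# a unit sample at k = 0 reproduces the sinc kernel itself
print(shannon_reconstruct([1.0, 0.0, 0.0, 0.0], T=1.0, t=0.5))  # ~0.6366 (= 2/pi)
```

Truncating the series introduces an error for general bandlimited inputs; the abstract's point is precisely that quantifying such approximation errors is cleaner in the projection framework.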
Restoration of a Single Superresolution Image from Several Blurred, Noisy, and Undersampled Measured Images
, 1997
Cited by 208 (20 self)
The three main tools in the single image restoration theory are the maximum likelihood (ML) estimator, the maximum a posteriori probability (MAP) estimator, and the set theoretic approach using projection onto convex sets (POCS). This paper utilizes the above known tools to propose a unified methodology toward the more complicated problem of superresolution restoration. In the superresolution restoration problem, an improved resolution image is restored from several geometrically warped, blurred, noisy, and downsampled measured images. The superresolution restoration problem is modeled and analyzed from the ML, the MAP, and POCS points of view, yielding a generalization of the known superresolution restoration methods. The proposed restoration approach is general but assumes explicit knowledge of the linear space- and time-variant blur, the (additive Gaussian) noise, the different measured resolutions, and the (smooth) motion characteristics. A hybrid method combining the simplicity of the ML and the incorporation of non-ellipsoid constraints is presented, giving improved restoration performance, compared with the ML and the POCS approaches. The hybrid method is shown to converge to the unique optimal solution of a new definition of the optimization problem. Superresolution restoration from motionless measurements is also discussed. Simulations demonstrate the power of the proposed methodology.
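Of the three tools the abstract names, POCS is the easiest to show in isolation. The toy sketch below is not the paper's restoration model; it simply alternates projections onto two convex sets, a box constraint and a single linear data-consistency constraint, and converges to a point in their intersection (all names are illustrative):

```python
def project_box(x, lo, hi):
    # projection onto the box constraint {x : lo <= x_i <= hi}
    return [min(max(v, lo), hi) for v in x]

def project_hyperplane(x, a, b):
    # projection onto the data-consistency set {x : <a, x> = b}
    residual = b - sum(ai * xi for ai, xi in zip(a, x))
    scale = residual / sum(ai * ai for ai in a)
    return [xi + scale * ai for ai, xi in zip(a, x)]

def pocs(x, a, b, lo, hi, iters=50):
    # alternating projections: each step maps x into one set, then the other
    for _ in range(iters):
        x = project_box(project_hyperplane(x, a, b), lo, hi)
    return x

# starting outside both sets, the iterates settle in the intersection
x = pocs([2.0, 2.0], a=[1.0, 1.0], b=1.0, lo=0.0, hi=1.0)  # -> [0.5, 0.5]
```

In superresolution restoration, the convex sets would instead encode consistency with each measured low-resolution image plus priors such as positivity, but the projection machinery is the same.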
A Fast Super-Resolution Reconstruction Algorithm for Pure Translational Motion and Common Space-Invariant Blur
, 2001
Cited by 67 (10 self)
This paper addresses the problem of recovering a superresolved image from a set of warped, blurred, and decimated versions thereof. Several algorithms have already been proposed for the solution of this general problem. In this paper, we concentrate on a special case where the warps are pure translations, the blur is space-invariant and the same for all the images, and the noise is white. We exploit previous results to develop a new highly efficient superresolution reconstruction algorithm for this case, which separates the treatment into deblurring and measurements fusion. The fusion part is shown to be a very simple non-iterative algorithm, preserving the optimality of the entire reconstruction process, in the maximum-likelihood sense. Simulations demonstrate the capabilities of the proposed algorithm.
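For intuition, the non-iterative fusion step can be sketched for the simplest sub-case: integer shifts on the high-resolution grid of a 1-D signal. Each low-resolution sample is placed at its shifted high-resolution position and overlapping measurements are averaged; `fuse` and its arguments are illustrative names under these simplifying assumptions, not the paper's algorithm verbatim:

```python
def fuse(lowres_images, shifts, r):
    # shift-and-add fusion: place each low-res sample on the high-res grid
    # (decimation factor r) at its shifted position, averaging overlaps
    n = len(lowres_images[0]) * r
    acc = [0.0] * n
    cnt = [0] * n
    for img, s in zip(lowres_images, shifts):
        for k, v in enumerate(img):
            pos = k * r + s
            if 0 <= pos < n:
                acc[pos] += v
                cnt[pos] += 1
    return [a / c if c else 0.0 for a, c in zip(acc, cnt)]

# two 2:1-decimated views with shifts 0 and 1 tile the high-res grid exactly
print(fuse([[1, 3], [2, 4]], shifts=[0, 1], r=2))  # [1.0, 2.0, 3.0, 4.0]
```

The fused result is still blurred by the common space-invariant kernel; a single deblurring pass on it completes the reconstruction, which is what makes this two-stage split so efficient.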
Interpolation and the Discrete Papoulis-Gerchberg Algorithm
 IEEE Trans. Signal Processing
, 1994
Cited by 32 (20 self)
In this paper we analyze the performance of an iterative algorithm, similar to the discrete Papoulis-Gerchberg algorithm, which can be used to recover missing samples in finite-length records of bandlimited data. No assumptions are made regarding the distribution of the missing samples, in contrast with the often studied extrapolation problem, in which the known samples are grouped together. Indeed, it is possible to regard the observed signal as a sampled version of the original one, and to interpret the reconstruction result studied herein as a sampling result. We show that the iterative algorithm converges if the density of the sampling set exceeds a certain minimum value which naturally increases with the bandwidth of the data. We give upper and lower bounds for the error as a function of the number of iterations, together with the signals for which the bounds are attained. Also, we analyze the effect of a relaxation constant present in the algorithm on the spectral radius of the iteration matrix. From this analysis we infer the optimum value of the relaxation constant. We also point out, among all sampling sets with the same density, those for which the convergence rate of the recovery algorithm is maximum or minimum. For lowpass signals it turns out that the best convergence rates result when the distances among the missing samples are a multiple of a certain integer. The worst convergence rates generally occur when the missing samples are contiguous.
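A minimal sketch of the kind of iteration analyzed here, assuming a length-8 record and a lowpass DFT support, and omitting the paper's relaxation constant: alternately project onto the bandlimited signals (zero the out-of-band DFT bins) and reinsert the known samples. The O(n^2) DFT is for clarity only; all names are illustrative:

```python
import cmath
import math

def dft(x):
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * cmath.pi * f * t / n)
                for t in range(n)) for f in range(n)]

def idft(X):
    n = len(X)
    return [sum(X[f] * cmath.exp(2j * cmath.pi * f * t / n)
                for f in range(n)) / n for t in range(n)]

def pg_recover(samples, known, band, iters=200):
    # alternate two projections: bandlimit (keep only DFT bins in `band`),
    # then restore the trusted values at the `known` positions
    x = [complex(v) for v in samples]
    orig = list(x)
    for _ in range(iters):
        X = dft(x)
        x = idft([X[f] if f in band else 0.0 for f in range(len(X))])
        for i in known:
            x[i] = orig[i]
    return [v.real for v in x]

# recover one missing sample of cos(2*pi*t/8), whose DFT support is {1, 7}
n = 8
sig = [math.cos(2 * math.pi * t / n) for t in range(n)]
obs = list(sig)
obs[3] = 0.0  # the missing sample, initialized arbitrarily
rec = pg_recover(obs, known={0, 1, 2, 4, 5, 6, 7}, band={0, 1, 7})
```

Here the error at the missing position contracts by a fixed factor per iteration (the diagonal entry of the bandlimiting projector), which is exactly the spectral-radius behavior the paper bounds; the relaxation constant it studies tunes that contraction rate.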
Minimum rate sampling and reconstruction of signals with arbitrary frequency support
 IEEE Trans. Inform. Theory
, 1999
Cited by 31 (0 self)
We examine the question of reconstruction of signals from periodic nonuniform samples. This involves discarding samples from a uniformly sampled signal in some periodic fashion. We give a characterization of the signals that can be reconstructed at exactly the minimum rate once a nonuniform sampling pattern has been fixed. We give an implicit characterization of the reconstruction system, and a design method by which the ideal reconstruction filters may be approximated. We demonstrate that for certain spectral supports the minimum rate can be approached or achieved using reconstruction schemes of much lower complexity than those arrived at by using spectral slicing, as in earlier work. Previous work on multiband signals has typically treated classes for which restrictive assumptions on the sizes and positions of the bands were made, or where the minimum rate was only approached asymptotically. We show that the class of multiband signals which can be reconstructed exactly is far larger than previously considered. When approaching the minimum rate, this freedom allows us, in certain cases, to use a far less complex reconstruction system.
Index Terms—Multiband, nonuniform, reconstruction, sampling.
Filterbank reconstruction of bandlimited signals from nonuniform and generalized samples
 IEEE TRANS. SIGNAL PROCESSING
, 2000
Cited by 28 (5 self)
This paper introduces a filterbank interpretation of various sampling strategies, which leads to efficient interpolation and reconstruction methods. An identity, which is referred to as the Interpolation Identity, is developed and is used to obtain particularly efficient discrete-time systems for interpolation of generalized samples, as well as a class of nonuniform samples, to uniform Nyquist samples, either for further processing in that form or for conversion to continuous time. The Interpolation Identity also leads to new sampling strategies including an extension of Papoulis’ generalized sampling expansion.
Reconstruction of Bandlimited Periodic Nonuniformly Sampled Signals through Multirate Filter Banks
 IEEE Transactions on Circuits and Systems I: Regular Papers
, 2003
Cited by 15 (0 self)
A bandlimited signal can be recovered from its periodic nonuniformly spaced samples provided the average sampling rate is at least the Nyquist rate. A multirate filter bank structure is used to both model this nonuniform sampling (through the analysis bank) and reconstruct a uniformly sampled sequence (through the synthesis bank). Several techniques for modeling the nonuniform sampling are presented for various cases of sampling. Conditions on the filter bank structure are used to accurately reconstruct uniform samples of the input signal at the Nyquist rate. Several examples and simulation results are presented, with emphasis on forms of nonuniform sampling that may be useful in mixed-signal integrated circuits.
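The analysis-bank modeling can be illustrated with plain polyphase decomposition: branch p of the bank carries the samples at positions congruent to p modulo the sampling period, so periodic nonuniform sampling corresponds to keeping only some branches. In the degenerate case where every branch is kept, interleaving the branches back is already perfect reconstruction (a toy sketch with illustrative names, not the paper's filter design):

```python
def analysis(x, period):
    # polyphase analysis bank: branch p carries samples at n ≡ p (mod period);
    # periodic nonuniform sampling would discard some of these branches
    return [x[p::period] for p in range(period)]

def synthesis(branches, period):
    # trivial synthesis bank for the all-branches case: interleave back
    n = sum(len(b) for b in branches)
    y = [0] * n
    for p, b in enumerate(branches):
        for k, v in enumerate(b):
            y[k * period + p] = v
    return y

x = list(range(8))
assert synthesis(analysis(x, 4), 4) == x  # perfect reconstruction
```

Once branches are discarded, the interleaver no longer suffices and the synthesis bank needs the reconstruction filters whose design conditions the paper develops.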
Deinterlacing: An Overview
 Proceedings of the IEEE
, 1998
Cited by 14 (2 self)
The question “to interlace or not to interlace” divides the television and the personal computer communities. A proper answer requires a common understanding of what is possible nowadays in deinterlacing video signals. This paper outlines the most relevant proposals, ranging from simple linear methods to advanced motion-compensated algorithms, and provides a relative performance comparison for 12 of these methods. Next to objective performance indicators, screen photographs have been used to illustrate typical artifacts of individual deinterlacers. The overview provides no final answer in the interlace debate, as such an answer would also require balancing technical and nontechnical issues.
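The simplest of the linear methods surveyed is intra-field line averaging: a field holds every other line of the frame, and each missing line is filled from its vertical neighbours. The sketch below uses illustrative names and a list-of-rows image layout, not code from the paper:

```python
def bob_deinterlace(field, parity, height):
    # intra-field line averaging: copy the field's lines into the frame
    # (parity 0 -> even lines, 1 -> odd lines), then average neighbours
    frame = [None] * height
    for i, row in enumerate(field):
        frame[2 * i + parity] = list(row)
    for y in range(height):
        if frame[y] is None:
            above = frame[y - 1] if y > 0 else frame[y + 1]
            below = frame[y + 1] if y + 1 < height else frame[y - 1]
            frame[y] = [(a + b) / 2 for a, b in zip(above, below)]
    return frame

# even field of a 4-line, 2-pixel-wide frame; missing lines are interpolated
print(bob_deinterlace([[0, 0], [4, 4]], parity=0, height=4))
# [[0, 0], [2.0, 2.0], [4, 4], [4.0, 4.0]]
```

This method halves vertical resolution and flickers on static detail, which is exactly the kind of artifact the paper's motion-compensated methods are designed to avoid.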
Sampling of Bandlimited Functions on Unions of Shifted Lattices
 J. Fourier Anal. Appl
, 2000
Cited by 9 (1 self)
We consider Shannon sampling theory for sampling sets which are unions of shifted lattices. These sets are not necessarily periodic. A function f can be reconstructed from its samples provided the sampling set and the support of the Fourier transform of f satisfy certain compatibility conditions. While explicit reconstruction formulas are possible, it is most convenient to use a recursive algorithm. The analysis is presented in the general framework of locally compact abelian groups, but several specific examples are given, including a numerical example implemented in MATLAB.
2000 Mathematics Subject Classification: 94A20, 94A12, 43A25, 42B99
Key words: Shannon sampling, multidimensional sampling, nonuniform sampling, periodic sampling, nonperiodic sampling, irregular sampling, locally compact abelian groups.
Mathematics Department, Western Oregon University, Monmouth, Oregon 97361; Department of Mathematics, Oregon State University, Corvallis, OR 97331. This work was supported by ...