Results 1-10 of 72
Splines: A Perfect Fit for Signal/Image Processing
IEEE Signal Processing Magazine, 1999
Multiresolution Modeling: Survey & Future Opportunities
1999
Abstract

Cited by 118 (7 self)
For twenty years, it has been clear that many datasets are excessively complex for applications such as real-time display, and that techniques for controlling the level of detail of models are crucial. More recently, there has been considerable interest in techniques for the automatic simplification of highly detailed polygonal models into faithful approximations using fewer polygons. Several effective techniques for the automatic simplification of polygonal models have been developed in recent years. This report begins with a survey of the most notable available algorithms. Iterative edge contraction algorithms are of particular interest because they induce a certain hierarchical structure on the surface. An overview of this hierarchical structure is presented, including a formulation relating it to minimum spanning tree construction algorithms. Finally, we will consider the most significant directions in which existing simplification methods can be improved, and a summary of o...
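The iterative edge-contraction idea described in this abstract can be sketched as a greedy loop that repeatedly collapses the cheapest remaining edge. In the sketch below, the squared-length cost and midpoint placement are illustrative stand-ins for the quadric-style error metrics the surveyed algorithms use; all names are hypothetical.

```python
import numpy as np

def simplify(points, edge_list, target):
    """Greedy edge contraction: repeatedly collapse the shortest edge to its
    midpoint until `target` vertices remain. The squared-length cost is an
    illustrative stand-in for quadric-style error metrics."""
    verts = {i: np.asarray(p, float) for i, p in enumerate(points)}
    edges = {frozenset(e) for e in edge_list if e[0] != e[1]}
    while len(verts) > target and edges:
        cost = lambda f: float(np.sum(np.subtract(*[verts[v] for v in f]) ** 2))
        e = min(edges, key=cost)
        a, b = sorted(e)
        verts[a] = 0.5 * (verts[a] + verts[b])      # midpoint placement
        del verts[b]
        # Reconnect b's incident edges to a; drop contracted/degenerate edges.
        edges = {frozenset(a if v == b else v for v in f) for f in edges if f != e}
        edges = {f for f in edges if len(f) == 2}
    return verts, edges

# Tiny example: collapse one edge of a square.
verts, edges = simplify([(0, 0), (1, 0), (1, 1), (0, 1)],
                        [(0, 1), (1, 2), (2, 3), (3, 0)], target=3)
```

Contracting edges in ascending cost order is also what connects the resulting hierarchy to Kruskal-style minimum spanning tree construction, in the spirit of the MST formulation mentioned above.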
Efficient multiscale regularization with applications to the computation of optical flow
IEEE Trans. Image Process., 1994
Abstract

Cited by 97 (33 self)
Abstract—A new approach to regularization methods for image processing is introduced and developed, using as a vehicle the problem of computing dense optical flow fields in an image sequence. Standard formulations of this problem require the computationally intensive solution of an elliptic partial differential equation that arises from the often-used “smoothness constraint” regularization. The interpretation of the smoothness constraint as a “fractal prior” is used to motivate regularization based on a recently introduced class of multiscale stochastic models. The solution of the new problem formulation is computed with an efficient multiscale algorithm. Experiments on several image sequences demonstrate the substantial computational savings that can be achieved: the algorithm is non-iterative and in fact has a per-pixel computational complexity that is independent of image size. The new approach also has a number of other important advantages. Specifically, multiresolution flow field estimates are available, allowing great flexibility in dealing with the trade-off between resolution and accuracy. Multiscale error covariance information is also available, which is of considerable use in assessing the accuracy of the estimates. In particular, these error statistics can be used as the basis for a rational procedure for determining the spatially varying optimal reconstruction resolution. Furthermore, if there are compelling reasons to insist upon a standard smoothness constraint, our algorithm provides an excellent initialization for the iterative algorithms associated with the smoothness-constraint problem formulation. Finally, the usefulness of our approach should extend to a wide variety of ill-posed inverse problems in which variational techniques seeking a “smooth” solution are generally used.
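The non-iterative, coarse-to-fine flavor of such multiscale estimators can be conveyed with a toy two-sweep smoother. This is not the paper's stochastic-model algorithm; the dyadic structure and the fixed blending weight `w` are illustrative assumptions.

```python
import numpy as np

def multiscale_smooth(y, w=0.5):
    """Toy two-sweep dyadic smoother: an upward (fine-to-coarse) averaging
    sweep, then a downward sweep that blends each scale with the data.
    Expects len(y) to be a power of two; w is the coarse-scale weight."""
    levels = [np.asarray(y, float)]
    while len(levels[-1]) > 1:                     # upward sweep: dyadic averages
        v = levels[-1]
        levels.append(0.5 * (v[0::2] + v[1::2]))
    est = levels[-1]
    for v in reversed(levels[:-1]):                # downward sweep: blend with data
        est = (1 - w) * v + w * np.repeat(est, 2)
    return est
```

Both sweeps together cost O(n), i.e., constant work per sample regardless of signal length, echoing the size-independent per-pixel complexity claimed in the abstract.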
Interactive SkeletonDriven Dynamic Deformations
ACM Transactions on Graphics, 2002
Abstract

Cited by 74 (1 self)
This paper presents a framework for the skeleton-driven animation of elastically deformable characters. A character is embedded in a coarse volumetric control lattice, which provides the structure needed to apply the finite element method. To incorporate skeletal controls, we introduce line constraints along the bones of simple skeletons. The bones are made to coincide with edges of the control lattice, which enables us to apply the constraints efficiently using algebraic methods. To accelerate computation, we associate regions of the volumetric mesh with particular bones and perform locally linearized simulations, which are blended at each time step. We define a hierarchical basis on the control lattice, so for detailed interactions the simulation can adapt the level of detail. We demonstrate the ability to animate complex models using simple skeletons and coarse volumetric meshes in a manner that simulates secondary motions at interactive rates.
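The per-time-step blending of locally linearized, per-region simulations can be pictured as a weighted combination of each region's vertex positions. The function below is a generic sketch; the weighting scheme and names are illustrative, not the paper's formulation.

```python
import numpy as np

def blend(region_pos, weights):
    """Blend R locally simulated position sets (shape (R, N, 3)) into one mesh
    state, using per-vertex region weights (shape (R, N)) normalized per vertex."""
    w = np.asarray(weights, float)
    w = w / w.sum(axis=0, keepdims=True)           # make weights sum to 1 per vertex
    return np.einsum('rn,rnk->nk', w, np.asarray(region_pos, float))
```

A vertex fully owned by one region simply takes that region's simulated position; vertices near region boundaries get a smooth mix, which is what suppresses visible seams between the locally linearized simulations.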
A chronology of interpolation: From ancient astronomy to modern signal and image processing
Proceedings of the IEEE, 2002
Abstract

Cited by 61 (0 self)
This paper presents a chronological overview of the developments in interpolation theory, from the earliest times to the present date. It brings out the connections between the results obtained in different ages, thereby putting the techniques currently used in signal and image processing into historical perspective. A summary of the insights and recommendations that follow from relatively recent theoretical as well as experimental studies concludes the presentation.

Keywords—Approximation, convolution-based interpolation, history, image processing, polynomial interpolation, signal processing, splines.

“It is an extremely useful thing to have knowledge of the true origins of memorable discoveries, especially those that have been found not by accident but by dint of meditation. It is not so much that thereby history may attribute to each man his own discoveries and others should be encouraged to earn like commendation, as that the art of making discoveries should be extended by considering noteworthy examples of it.”
A Multiresolution Framework for Dynamic Deformations
2002
Abstract

Cited by 59 (2 self)
We present a novel framework for dynamic simulation of elastically deformable solids. Our approach combines classical finite element methodology with subdivision wavelets to meet the needs of computer graphics applications. We represent deformations using a wavelet basis constructed from volumetric Catmull-Clark subdivision. Catmull-Clark subdivision solids allow the domain of deformation to be tailored to objects of arbitrary topology. The domain of deformation can correspond to the interior of a subdivision surface or can enclose an arbitrary surface mesh. Within the wavelet framework we develop the equations of motion for elastic deformations in the presence of external forces and constraints. We solve the resulting differential equations using an implicit method, which lends stability. Our framework allows a trade-off between speed and accuracy. For interactive applications, we accelerate the simulation by adaptively refining the wavelet basis while avoiding visual "popping" artifacts. Offline simulations can employ a fine basis for higher accuracy at the cost of more computation time. By exploiting the properties of smooth subdivision we can compute less expensive solutions using a trilinear basis yet produce a smooth result that meets the constraints.
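The stability the authors gain from implicit time integration can be seen on the simplest elastic system, a 1-D spring x'' = -k x. The sketch below is generic and unrelated to the wavelet discretization itself; step sizes and constants are arbitrary.

```python
def explicit_euler(x, v, k, dt, steps):
    # Explicit (forward) Euler for x'' = -k x: blows up when dt is too large.
    for _ in range(steps):
        x, v = x + dt * v, v - dt * k * x
    return x, v

def implicit_euler(x, v, k, dt, steps):
    # Implicit (backward) Euler: solve v1 = v - dt*k*x1, x1 = x + dt*v1,
    # which gives x1 = (x + dt*v) / (1 + dt**2 * k); unconditionally stable here.
    for _ in range(steps):
        x1 = (x + dt * v) / (1.0 + dt * dt * k)
        v = v - dt * k * x1
        x = x1
    return x, v

x_e, _ = explicit_euler(1.0, 0.0, k=100.0, dt=0.1, steps=100)  # diverges
x_i, _ = implicit_euler(1.0, 0.0, k=100.0, dt=0.1, steps=100)  # stays bounded
```

For this stiffness and step size, the explicit scheme amplifies the state by a factor of about sqrt(2) per step while the implicit one damps it by the same factor, which is why implicit methods are the standard choice for stiff deformable-body simulation.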
Cardinal exponential splines: Part I—Theory and filtering algorithms
IEEE Trans. Signal Process., 2005
Abstract

Cited by 35 (13 self)
Abstract—Causal exponentials play a fundamental role in classical system theory. Starting from those elementary building blocks, we propose a complete and self-contained signal processing formulation of exponential splines defined on a uniform grid. We specify the corresponding B-spline basis functions and investigate their reproduction properties (Green function and exponential polynomials); we also characterize their stability (Riesz bounds). We show that the exponential B-spline framework allows an exact implementation of continuous-time signal processing operators, including convolution, differential operators, and modulation, by simple processing in the discrete B-spline domain. We derive efficient filtering algorithms for multiresolution signal extrapolation and approximation, extending earlier results for polynomial splines. Finally, we present a new asymptotic error formula that predicts the magnitude and the order of decay of the approximation error as a function of the knot spacing.

Index Terms—Continuous-time signal processing, convolution, differential operators, Green functions, interpolation, modulation, multiresolution approximation, splines.
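For the polynomial-spline special case that this work extends, the flavor of recursive B-spline filtering can be shown concretely. The sketch below implements the classical cubic B-spline prefilter (one pole, a causal and an anticausal sweep); it follows the standard polynomial-spline recursion, not anything specific to the exponential case.

```python
import numpy as np

def cubic_bspline_coeffs(s):
    """Direct cubic B-spline transform by recursive filtering: returns c such
    that sum_l c[l] * beta3(k - l) == s[k] (mirror boundaries, approximately)."""
    z = np.sqrt(3.0) - 2.0                 # pole of the cubic B-spline prefilter
    s = np.asarray(s, float)
    n = len(s)
    c = s.copy()
    c[0] = (z ** np.arange(n) * s).sum()   # causal init (truncated mirror series)
    for k in range(1, n):                  # causal sweep
        c[k] = s[k] + z * c[k - 1]
    cm = np.empty(n)
    cm[-1] = (z / (z * z - 1.0)) * (c[-1] + z * c[-2])   # anticausal init
    for k in range(n - 2, -1, -1):         # anticausal sweep
        cm[k] = z * (cm[k + 1] - c[k])
    return 6.0 * cm

s = np.random.default_rng(0).normal(size=64)
c = cubic_bspline_coeffs(s)
# Interpolating at the knots (beta3(0)=2/3, beta3(+-1)=1/6) reproduces the
# samples away from the boundaries.
rec = (np.roll(c, 1) + 4 * c + np.roll(c, -1)) / 6.0
```

The two one-pole sweeps cost O(n) total, which is the efficiency argument the abstract makes for spline-domain processing.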
Image Processing with Multiscale Stochastic Models
1993
Abstract

Cited by 29 (3 self)
In this thesis, we develop image processing algorithms and applications for a particular class of multiscale stochastic models. First, we provide background on the model class, including a discussion of its relationship to wavelet transforms and the details of a two-sweep algorithm for estimation. A multiscale model for the error process associated with this algorithm is derived. Next, we illustrate how the multiscale models can be used in the context of regularizing ill-posed inverse problems and demonstrate the substantial computational savings that such an approach offers. Several novel features of the approach are developed, including a technique for choosing the optimal resolution at which to recover the object of interest. Next, we show that this class of models contains other widely used classes of statistical models, including 1-D Markov processes and 2-D Markov random fields, and we propose a class of multiscale models for approximately representing Gaussian Markov random fields...
Generalized smoothing splines and the optimal discretization of the Wiener filter
IEEE Trans. Signal Process., 2005
Abstract

Cited by 27 (14 self)
Abstract—We introduce an extended class of cardinal L*L-splines, where L is a pseudo-differential operator satisfying some admissibility conditions. We show that the L*L-spline signal interpolation problem is well posed and that its solution is the unique minimizer of the spline energy functional, subject to the interpolation constraint. Next, we consider the corresponding regularized least-squares estimation problem, which is more appropriate for dealing with noisy data. The criterion to be minimized is the sum of a quadratic data term, which forces the solution to be close to the input samples, and a “smoothness” term that privileges solutions with small spline energies. Here, too, we find that the optimal solution, among all possible functions, is a cardinal L*L-spline. We show that this smoothing spline estimator has a stable representation in a B-spline-like basis and that its coefficients can be computed by digital filtering of the input signal. We describe an efficient recursive filtering algorithm that is applicable whenever the transfer function of L is rational (which corresponds to the case of exponential splines). We justify these algorithms statistically by establishing an equivalence between L*L smoothing splines and the minimum mean-square error (MMSE) estimation of a stationary signal corrupted by white Gaussian noise. In this model-based formulation, the optimum operator L is the whitening filter of the process, and the regularization parameter is proportional to the noise variance. Thus, the proposed formalism yields the optimal discretization of the classical Wiener filter, together with a fast recursive algorithm. It extends the standard Wiener solution by providing the optimal interpolation space. We also present a Bayesian interpretation of the algorithm.

Index Terms—Nonparametric estimation, recursive filtering, smoothing splines, splines (polynomial and exponential), stationary processes, variational principle, Wiener filter.
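A discrete analogue of this smoothing-spline criterion is easy to write down: penalized least squares with a second-difference roughness term standing in for the operator L. The dense solve below is purely illustrative; the paper's point is that the same estimate admits a fast recursive-filter implementation.

```python
import numpy as np

def smooth(y, lam):
    """Discrete smoothing spline (Whittaker-style): minimize
    ||y - x||^2 + lam * ||D2 x||^2, with D2 the second-difference operator."""
    y = np.asarray(y, float)
    n = len(y)
    D = np.diff(np.eye(n), n=2, axis=0)        # (n-2) x n second-difference matrix
    return np.linalg.solve(np.eye(n) + lam * D.T @ D, y)

# Linear data has zero second difference, so it passes through unchanged
# (up to round-off), for any lam.
x = smooth(np.arange(10.0), lam=100.0)
```

In the MMSE/Wiener reading given in the abstract, the regularization weight `lam` plays the role of the noise variance: more noise justifies heavier smoothing.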
Ten Good Reasons For Using Spline Wavelets
Proc. SPIE vol. 3169, Wavelet Applications in Signal and Image Processing V, 1997
Abstract

Cited by 19 (5 self)
The purpose of this note is to highlight some of the unique properties of spline wavelets. These wavelets can be classified in four categories: orthogonal (Battle-Lemarié), semi-orthogonal (e.g., B-spline), shift-orthogonal, and biorthogonal (Cohen-Daubechies-Feauveau). Unlike most other wavelet bases, splines have explicit formulae in both the time and frequency domains, which greatly facilitates their manipulation. They allow for a progressive transition between the two extreme cases of a multiresolution: Haar's piecewise-constant representation (spline of degree zero) versus Shannon's bandlimited model (which corresponds to a spline of infinite order). Spline wavelets are extremely regular and usually symmetric or antisymmetric. They can be designed to have compact support and to achieve optimal time-frequency localization (B-spline wavelets). The underlying scaling functions are the B-splines, which are the shortest and most regular scaling functions of order L. Finally, splines have the best approximation properties among all known wavelets of a given order L. In other words, they are the best for approximating smooth functions.
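The progression from Haar (degree 0) toward smoother splines is easy to visualize numerically: the degree-n B-spline is the (n+1)-fold convolution of the unit box. The grid spacing and sampling choices below are arbitrary.

```python
import numpy as np

def bspline_samples(degree, step=0.01):
    """Sample the B-spline of the given degree by (degree+1)-fold convolution
    of the unit box indicator, discretized on a grid of spacing `step`."""
    box = np.ones(int(round(1.0 / step)))      # indicator of [0, 1)
    b = box.copy()                             # degree 0: Haar box
    for _ in range(degree):
        b = np.convolve(b, box) * step         # discretized continuous convolution
    return b
```

Each extra convolution adds one order of smoothness while keeping unit area, which is exactly the Haar-to-Shannon progression the note describes; the degree-1 result is the familiar triangle (hat) function with peak value 1.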