Results 1-10 of 132
Regularization Theory and Neural Networks Architectures
Neural Computation, 1995
"... We had previously shown that regularization principles lead to approximation schemes which are equivalent to networks with one layer of hidden units, called Regularization Networks. In particular, standard smoothness functionals lead to a subclass of regularization networks, the well known Radial Ba ..."
Abstract

Cited by 309 (31 self)
 Add to MetaCart
We had previously shown that regularization principles lead to approximation schemes which are equivalent to networks with one layer of hidden units, called Regularization Networks. In particular, standard smoothness functionals lead to a subclass of regularization networks, the well known Radial Basis Functions approximation schemes. This paper shows that regularization networks encompass a much broader range of approximation schemes, including many of the popular general additive models and some of the neural networks. In particular, we introduce new classes of smoothness functionals that lead to different classes of basis functions. Additive splines as well as some tensor product splines can be obtained from appropriate classes of smoothness functionals. Furthermore, the same generalization that extends Radial Basis Functions (RBF) to Hyper Basis Functions (HBF) also leads from additive models to ridge approximation models, containing as special cases Breiman's hinge functions, som...
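As an illustration of the scheme this abstract describes, a one-hidden-layer network of radial basis functions can be fit by ridge-regularized least squares. This is a minimal sketch; the toy data, Gaussian width, and regularization weight `lam` are illustrative assumptions, not values from the paper.

```python
import numpy as np

def rbf_design(x, centers, width):
    # Gaussian radial basis functions evaluated at the sample points x
    return np.exp(-(x[:, None] - centers[None, :]) ** 2 / (2 * width ** 2))

def fit_regularization_network(x, y, centers, width, lam):
    # Ridge-regularized least squares for the output weights:
    # minimize ||G w - y||^2 + lam ||w||^2
    G = rbf_design(x, centers, width)
    return np.linalg.solve(G.T @ G + lam * np.eye(G.shape[1]), G.T @ y)

# Toy example: recover a smooth function from noisy samples
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 40)
y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(40)
centers = np.linspace(0.0, 1.0, 15)
w = fit_regularization_network(x, y, centers, width=0.1, lam=1e-3)
y_hat = rbf_design(x, centers, 0.1) @ w
```

The regularization weight `lam` plays the role of the smoothness functional's trade-off parameter: larger values give smoother, more biased fits.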
Regularization networks and support vector machines
Advances in Computational Mathematics, 2000
"... Regularization Networks and Support Vector Machines are techniques for solving certain problems of learning from examples – in particular the regression problem of approximating a multivariate function from sparse data. Radial Basis Functions, for example, are a special case of both regularization a ..."
Abstract

Cited by 266 (33 self)
 Add to MetaCart
Regularization Networks and Support Vector Machines are techniques for solving certain problems of learning from examples – in particular the regression problem of approximating a multivariate function from sparse data. Radial Basis Functions, for example, are a special case of both regularization and Support Vector Machines. We review both formulations in the context of Vapnik’s theory of statistical learning which provides a general foundation for the learning problem, combining functional analysis and statistics. The emphasis is on regression: classification is treated as a special case.
Splines: A Perfect Fit for Signal/Image Processing
IEEE Signal Processing Magazine, 1999
"... ..."
Sampling—50 years after Shannon
Proceedings of the IEEE, 2000
"... This paper presents an account of the current state of sampling, 50 years after Shannon’s formulation of the sampling theorem. The emphasis is on regular sampling, where the grid is uniform. This topic has benefited from a strong research revival during the past few years, thanks in part to the math ..."
Abstract

Cited by 207 (22 self)
 Add to MetaCart
This paper presents an account of the current state of sampling, 50 years after Shannon’s formulation of the sampling theorem. The emphasis is on regular sampling, where the grid is uniform. This topic has benefited from a strong research revival during the past few years, thanks in part to the mathematical connections that were made with wavelet theory. To introduce the reader to the modern, Hilbert-space formulation, we reinterpret Shannon’s sampling procedure as an orthogonal projection onto the subspace of bandlimited functions. We then extend the standard sampling paradigm for a representation of functions in the more general class of “shift-invariant” function spaces, including splines and wavelets. Practically, this allows for simpler—and possibly more realistic—interpolation models, which can be used in conjunction with a much wider class of (antialiasing) prefilters that are not necessarily ideal lowpass. We summarize and discuss the results available for the determination of the approximation error and of the sampling rate when the input of the system is essentially arbitrary; e.g., nonbandlimited. We also review variations of sampling that can be understood from the same unifying perspective. These include wavelets, multiwavelets, Papoulis generalized sampling, finite elements, and frames. Irregular sampling and radial basis functions are briefly mentioned. Keywords—Bandlimited functions, Hilbert spaces, interpolation, least squares approximation, projection operators, sampling,
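The reinterpretation of Shannon reconstruction as an expansion in shifted sinc functions can be sketched numerically. This is an illustrative toy (the sampling step, test frequency, and evaluation window are assumptions): a signal bandlimited well below the Nyquist rate is recovered from its uniform samples, up to the truncation error of the finite sinc series.

```python
import numpy as np

def sinc_reconstruct(samples, T, t):
    # Shannon reconstruction: expand in the shifted sinc basis of the
    # subspace of functions bandlimited to [-pi/T, pi/T]
    n = np.arange(len(samples))
    return np.sum(samples[None, :] * np.sinc((t[:, None] - n[None, :] * T) / T),
                  axis=1)

T = 0.5                                      # sampling step
n = np.arange(64)
f = lambda t: np.cos(2 * np.pi * 0.2 * t)    # 0.2 Hz < Nyquist rate 1/(2T) = 1 Hz
samples = f(n * T)
t = np.linspace(10.0, 20.0, 50)              # interior points, away from the edges
err = float(np.max(np.abs(sinc_reconstruct(samples, T, t) - f(t))))
```

The residual `err` comes only from truncating the (infinite) sinc expansion to 64 terms; it shrinks as the evaluation window moves further from the boundaries.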
On Edge Detection
IEEE Transactions on Pattern Analysis and Machine Intelligence, 1984
"... Edge detection is the process that attempts to characterize the intensity changes in the image in terms of the physical processes that have originated them. A critical, intermediate goal of edge detection is the detection and characterization of significant intensity changes. This paper discusses th ..."
Abstract

Cited by 175 (6 self)
 Add to MetaCart
Edge detection is the process that attempts to characterize the intensity changes in the image in terms of the physical processes that have originated them. A critical, intermediate goal of edge detection is the detection and characterization of significant intensity changes. This paper discusses this part of the edge detection problem. To characterize the types of intensity changes, derivatives of different types, and possibly of different scales, are needed. Thus, we consider this part of edge detection as a problem in numerical differentiation.
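The view of edge detection as numerical differentiation can be sketched in one dimension: differentiate through a Gaussian kernel so that differentiation is well-posed at a chosen scale. The scale `sigma` and the toy step signal are illustrative assumptions, not the paper's operators.

```python
import numpy as np

def gaussian_derivative_kernel(sigma, radius):
    # First derivative of a Gaussian: smoothing and differentiation
    # combined into a single well-posed convolution at scale sigma
    x = np.arange(-radius, radius + 1, dtype=float)
    g = np.exp(-x ** 2 / (2 * sigma ** 2))
    return -x / sigma ** 2 * g / g.sum()

def edge_response_1d(signal, sigma=2.0):
    k = gaussian_derivative_kernel(sigma, radius=int(4 * sigma))
    d = np.convolve(signal, k, mode="same")
    # significant intensity changes show up as extrema of the
    # smoothed derivative
    return np.abs(d)

# A step edge at index 50 gives the strongest response near that index
s = np.concatenate([np.zeros(50), np.ones(50)])
edge = int(np.argmax(edge_response_1d(s)))
```

In two dimensions the same idea uses directional derivatives of a smoothing filter, with the derivative order and scale chosen to match the type of intensity change sought.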
Approximation From Shift-Invariant Subspaces of ...
Trans. Amer. Math. Soc., 1991
"... : A complete characterization is given of closed shiftinvariant subspaces of L 2 (IR d ) which provide a specified approximation order. When such a space is principal (i.e., generated by a single function), then this characterization is in terms of the Fourier transform of the generator. As a spe ..."
Abstract

Cited by 130 (38 self)
 Add to MetaCart
A complete characterization is given of closed shift-invariant subspaces of L^2(R^d) which provide a specified approximation order. When such a space is principal (i.e., generated by a single function), this characterization is in terms of the Fourier transform of the generator. As a special case, we obtain the classical Strang-Fix conditions, but without requiring the generating function to decay at infinity. The approximation order of a general closed shift-invariant space is shown to be already realized by a specifiable principal subspace. AMS (MOS) Subject Classifications: 41A25, 41A63; 41A30, 41A15, 42B99, 46E30. Key words and phrases: approximation order, Strang-Fix conditions, shift-invariant spaces, radial basis functions, orthogonal projection.
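For reference, the classical Strang-Fix conditions recovered as a special case can be stated in their standard textbook form (not quoted from the paper): a generator φ provides approximation order k when

```latex
\hat{\varphi}(0) \neq 0,
\qquad
D^{\alpha}\hat{\varphi}(2\pi j) = 0
\quad \text{for all } j \in \mathbb{Z}^{d} \setminus \{0\},\ |\alpha| < k .
```

The paper's contribution is to characterize approximation order through the Fourier transform of the generator without the decay assumptions on φ that the classical statement requires.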
Interpolation revisited
IEEE Transactions on Medical Imaging, 2000
"... Abstract—Based on the theory of approximation, this paper presents a unified analysis of interpolation and resampling techniques. An important issue is the choice of adequate basis functions. We show that, contrary to the common belief, those that perform best are not interpolating. By opposition to ..."
Abstract

Cited by 118 (23 self)
 Add to MetaCart
Based on the theory of approximation, this paper presents a unified analysis of interpolation and resampling techniques. An important issue is the choice of adequate basis functions. We show that, contrary to common belief, those that perform best are not interpolating. In contrast to traditional interpolation, we call their use generalized interpolation; they involve a prefiltering step when correctly applied. We explain why the approximation order inherent in any basis function is important to limit interpolation artifacts. The decomposition theorem states that any basis function endowed with approximation order can be expressed as the convolution of a B-spline of the same order with another function that has none. This motivates the use of splines and spline-based functions as a tunable way to keep artifacts in check without any significant cost penalty. We discuss implementation and performance issues, and we provide experimental evidence to support our claims. Index Terms—Approximation constant, approximation order, B-splines, Fourier error kernel, maximal order and minimal support (MOMS), piecewise polynomials.
B-Spline Signal Processing: Part I - Theory
IEEE Trans. Signal Processing, 1993
"... This paper describes a set of efficient filtering techniques for the processing and representation of signals in terms of continuous Bspline basis functions. We first consider the problem of determining the spline coefficients for an exact signal interpolation (direct Bspline transform). The rever ..."
Abstract

Cited by 116 (24 self)
 Add to MetaCart
This paper describes a set of efficient filtering techniques for the processing and representation of signals in terms of continuous B-spline basis functions. We first consider the problem of determining the spline coefficients for an exact signal interpolation (direct B-spline transform). The reverse operation is the signal reconstruction from its spline coefficients with an optional zooming factor m (indirect B-spline transform). We derive general expressions for the z-transforms and the equivalent continuous impulse responses of B-spline interpolators of order n. We present simple techniques for signal differentiation and filtering in the transformed domain. We then derive recursive filters that efficiently solve the problems of smoothing spline and least squares approximations. The smoothing spline technique approximates a signal with a complete set of coefficients subject to certain regularization or smoothness constraints. The least squares approach, on the other hand, uses a reduced number of B-spline coefficients with equally spaced nodes; this technique is in many ways analogous to the application of an antialiasing lowpass filter prior to decimation in order to represent a signal correctly with a reduced number of samples.
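The recursive direct B-spline transform can be sketched for the cubic case, where inverting the filter (z + 4 + z^-1)/6 splits into a causal and an anticausal first-order pass with pole z1 = sqrt(3) - 2. This is a sketch under stated assumptions: the truncated-geometric initialization and the toy sine signal are illustrative choices, not a transcription of the paper's algorithm.

```python
import numpy as np

def cubic_bspline_coeffs(s):
    # Direct cubic B-spline transform: invert (z + 4 + 1/z)/6 with a
    # causal + anticausal recursive filter pair (pole z1 = sqrt(3) - 2)
    z1 = np.sqrt(3.0) - 2.0
    n = len(s)
    c = 6.0 * np.asarray(s, dtype=float)
    # causal pass, initialized with a truncated geometric sum of the input
    horizon = min(n, 30)
    cp = np.empty(n)
    cp[0] = np.sum(c[:horizon] * z1 ** np.arange(horizon))
    for k in range(1, n):
        cp[k] = c[k] + z1 * cp[k - 1]
    # anticausal pass
    cm = np.empty(n)
    cm[-1] = (z1 / (z1 * z1 - 1.0)) * (cp[-1] + z1 * cp[-2])
    for k in range(n - 2, -1, -1):
        cm[k] = z1 * (cm[k + 1] - cp[k])
    return cm

# Sanity check: applying the indirect transform at the integers,
# (c[k-1] + 4 c[k] + c[k+1]) / 6, recovers the samples in the interior
s = np.sin(0.3 * np.arange(50))
c = cubic_bspline_coeffs(s)
recon = (c[:-2] + 4 * c[1:-1] + c[2:]) / 6.0
err = float(np.max(np.abs(recon[15:-15] - s[16:-16])))
```

Boundary-initialization errors decay like |z1|^k (about 0.27 per sample) away from the edges, which is why the check is restricted to interior samples.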