Results 1–10 of 12
Regularization Theory and Neural Networks Architectures
 Neural Computation
, 1995
Abstract

Cited by 309 (31 self)
We had previously shown that regularization principles lead to approximation schemes which are equivalent to networks with one layer of hidden units, called Regularization Networks. In particular, standard smoothness functionals lead to a subclass of regularization networks, the well known Radial Basis Functions approximation schemes. This paper shows that regularization networks encompass a much broader range of approximation schemes, including many of the popular general additive models and some of the neural networks. In particular, we introduce new classes of smoothness functionals that lead to different classes of basis functions. Additive splines as well as some tensor product splines can be obtained from appropriate classes of smoothness functionals. Furthermore, the same generalization that extends Radial Basis Functions (RBF) to Hyper Basis Functions (HBF) also leads from additive models to ridge approximation models, containing as special cases Breiman's hinge functions, som...
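The one-hidden-layer regularization network described above — Gaussian radial basis functions with one centre per data point — can be sketched in a few lines. This is a minimal illustrative toy, not the paper's construction: the data, the width parameter, and all function names are invented for the example, and a hand-rolled solver stands in for a proper linear-algebra library.

```python
import math

def gauss_solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            fac = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= fac * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def rbf_fit(xs, ys, width=1.0):
    """One layer of Gaussian hidden units, one centre per data point."""
    k = lambda a, b: math.exp(-((a - b) ** 2) / (2 * width ** 2))
    K = [[k(xi, xj) for xj in xs] for xi in xs]  # positive definite for distinct xs
    c = gauss_solve(K, ys)
    return lambda x: sum(ci * k(x, xi) for ci, xi in zip(c, xs))

xs = [0.0, 1.0, 2.0, 3.0]
ys = [0.0, 1.0, 0.0, -1.0]
f = rbf_fit(xs, ys)   # interpolates the data exactly
```

Because the Gaussian kernel matrix is positive definite for distinct centres, the coefficients exist and are unique, and the fitted network reproduces the training values.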
An equivalence between sparse approximation and Support Vector Machines
 A.I. Memo 1606, MIT Artificial Intelligence Laboratory
, 1997
Abstract

Cited by 205 (7 self)
This publication can be retrieved by anonymous ftp to publications.ai.mit.edu. The pathname for this publication is: ai-publications/1500-1999/AIM-1606.ps.Z This paper shows a relationship between two different approximation techniques: the Support Vector Machines (SVM), proposed by V. Vapnik (1995), and a sparse approximation scheme that resembles the Basis Pursuit DeNoising algorithm (Chen, 1995; Chen, Donoho and Saunders, 1995). SVM is a technique which can be derived from the Structural Risk Minimization Principle (Vapnik, 1982) and can be used to estimate the parameters of several different approximation schemes, including Radial Basis Functions, algebraic/trigonometric polynomials, B-splines, and some forms of Multilayer Perceptrons. Basis Pursuit DeNoising is a sparse approximation technique, in which a function is reconstructed by using a small number of basis functions chosen from a large set (the dictionary). We show that, if the data are noiseless, the modified version of Basis Pursuit DeNoising proposed in this paper is equivalent to SVM in the following sense: if applied to the same data set the two techniques give the same solution, which is obtained by solving the same quadratic programming problem. In the appendix we also present a derivation of the SVM technique in the framework of regularization theory, rather than statistical learning theory, establishing a connection between SVM, sparse approximation and regularization theory.
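Basis Pursuit DeNoising itself requires a quadratic-programming solver, as does SVM. As a rough, self-contained stand-in for the underlying idea — reconstructing a signal from a small number of atoms chosen from a dictionary — here is greedy matching pursuit, a different and simpler sparse scheme than the paper's formulation; the dictionary and signal below are toy values.

```python
def matching_pursuit(signal, dictionary, steps):
    """Greedily pick the dictionary atom best correlated with the residual."""
    residual = list(signal)
    chosen = []
    for _ in range(steps):
        # inner products <residual, atom>; atoms are assumed unit-norm
        scores = [sum(r * a for r, a in zip(residual, atom)) for atom in dictionary]
        i = max(range(len(dictionary)), key=lambda j: abs(scores[j]))
        chosen.append((i, scores[i]))
        residual = [r - scores[i] * a for r, a in zip(residual, dictionary[i])]
    return chosen, residual

# Toy orthonormal dictionary (the standard basis of R^3).
dictionary = [[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0]]
signal = [3.0, 0.0, -2.0]
picked, res = matching_pursuit(signal, dictionary, steps=2)
# two atoms suffice: picked = [(0, 3.0), (2, -2.0)], residual is zero
```

For an orthonormal dictionary the greedy choice is exact; the interesting (and harder) case treated by Basis Pursuit is a redundant dictionary, where convex optimization replaces greed.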
Kernel Techniques: From Machine Learning to Meshless Methods
, 2006
Abstract

Cited by 28 (7 self)
Kernels are valuable tools in various fields of Numerical Analysis, including approximation, interpolation, meshless methods for solving partial differential equations, neural networks, and Machine Learning. This contribution explains why and how kernels are applied in these disciplines. It uncovers the links between them, as far as they are related to kernel techniques. It addresses nonexpert readers and focuses on practical guidelines for using kernels in applications.
Approximate Interpolation with Applications to Selecting Smoothing Parameters
, 2005
Abstract

Cited by 17 (3 self)
In this paper, we study the global behavior of a function that is known to be small at a given discrete data set. Such a function might be interpreted as the error function between an unknown function and a given approximant. We will show that, under mild assumptions, a small error on the discrete data set automatically leads to a small error on a larger region. We will apply these results to spline smoothing and show that a specific, a priori choice of the smoothing parameter is possible and leads to the same approximation order as the classical interpolant. This also has a surprising application in stabilizing the interpolation process by splines and positive definite kernels.
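A discrete analogue of the spline smoothing discussed here, with the smoothing parameter fixed a priori rather than estimated from the data, can be sketched as a penalized least-squares problem. This is illustrative only: a second-difference penalty stands in for the usual integrated squared second derivative, and the data and the value of lam are invented.

```python
def gauss_solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            fac = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= fac * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def smooth(y, lam):
    """Minimise sum_i (f_i - y_i)^2 + lam * sum_i (f_{i-1} - 2 f_i + f_{i+1})^2,
    a discrete smoothing spline: solve (I + lam * D'D) f = y."""
    n = len(y)
    D = []
    for i in range(n - 2):            # second-difference matrix, (n-2) x n
        row = [0.0] * n
        row[i], row[i + 1], row[i + 2] = 1.0, -2.0, 1.0
        D.append(row)
    A = [[(1.0 if i == j else 0.0) + lam * sum(r[i] * r[j] for r in D)
          for j in range(n)] for i in range(n)]
    return gauss_solve(A, y)

y = [0.0, 0.0, 1.0, 0.0, 0.0]
f = smooth(y, 1.0)   # the spike is spread out and shrunk
```

Setting lam = 0 recovers interpolation of the data; the paper's point is that a fixed, data-independent lam can already achieve the interpolant's approximation order while stabilizing the fit.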
A Survey on Spherical Spline Approximation
 Surveys Math. Indust
, 1997
Abstract

Cited by 14 (1 self)
Spline functions that approximate data given on the sphere are developed in a weighted Sobolev space setting. The flexibility of the weights makes possible the choice of the approximating function in a way which emphasizes attributes desirable for the particular application area. Examples show that certain choices of the weight sequences yield known methods. A convergence theorem containing explicit constants yields a usable error bound. Our survey ends with the discussion of spherical splines in geodetically relevant pseudodifferential equations. (Submitted to "Surveys on Mathematics for Industry".)
AMS classification: 41A05, 43A90, 65D07, 86A30
Keywords: spherical splines, scattered data interpolation, smoothing, geoid determination
Fast Generalised Cross Validation
Abstract

Cited by 3 (0 self)
The task of fitting smoothing spline surfaces to meteorological data such as temperature or rainfall observations is computationally intensive. The Generalised Cross Validation (GCV) smoothing algorithm, if implemented using direct matrix techniques, is O(n^3) computationally, and memory requirements are O(n^2). Thus, for data sets larger than a few hundred observations, the algorithm is prohibitively slow. The core of the algorithm consists of solving a series of shifted linear systems, and iterative techniques have been used to lower the computational complexity and facilitate implementation on a variety of supercomputer architectures. For large data sets, though, the execution time is still quite high. In this paper we describe a Lanczos based approach which avoids explicitly solving the linear systems and dramatically reduces the amount of time required to fit surfaces to sets of data.
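The GCV score being minimised becomes cheap to evaluate once the smoother has been diagonalised; the sketch below assumes that step has already been done and uses made-up penalty eigenvalues `mu` and transformed data `z`. The influence matrix A(lam) then has eigenvalues 1/(1 + lam*mu_i) — and obtaining such a decomposition is exactly the O(n^3) cost that the paper's Lanczos approach avoids for large n.

```python
def gcv(lam, mu, z):
    """GCV score V(lam) = n * ||(I - A)y||^2 / tr(I - A)^2 for a smoother
    diagonalised with penalty eigenvalues mu and transformed data z."""
    n = len(z)
    shrink = [lam * m / (1.0 + lam * m) for m in mu]  # eigenvalues of I - A(lam)
    rss = sum((s * zi) ** 2 for s, zi in zip(shrink, z))
    trace = sum(shrink)
    return n * rss / trace ** 2

mu = [0.0, 0.5, 2.0, 8.0]          # toy penalty eigenvalues (0 = unpenalised mode)
z = [1.0, -0.5, 0.3, 0.2]          # toy transformed observations
lams = [10.0 ** k for k in range(-3, 3)]
scores = [gcv(l, mu, z) for l in lams]
best = min(zip(scores, lams))[1]   # lambda minimising the GCV score on the grid
```

The grid search over lam is the outer loop that makes each V(lam) evaluation — a shifted linear solve in the non-diagonalised setting — the dominant cost.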
Wavelet Interpolation Networks
 Preprint, Centre de Mathematiques Appliquees, Ecole Polytechnique
, 1998
Abstract

Cited by 3 (1 self)
We describe a new approach to real-time learning of unknown functions based on an interpolating wavelet estimation. We choose a subfamily of a wavelet basis relying on nested hierarchical allocation and update our estimate of the unknown function in real time. Such an interpolation process can be used for real-time applications like neural network adaptive control, where learning an unknown function very fast is critical.
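One concrete (and much simplified) instance of nested hierarchical allocation uses piecewise-linear hierarchical "hat" functions on the dyadic points of [0, 1] — a stand-in for a genuine interpolating wavelet family, not the paper's construction. Each new point contributes one coefficient, set to the current interpolation error there, so the estimate can be refined incrementally; the target function and all names are illustrative.

```python
import math

def hat(x, centre, half):
    """Piecewise-linear bump of height 1 at `centre`, support width 2*half."""
    return max(0.0, 1.0 - abs(x - centre) / half)

def refine(f, levels):
    """Hierarchical interpolation of f at the dyadic points of [0, 1]."""
    terms = []  # (centre, half-width, coefficient)

    def estimate(x):
        return sum(c * hat(x, ctr, h) for ctr, h, c in terms)

    for ctr in (0.0, 1.0):                 # level 0: the endpoints
        terms.append((ctr, 1.0, f(ctr) - estimate(ctr)))
    for j in range(1, levels + 1):         # finer dyadic levels
        h = 0.5 ** j
        for k in range(1, 2 ** j, 2):      # points new at this level
            x = k * h
            # coefficient = current interpolation error at the new point
            terms.append((x, h, f(x) - estimate(x)))
    return estimate

est = refine(math.sin, 3)   # interpolates sin at all dyadic points k/8
```

Because each level's hats vanish at all coarser dyadic points, adding finer terms never disturbs values already interpolated — the property that makes fast real-time updates possible.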
Getting better contour plots with S and GCVPACK
, 1990
Abstract

Cited by 2 (0 self)
We show how to obtain aesthetically pleasing contour plots using New S and GCVPACK. With these codes, thin plate splines can easily be used to interpolate “exact” data, and to produce smoothly varying contour plots, with none of the jagged corners that plague many other interpolation methods. It is noted that GCVPACK can also be used to interpolate data on the sphere and in Euclidean three-space. We observe that a larger class of global interpolation methods (including the thin plate spline) has a Bayesian interpretation, and GCVPACK can be used to compute them.
Comments to Chong Gu, `Model Indexing and Smoothing Parameter Selection in Nonparametric Function Estimation'
Abstract
One can control the smoothness penalty J(f) (here the prototypical J(f) = ∫ [f''(x)]² dx), or control the residual sum of squares. J(f) is controlled by finding f in a certain Sobolev Hilbert space to minimize the residual sum of squares (RSS) under the constraint that J(f) ≤ C; then C is the regularization parameter. RSS is controlled by finding f to minimize J(f) subject to RSS ≤ S; then S is the regularization parameter. Under some mild conditions J(f) = C and RSS = S, and C and S are equivalent in the sense that each controls the trade-off between fidelity and smoothness.
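The two constrained problems in the comment, and the penalized form that usually appears in practice, can be written out side by side (notation reconstructed from the text; the Lagrange-multiplier link between them is the standard argument, not quoted from the comment):

```latex
\min_f \ \mathrm{RSS}(f) \ \ \text{s.t.}\ \ J(f) \le C
\quad\Longleftrightarrow\quad
\min_f \ J(f) \ \ \text{s.t.}\ \ \mathrm{RSS}(f) \le S
\quad\Longleftrightarrow\quad
\min_f \ \mathrm{RSS}(f) + \lambda\, J(f),
\qquad
J(f) = \int \bigl[f''(x)\bigr]^2 \, dx .
```

When the constraints are active (J(f) = C, RSS = S, the "mild conditions" of the comment), each choice of λ corresponds to one C and one S, so the three parameters index the same family of solutions.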
Splines in Statistics*
Abstract
Spline functions are particularly appropriate in fitting a smooth nonparametric model to noisy data. The use of spline functions in nonparametric density estimation and spectral estimation is surveyed. The requisite spline theory background is also developed.