Results 1–10 of 70
Regularization Theory and Neural Networks Architectures
Neural Computation, 1995
"... We had previously shown that regularization principles lead to approximation schemes which are equivalent to networks with one layer of hidden units, called Regularization Networks. In particular, standard smoothness functionals lead to a subclass of regularization networks, the well known Radial Ba ..."
Abstract

Cited by 309 (31 self)
 Add to MetaCart
We had previously shown that regularization principles lead to approximation schemes which are equivalent to networks with one layer of hidden units, called Regularization Networks. In particular, standard smoothness functionals lead to a subclass of regularization networks, the well known Radial Basis Functions approximation schemes. This paper shows that regularization networks encompass a much broader range of approximation schemes, including many of the popular general additive models and some of the neural networks. In particular, we introduce new classes of smoothness functionals that lead to different classes of basis functions. Additive splines as well as some tensor product splines can be obtained from appropriate classes of smoothness functionals. Furthermore, the same generalization that extends Radial Basis Functions (RBF) to Hyper Basis Functions (HBF) also leads from additive models to ridge approximation models, containing as special cases Breiman's hinge functions, som...
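The regularization-network solution named here has a compact closed form: for a kernel G induced by the smoothness functional, the approximant is f(x) = Σ_i c_i G(x − x_i) with coefficients solving (K + λI)c = y. Below is a minimal sketch, assuming a Gaussian kernel and illustrative values of the width sigma and regularization weight lam (neither is fixed by the paper):

```python
import numpy as np

def fit_regularization_network(X, y, sigma=0.8, lam=1e-2):
    """Solve (K + lam*I) c = y for a Gaussian kernel Gram matrix K,
    i.e. minimize sum_k (y_k - f(x_k))^2 + lam * ||f||^2."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-d2 / (2.0 * sigma ** 2))
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

def predict(X_train, c, X_new, sigma=0.8):
    """Evaluate f(x) = sum_i c_i exp(-||x - x_i||^2 / (2 sigma^2))."""
    d2 = ((X_new[:, None, :] - X_train[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2)) @ c

# Toy usage: recover a smooth 1-D function from noisy samples.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(40, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(40)
c = fit_regularization_network(X, y)
print(predict(X, c, np.array([[0.0], [1.5]])))  # roughly sin(0), sin(1.5)
```

Swapping the Gaussian for another basis function changes only the kernel line; that slot is exactly what the paper generalizes.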
Regularization networks and support vector machines
Advances in Computational Mathematics, 2000
"... Regularization Networks and Support Vector Machines are techniques for solving certain problems of learning from examples – in particular the regression problem of approximating a multivariate function from sparse data. Radial Basis Functions, for example, are a special case of both regularization a ..."
Abstract

Cited by 266 (33 self)
 Add to MetaCart
Regularization Networks and Support Vector Machines are techniques for solving certain problems of learning from examples – in particular the regression problem of approximating a multivariate function from sparse data. Radial Basis Functions, for example, are a special case of both regularization and Support Vector Machines. We review both formulations in the context of Vapnik’s theory of statistical learning which provides a general foundation for the learning problem, combining functional analysis and statistics. The emphasis is on regression: classification is treated as a special case.
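To illustrate the claim that Radial Basis Functions are a special case of both formulations, the sketch below fits the same Gaussian (RBF) kernel twice: once with the square loss, giving a regularization network (kernel ridge regression), and once with Vapnik's ε-insensitive loss, giving Support Vector regression. The synthetic data, the hyperparameters, and the use of scikit-learn are assumptions made for illustration, not choices from the paper.

```python
import numpy as np
from sklearn.kernel_ridge import KernelRidge
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(60, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(60)

# Same RBF kernel, two loss functions: square loss gives a
# regularization network (kernel ridge); epsilon-insensitive loss
# gives Support Vector regression.
rn = KernelRidge(kernel="rbf", gamma=0.5, alpha=1e-2).fit(X, y)
svm = SVR(kernel="rbf", gamma=0.5, C=10.0, epsilon=0.05).fit(X, y)

X_test = np.linspace(-3, 3, 5).reshape(-1, 1)
print(rn.predict(X_test))   # the two fits should nearly agree
print(svm.predict(X_test))
```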
The Connection between Regularization Operators and Support Vector Kernels
1998
"... In this paper a correspondence is derived between regularization operators used in Regularization Networks and Support Vector Kernels. We prove that the Green's Functions associated with regularization operators are suitable Support Vector Kernels with equivalent regularization properties. Moreover ..."
Abstract

Cited by 146 (43 self)
 Add to MetaCart
In this paper a correspondence is derived between regularization operators used in Regularization Networks and Support Vector Kernels. We prove that the Green's Functions associated with regularization operators are suitable Support Vector Kernels with equivalent regularization properties. Moreover, the paper provides an analysis of currently used Support Vector Kernels from the viewpoint of regularization theory, together with the corresponding operators, for both polynomial kernels and translation-invariant kernels. The latter are also analyzed on periodic domains. As a byproduct we show that a large number of Radial Basis Functions, namely conditionally positive definite functions, may be used as Support Vector kernels.
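For the translation-invariant case the correspondence is typically verified in Fourier space: by Bochner's theorem, k(x − y) is a positive definite kernel, and hence the Green's function of some regularization operator, exactly when its Fourier transform is nonnegative. The sketch below is a numerical version of that check on a sampling grid; the three test kernels are illustrative choices, not the paper's examples.

```python
import numpy as np

def min_spectrum(k_vals):
    """Minimum of the (real) DFT of a symmetric kernel sampled around 0.

    Discrete analogue of the Bochner condition: a translation-invariant
    kernel is positive definite iff its Fourier transform is nonnegative.
    """
    return np.fft.fft(np.fft.ifftshift(k_vals)).real.min()

x = np.linspace(-20, 20, 4096, endpoint=False)
gaussian = np.exp(-x ** 2 / 2)             # spectrum: Gaussian, >= 0
triangle = np.maximum(1 - np.abs(x), 0)    # spectrum: sinc^2,   >= 0
box = (np.abs(x) < 1).astype(float)        # spectrum: sinc, changes sign

for name, k in [("gaussian", gaussian), ("triangle", triangle), ("box", box)]:
    print(name, "min spectrum:", round(min_spectrum(k), 4))
# gaussian and triangle stay (numerically) nonnegative and so qualify
# as SV kernels; the box kernel does not.
```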
Scattered Data Interpolation with Multilevel Splines
IEEE Transactions on Visualization and Computer Graphics, 1997
"... This paper describes a fast algorithm for scattered data interpolation and approximation. Multilevel Bsplines are introduced to compute a C²continuous surface through a set of irregularly spaced points. The algorithm makes use of a coarsetofine hierarchy of control lattices to generate a sequen ..."
Abstract

Cited by 106 (9 self)
 Add to MetaCart
This paper describes a fast algorithm for scattered data interpolation and approximation. Multilevel B-splines are introduced to compute a C²-continuous surface through a set of irregularly spaced points. The algorithm makes use of a coarse-to-fine hierarchy of control lattices to generate a sequence of bicubic B-spline functions whose sum approaches the desired interpolation function. Large performance gains are realized by using B-spline refinement to reduce the sum of these functions into one equivalent B-spline function. Experimental results demonstrate that high-fidelity reconstruction is possible from a selected set of sparse and irregular samples.
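A compact sketch of the coarse-to-fine idea follows, assuming the standard uniform cubic B-spline basis and a made-up point set. At each level a control lattice is fitted to the current residuals by a local least-squares rule, and the level functions are simply summed here; the paper's additional B-spline refinement step, which collapses this sum into one equivalent B-spline function, is omitted.

```python
import numpy as np

def bspline_w(t):
    """The four uniform cubic B-spline basis values at t in [0, 1]."""
    return np.array([(1 - t) ** 3,
                     3 * t ** 3 - 6 * t ** 2 + 4,
                     -3 * t ** 3 + 3 * t ** 2 + 3 * t + 1,
                     t ** 3]) / 6.0

def ba_fit(pts, z, m):
    """One level: local least-squares control lattice for points in
    [0,1]^2 over an m-by-m cell grid (one-cell apron on each side)."""
    num = np.zeros((m + 3, m + 3))
    den = np.zeros((m + 3, m + 3))
    for (x, y), zc in zip(pts, z):
        i, j = min(int(x * m), m - 1), min(int(y * m), m - 1)
        w = np.outer(bspline_w(x * m - i), bspline_w(y * m - j))
        phi_c = w * zc / (w ** 2).sum()      # per-point control estimates
        num[i:i + 4, j:j + 4] += w ** 2 * phi_c
        den[i:i + 4, j:j + 4] += w ** 2
    return np.divide(num, den, out=np.zeros_like(num), where=den > 0)

def ba_eval(phi, pts, m):
    """Evaluate the bicubic B-spline with control lattice phi at pts."""
    out = np.empty(len(pts))
    for idx, (x, y) in enumerate(pts):
        i, j = min(int(x * m), m - 1), min(int(y * m), m - 1)
        w = np.outer(bspline_w(x * m - i), bspline_w(y * m - j))
        out[idx] = (w * phi[i:i + 4, j:j + 4]).sum()
    return out

def multilevel_fit(pts, z, levels=5):
    """Coarse-to-fine: fit each level to the residual of the ones before."""
    fits, residual = [], z.astype(float)
    for lv in range(levels):
        m = 2 ** (lv + 1)                    # lattice density doubles per level
        phi = ba_fit(pts, residual, m)
        residual = residual - ba_eval(phi, pts, m)
        fits.append((phi, m))
    return fits

rng = np.random.default_rng(1)
pts = rng.random((300, 2))
z = np.sin(4 * pts[:, 0]) * np.cos(4 * pts[:, 1])
fits = multilevel_fit(pts, z)
approx = sum(ba_eval(phi, pts, m) for phi, m in fits)
print(float(np.abs(approx - z).max()))       # shrinks as levels are added
```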
Error Estimates and Condition Numbers for Radial Basis Function Interpolation
Advances in Computational Mathematics, 1994
"... : For interpolation of scattered multivariate data by radial basis functions, an "uncertainty relation" between the attainable error and the condition of the interpolation matrices is proven. It states that the error and the condition number cannot both be kept small. Bounds on the Lebesgue constant ..."
Abstract

Cited by 80 (20 self)
 Add to MetaCart
For interpolation of scattered multivariate data by radial basis functions, an "uncertainty relation" between the attainable error and the condition of the interpolation matrices is proven. It states that the error and the condition number cannot both be kept small. Bounds on the Lebesgue constants are obtained as a byproduct. A variation of the Narcowich-Ward theory of upper bounds on the norm of the inverse of the interpolation matrix is presented in order to handle the whole set of radial basis functions that are currently in use.

1 Introduction

Interpolation by "radial" basis functions requires a function $\Phi : \mathbb{R}^d \to \mathbb{R}$, a space $\mathbb{P}^d_m$ of $d$-variate polynomials of degree less than $m$, and interpolates data values $y_1, \ldots, y_N \in \mathbb{R}$ at data locations ("centers") $x_1, \ldots, x_N \in \mathbb{R}^d$ by solving the system

$$\sum_{j=1}^{N} \alpha_j \,\Phi(x_j - x_k) + \sum_{\ell=1}^{Q} \beta_\ell \, p_\ell(x_k) = y_k, \qquad 1 \le k \le N,$$
$$\sum_{j=1}^{N} \alpha_j \, p_i(x_j) = 0, \qquad 1 \le i \le Q \qquad (1.1)$$

for a basis $p_1, \ldots, p_Q$...
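System (1.1) can be assembled and solved directly as a blocked linear system. The sketch below does this for one common choice that the abstract does not prescribe: the thin-plate spline Φ(x) = ‖x‖² log ‖x‖ with a linear polynomial part (d = 2, m = 2, so Q = 3).

```python
import numpy as np

def tps(r):
    """Thin-plate spline phi(r) = r^2 log r, with phi(0) = 0."""
    with np.errstate(divide="ignore", invalid="ignore"):
        v = r ** 2 * np.log(r)
    return np.where(r > 0, v, 0.0)

def fit_rbf(X, y):
    """Solve system (1.1): RBF coefficients alpha plus a linear
    polynomial part beta (basis 1, x, y); the moment conditions
    sum_j alpha_j p_i(x_j) = 0 form the lower block."""
    N = len(X)
    A = tps(np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1))
    P = np.hstack([np.ones((N, 1)), X])              # columns: 1, x, y
    Q = P.shape[1]
    M = np.block([[A, P], [P.T, np.zeros((Q, Q))]])
    coef = np.linalg.solve(M, np.concatenate([y, np.zeros(Q)]))
    return coef[:N], coef[N:]                        # alpha, beta

def evaluate(X, alpha, beta, X_new):
    A = tps(np.linalg.norm(X_new[:, None, :] - X[None, :, :], axis=-1))
    P = np.hstack([np.ones((len(X_new), 1)), X_new])
    return A @ alpha + P @ beta

rng = np.random.default_rng(0)
X = rng.random((50, 2))
y = np.sin(3 * X[:, 0]) + X[:, 1] ** 2
alpha, beta = fit_rbf(X, y)
print(np.abs(evaluate(X, alpha, beta, X) - y).max())  # ~0: interpolation
```

The conditioning of the blocked matrix M is exactly what the paper's "uncertainty relation" concerns: packing the centers more densely shrinks the attainable error but blows up the condition number.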
Priors, Stabilizers and Basis Functions: from regularization to radial, tensor and additive splines
1993
"... We had previously shown that regularization principles lead to approximation schemes which are equivalent to networks with one layer of hidden units, called Regularization Networks. In particular we had discussed how standard smoothness functionals lead to a subclass of regularization networks, th ..."
Abstract

Cited by 79 (14 self)
 Add to MetaCart
We had previously shown that regularization principles lead to approximation schemes which are equivalent to networks with one layer of hidden units, called Regularization Networks. In particular we had discussed how standard smoothness functionals lead to a subclass of regularization networks, the well-known Radial Basis Functions approximation schemes. In this paper we show that regularization networks encompass a much broader range of approximation schemes, including many of the popular general additive models and some of the neural networks. In particular we introduce new classes of smoothness functionals that lead to different classes of basis functions. Additive splines as well as some tensor product splines can be obtained from appropriate classes of smoothness functionals. Furthermore, the same extension that leads from Radial Basis Functions (RBF) to Hyper Basis Functions (HBF) also leads from additive models to ridge approximation models, containing as special cases Breiman's hinge functions and some forms of Projection Pursuit Regression. We propose to use the term Generalized Regularization Networks for this broad class of approximation schemes that follow from an extension of regularization. In the probabilistic interpretation of regularization, the different classes of basis functions correspond to different classes of prior probabilities on the approximating function spaces, and therefore to different types of smoothness assumptions. In the final part of the paper, we show the relation between activation functions of the Gaussian and sigmoidal type by considering the simple case of the kernel G(x)=x. In summary, ...
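One way to see the additive-model claim concretely: if the smoothness functional induces a kernel that is a sum of one-dimensional kernels, K(x, x') = Σ_d k(x_d, x'_d), then the regularization-network solution f(x) = Σ_i c_i K(x, x_i) is itself a sum of one-dimensional functions of the coordinates. A sketch assuming Gaussian one-dimensional kernels (an illustrative choice, not one the paper singles out):

```python
import numpy as np

def additive_gauss_kernel(X1, X2, sigma=0.7):
    """K(x, x') = sum_d exp(-(x_d - x'_d)^2 / (2 sigma^2)): an additive
    kernel built from one-dimensional Gaussian kernels."""
    diffs = X1[:, None, :] - X2[None, :, :]
    return np.exp(-diffs ** 2 / (2 * sigma ** 2)).sum(-1)

rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(80, 2))
y = np.sin(2 * X[:, 0]) + np.abs(X[:, 1])    # a genuinely additive target

lam = 1e-2
c = np.linalg.solve(additive_gauss_kernel(X, X) + lam * np.eye(80), y)

# Because K is a sum over coordinates, the fitted f splits into
# one-dimensional components f_d(t) = sum_i c_i k(t, x_id).
def component(d, t, sigma=0.7):
    return np.exp(-(t[:, None] - X[None, :, d]) ** 2 / (2 * sigma ** 2)) @ c

t = np.linspace(-2, 2, 5)
print(component(0, t))   # tracks sin(2t) up to an additive constant
print(component(1, t))   # tracks |t|     up to an additive constant
```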
On a Kernel-based Method for Pattern Recognition, Regression, Approximation, and Operator Inversion
1997
"... We present a Kernelbased framework for Pattern Recognition, Regression Estimation, Function Approximation and multiple Operator Inversion. Previous approaches such as ridgeregression, Support Vector methods and regression by Smoothing Kernels are included as special cases. We will show connection ..."
Abstract

Cited by 77 (25 self)
 Add to MetaCart
We present a Kernel-based framework for Pattern Recognition, Regression Estimation, Function Approximation and multiple Operator Inversion. Previous approaches such as ridge regression, Support Vector methods and regression by Smoothing Kernels are included as special cases. We will show connections between the cost function and some properties up to now believed to apply to Support Vector Machines only. The optimal solution of all the problems described above can be found by solving a simple quadratic programming problem. The paper closes with a proof of the equivalence between Support Vector kernels and Green's functions of regularization operators.
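For the regression case, the quadratic program referred to here can be written down explicitly; the following is the standard dual of ε-insensitive Support Vector regression with kernel matrix K, quoted in textbook form rather than from the paper:

```latex
\begin{aligned}
\min_{\alpha,\,\alpha^{*}} \quad
  & \tfrac{1}{2}\,(\alpha - \alpha^{*})^{\top} K \,(\alpha - \alpha^{*})
    \;-\; y^{\top}(\alpha - \alpha^{*})
    \;+\; \varepsilon \sum_{i=1}^{N} (\alpha_i + \alpha_i^{*}) \\
\text{subject to} \quad
  & \sum_{i=1}^{N} (\alpha_i - \alpha_i^{*}) = 0,
  \qquad 0 \le \alpha_i,\ \alpha_i^{*} \le C .
\end{aligned}
```

The regression estimate is then f(x) = Σ_i (α_i − α_i*) k(x_i, x) + b; other cost functions in the framework lead to quadratic programs of the same overall shape.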
Image Warping by Radial Basis Functions: Application to Facial Expressions
CVGIP: Graphical Models and Image Processing, 1994
"... The human face is an elastic object. A natural paradigm for representing facial expressions is to form a complete 3D model of facial muscles and tissues. However, determining the actual parameter values for synthesizing and animating facial expressions is tedious; evaluating these parameters for fac ..."
Abstract

Cited by 64 (3 self)
 Add to MetaCart
The human face is an elastic object. A natural paradigm for representing facial expressions is to form a complete 3D model of facial muscles and tissues. However, determining the actual parameter values for synthesizing and animating facial expressions is tedious, and evaluating these parameters for facial expression analysis from grey-level images is beyond the current state of the art in computer vision. Using only 2D face images and a small number of anchor points, we show that the method of radial basis functions provides a powerful mechanism for processing facial expressions. Although constructed specifically for facial expressions, our method is applicable to other elastic objects as well.
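In outline, the mechanism is: fit one radial basis interpolant per coordinate to the anchor-point displacements, evaluate the resulting backward warp on the pixel grid, and resample the image. The sketch below illustrates that pipeline, not the paper's code: the Gaussian basis, its width, the displacement-field formulation, and the synthetic image are all assumptions.

```python
import numpy as np
from scipy.ndimage import map_coordinates

def fit_warp(dst_pts, src_pts, sigma=40.0):
    """Fit Gaussian-RBF coefficients for the backward displacement
    field d(p) = src - dst at the anchor points (one interpolant per
    coordinate); far from all anchors the warp decays to identity."""
    d2 = ((dst_pts[:, None, :] - dst_pts[None, :, :]) ** 2).sum(-1)
    K = np.exp(-d2 / (2 * sigma ** 2))
    return np.linalg.solve(K + 1e-8 * np.eye(len(dst_pts)),
                           src_pts - dst_pts)

def apply_warp(image, dst_pts, coef, sigma=40.0):
    h, w = image.shape
    grid = np.stack(np.mgrid[0:h, 0:w], axis=-1).reshape(-1, 2).astype(float)
    d2 = ((grid[:, None, :] - dst_pts[None, :, :]) ** 2).sum(-1)
    src = grid + np.exp(-d2 / (2 * sigma ** 2)) @ coef  # per-pixel source coords
    return map_coordinates(image, src.T, order=1).reshape(h, w)

# Toy usage: one interior anchor moves while corner anchors pin the
# boundary, roughly what a mouth control point does to a neutral face.
img = np.outer(np.hanning(128), np.hanning(128))
dst = np.array([[0, 0], [0, 127], [127, 0], [127, 127], [64, 64]], float)
src = dst.copy()
src[4] = [74, 64]            # warped pixel (64, 64) samples from (74, 64)
warped = apply_warp(img, dst, fit_warp(dst, src))
print(warped.shape, round(float(warped.max()), 3))
```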
Multistep scattered data interpolation using compactly supported radial basis functions
Journal of Computational and Applied Mathematics, 1996
"... Abstract. A hierarchical scheme is presented for smoothly interpolating scattered data with radial basis functions of compact support. A nested sequence of subsets of the data is computed efficiently using successive Delaunay triangulations. The scale of the basis function at each level is determine ..."
Abstract

Cited by 64 (12 self)
 Add to MetaCart
A hierarchical scheme is presented for smoothly interpolating scattered data with radial basis functions of compact support. A nested sequence of subsets of the data is computed efficiently using successive Delaunay triangulations. The scale of the basis function at each level is determined from the current density of the points using information from the triangulation. The method is rotationally invariant and has good reproduction properties. Moreover, the solution can be calculated and evaluated in acceptable computing time.

During the last two decades radial basis functions have become a well-established tool for multivariate interpolation of both scattered and gridded data; see [2,7,8,22,25] for some surveys. The major part ...
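The sketch below imitates the multistep structure with Wendland's compactly supported C² function φ(r) = (1 − r)⁴₊(4r + 1): interpolate on a coarse subset with a large support radius, then recurse on the residual with a denser subset and a smaller radius. Random nested prefixes and the radius rule ρ = 4/√n stand in for the paper's Delaunay-based subset and density computations, so they are assumptions, not the authors' method.

```python
import numpy as np

def wendland(r):
    """Wendland's compactly supported C^2 function (1 - r)^4_+ (4r + 1),
    positive definite on R^2 and R^3."""
    return np.maximum(1 - r, 0) ** 4 * (4 * r + 1)

def fit_level(X, y, rho):
    """Interpolate y at centres X with support radius rho."""
    r = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1) / rho
    return np.linalg.solve(wendland(r), y)

def eval_level(X, coef, rho, X_new):
    r = np.linalg.norm(X_new[:, None, :] - X[None, :, :], axis=-1) / rho
    return wendland(r) @ coef

def multistep_fit(X, y, levels=4):
    """Coarse-to-fine: interpolate a nested subset at each level, with a
    support radius tied to that subset's density, then recurse on the
    residual with a denser subset and a smaller radius."""
    order = np.random.default_rng(0).permutation(len(X))
    fits, residual = [], y.astype(float)
    for lv in range(levels):
        sub = order[: max(8, len(X) >> (levels - 1 - lv))]  # nested prefixes
        Xs = X[sub]
        rho = 4.0 / np.sqrt(len(sub))   # crude density-to-radius rule
        coef = fit_level(Xs, residual[sub], rho)
        residual = residual - eval_level(Xs, coef, rho, X)
        fits.append((Xs, coef, rho))
    return fits

rng = np.random.default_rng(1)
X = rng.random((256, 2))
y = np.sin(5 * X[:, 0]) * np.cos(5 * X[:, 1])
fits = multistep_fit(X, y)
pred = sum(eval_level(Xs, c, rho, X) for Xs, c, rho in fits)
print(round(float(np.abs(pred - y).max()), 6))  # ~0: final level interpolates
```

Because each level's matrix involves a compactly supported function, it is sparse in a real implementation; the dense solve here is only to keep the sketch short.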