Results 1–10 of 20
Solving Partial Differential Equations by Collocation with Radial Basis Functions
 In: Surface Fitting and Multiresolution Methods, A. Le Méhauté, C. Rabut and L.L. Schumaker (eds.), Vanderbilt
, 1997
Abstract

Cited by 37 (12 self)
Motivated by [5] we describe a method related to scattered Hermite interpolation for which the solution of elliptic partial differential equations by collocation is well-posed. We compare the method of [5] with our method. §1. Introduction. In this paper we discuss the numerical solution of elliptic partial differential equations using a collocation approach based on radial basis functions. To make the discussion transparent we will focus on the case of a time-independent linear elliptic partial differential equation in ℝ². In the following we assume we are given a set of nodes Ξ = {ξ₁, …, ξ_N} ⊂ ℝ^d, along with a continuous function φ : [0, ∞) → ℝ. We then refer to x ↦ φ(‖x − ξ_k‖₂), x ∈ ℝ^d, k ∈ {1, …, N}, as radial basis functions centered at ξ_k. Some of the most commonly used radial basis functions are the (reciprocal) multiquadrics φ(r) = (r² + c²)^(±1/2), the Gaussians φ(r) = e^(−cr²), and the thin pla...
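The abstract above defines radial basis functions φ(‖x − ξ_k‖) centered at scattered nodes and names the Gaussian among the common choices. As a minimal illustrative sketch of scattered-data interpolation with such functions (not the paper's collocation method for PDEs), the following hypothetical Python/NumPy snippet is offered; the nodes, test function, and shape parameter are all invented for the example:

```python
# Minimal sketch: scattered-data interpolation with Gaussian RBFs.
# All data below is hypothetical; c is the shape parameter from phi(r) = e^{-c r^2}.
import numpy as np

def gaussian_rbf(r, c=1.0):
    """phi(r) = exp(-c r^2), one of the radial functions named in the abstract."""
    return np.exp(-c * r**2)

# Scattered nodes xi_k in R^2 and sampled values of a test function.
nodes = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
f = lambda x: x[:, 0] + 2.0 * x[:, 1]
values = f(nodes)

# Interpolation matrix A[j, k] = phi(||xi_j - xi_k||); positive definite
# for the Gaussian on distinct nodes, so the linear system is solvable.
dists = np.linalg.norm(nodes[:, None, :] - nodes[None, :, :], axis=-1)
A = gaussian_rbf(dists)
coeffs = np.linalg.solve(A, values)

def interpolant(x):
    """Evaluate s(x) = sum_k coeffs[k] * phi(||x - xi_k||)."""
    r = np.linalg.norm(x[None, :] - nodes, axis=-1)
    return gaussian_rbf(r) @ coeffs

# By construction the interpolant reproduces the data at every node.
print(abs(interpolant(np.array([1.0, 1.0])) - 3.0) < 1e-8)
```

The collocation approach discussed in the paper replaces plain point evaluation with the action of the differential operator at interior nodes; the node-centered basis and the resulting linear system have the same shape as above.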
Multivariate Interpolation and Approximation by Translates of a Basis Function
, 1995
Abstract

Cited by 35 (7 self)
This contribution will touch the following topics:
• A short introduction to the theory of multivariate interpolation and approximation by finitely many (irregular) translates of a (not necessarily radial) basis function, motivated by optimal recovery of functions from discrete samples.
• Native spaces of functions associated to conditionally positive definite functions, and relations between such spaces.
• Error bounds and condition numbers for interpolation of functions from native spaces.
• Uncertainty Relation: why are good error bounds always tied to bad condition numbers?
• Shift and Scale: how to cope with the Uncertainty Relation?
§1. Introduction and Overview. This contribution contains the author's view of a certain area of multivariate interpolation and approximation. It is not intended to be a complete survey of a larger area of research, and it will not account for the history of the theory it deals with. Related surveys are [15, 21, 22, 27, 30, 47, 48, 58...
Kernel Techniques: From Machine Learning to Meshless Methods
, 2006
Abstract

Cited by 28 (7 self)
Kernels are valuable tools in various fields of Numerical Analysis, including approximation, interpolation, meshless methods for solving partial differential equations, neural networks, and Machine Learning. This contribution explains why and how kernels are applied in these disciplines. It uncovers the links between them, as far as they are related to kernel techniques. It addresses non-expert readers and focuses on practical guidelines for using kernels in applications.
Solving Differential Equations with Radial Basis Functions: Multilevel Methods and Smoothing
 Advances in Comp. Math
Abstract

Cited by 26 (7 self)
Some of the meshless radial basis function methods used for the numerical solution of partial differential equations are reviewed. In particular, the differences between globally and locally supported methods are discussed, and for locally supported methods the important role of smoothing within a multilevel framework is demonstrated. A possible connection between multigrid finite elements and multilevel radial basis function methods with smoothing is explored. Various numerical examples are also provided throughout the paper. 1. Introduction. During the past few years the idea of using so-called meshless methods for the numerical solution of partial differential equations (PDEs) has received much attention throughout the scientific community. As a few representative examples we mention Belytschko and co-workers' results [3] using the so-called element-free Galerkin method, Duarte and Oden's work [11] using h-p clouds, Babuška and Melenk's work [1] on the partition of unity method, ...
Variational principles and Sobolev-type estimates for generalized interpolation on a Riemannian manifold
 Constr. Approx
, 1999
Kernels for Vector-Valued Functions: a Review
, 2011
Abstract

Cited by 10 (1 self)
Kernel methods are among the most popular techniques in machine learning. From a frequentist/discriminative perspective they play a central role in regularization theory, as they provide a natural choice for the hypothesis space and the regularization functional through the notion of reproducing kernel Hilbert spaces. From a Bayesian/generative perspective they are the key in the context of Gaussian processes, where the kernel function is also known as the covariance function. Traditionally, kernel methods have been used in supervised learning problems with scalar outputs, and indeed there has been a considerable amount of work devoted to designing and learning kernels. More recently there has been an increasing interest in methods that deal with multiple outputs, motivated partly by frameworks like multi-task learning. In this paper, we review different methods to design or learn valid kernel functions for multiple outputs, paying particular attention to the connection between probabilistic and functional methods.
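One standard construction for valid multi-output kernels in this literature is the separable (intrinsic coregionalization) form K(x, x′) = B·k(x, x′), where k is a scalar kernel and B a positive semidefinite matrix coupling the outputs. A minimal sketch, with the coupling matrix, inputs, and lengthscale all invented for illustration:

```python
# Hypothetical sketch of a separable matrix-valued kernel K(x, y) = B * k(x, y)
# for two correlated outputs; B and the inputs below are made-up example values.
import numpy as np

def scalar_kernel(x, y, lengthscale=1.0):
    """Scalar Gaussian (squared-exponential) kernel."""
    return np.exp(-np.sum((x - y)**2) / (2.0 * lengthscale**2))

# Output-coupling (coregionalization) matrix; must be positive semidefinite.
B = np.array([[1.0, 0.5],
              [0.5, 1.0]])

def matrix_valued_kernel(x, y):
    """The 2x2 block K(x, y) covering both outputs at inputs x and y."""
    return B * scalar_kernel(x, y)

# Assemble the full Gram matrix over a few inputs: this is the covariance
# of a vector-valued Gaussian process under the separable construction.
X = [np.array([0.0]), np.array([0.5]), np.array([1.0])]
K = np.block([[matrix_valued_kernel(x, y) for y in X] for x in X])

# A valid kernel must yield a positive semidefinite covariance matrix.
print(np.all(np.linalg.eigvalsh(K) > -1e-10))
```

Because both B and the scalar Gram matrix are positive semidefinite, their Kronecker-structured combination is as well, which is why this construction always produces a valid covariance function.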
Compactly Supported Radial Basis Functions for Shallow Water Equations
 Appl. Math. Comput
Abstract

Cited by 8 (2 self)
This paper presents the application of compactly supported radial basis functions (CSRBFs) in solving a system of shallow water hydrodynamics equations. The proposed scheme is derived from the idea of piecewise polynomial interpolation using a function of Euclidean distance. The compactly supported basis functions consist of a polynomial which is nonzero on [0, 1) and vanishes on [1, ∞). This reduces the original full matrix to a sparse matrix. Operating on the banded matrix system reduces the ill-conditioning of the resultant coefficient matrix that arises from the use of global radial basis functions. To illustrate the computational efficiency and accuracy of the method, the difference between the globally and compactly supported radial basis function schemes is compared. The resulting banded matrix has shown improvement in both ill-conditioning and computational efficiency. The numerical solutions are verified with the observed data. Excellent agreement is shown between the ...
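The sparsity argument in the abstract can be sketched concretely. The snippet below uses one well-known compactly supported radial function (a Wendland function, (1 − r)⁴(4r + 1) for r < 1 and 0 otherwise, positive definite in dimensions up to three) as a stand-in for the paper's CSRBFs; the node layout and support radius are invented for the example:

```python
# Hypothetical sketch: a compactly supported radial function and the sparse
# (banded) interpolation matrix it induces, versus the full matrix a global
# radial basis function would produce.
import numpy as np

def wendland(r):
    """phi(r) = (1 - r)^4 (4r + 1) on [0, 1), zero on [1, inf)."""
    return np.where(r < 1.0, (1.0 - r)**4 * (4.0 * r + 1.0), 0.0)

# Equispaced nodes on a line; the support radius is scaled so that only
# immediate neighbours interact.
nodes = np.linspace(0.0, 4.0, 9)          # spacing 0.5
support = 1.0
r = np.abs(nodes[:, None] - nodes[None, :]) / support
A = wendland(r)

# Most entries are exactly zero, so the matrix is banded and sparse solvers
# (or banded storage) can be used.
print(np.count_nonzero(A), "nonzeros of", A.size)
```

With this layout only pairs of nodes less than one support radius apart contribute, giving a tridiagonal-in-blocks pattern: 25 nonzero entries out of 81.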
Hermite Interpolation with Radial Basis Functions on Spheres
 Adv. Comput. Math
, 1999
Abstract

Cited by 7 (1 self)
We show how conditionally negative definite functions on spheres coupled with strictly completely monotone functions (or functions whose derivative is strictly completely monotone) can be used for Hermite interpolation. The classes of functions thus obtained have the advantage over the strictly positive definite functions studied in [17] that closed-form representations (as opposed to series expansions) are readily available. Furthermore, our functions include the historically significant spherical multiquadrics. Numerical results are also presented. AMS classification: 41A05, 41A63, 42A82. Key words and phrases: spherical interpolation, Hermite interpolation, radial basis functions. 1. Introduction. In 1975 R. Hardy mentioned the possibility of using multiquadric basis functions for Hermite interpolation (see [10], or the survey paper [11]). This problem, however, was not further investigated until the paper [29] by Wu appeared. Since then, the interest in this topic seems to have ...
Recent Developments in Approximation via Positive Definite Functions
 in Approximation Theory IX
, 1998
Abstract

Cited by 3 (0 self)
Positive and conditionally positive definite functions, especially radial basis functions and similar functions for spheres, tori, and even Riemannian manifolds, are of interest because of their well-known ability to synthesize a good surface fit from scattered data. More recently, positive definite basis functions have been employed to analyze scattered data. The methods used to do this involve constructing multiresolution analyses or multilevel approximations. This paper will discuss recent developments in the synthesis and analysis problems, point out new directions in their investigation, and remark on applications. §1. Introduction. Positive definite and conditionally positive definite functions and kernels are used in areas that require fitting a surface to data taken at scattered points in Euclidean space or on some surface, a sphere or torus, say. When the underlying space is Euclidean, radial basis functions (RBFs), e.g., Gaussians, multiquadrics, and thin-plate spline...