Results 1–9 of 9
Regularization on graphs with function-adapted diffusion processes
, 2006
Abstract

Cited by 23 (5 self)
Harmonic analysis and diffusion on discrete data has been shown to lead to state-of-the-art algorithms for machine learning tasks, especially in the context of semi-supervised and transductive learning. The success of these algorithms rests on the assumption that the function(s) to be studied (learned, interpolated, etc.) are smooth with respect to the geometry of the data. In this paper we present a method for modifying the given geometry so the function(s) to be studied are smoother with respect to the modified geometry, and thus more amenable to treatment using harmonic analysis methods. Among the many possible applications, we consider the problems of image denoising and transductive classification. In both settings, our approach improves on standard diffusion-based methods.
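The geometry-modification idea in this abstract can be illustrated with a minimal sketch (not the authors' algorithm — all function names and parameters below are illustrative assumptions): build a diffusion operator whose edge weights depend on both node positions and current function values, so that diffusion smooths within regions of similar values but not across sharp transitions.

```python
import numpy as np

def function_adapted_diffusion(f, x, sigma_x=0.1, sigma_f=0.5, steps=10):
    """Toy sketch: diffuse f over points x with function-adapted weights.

    The affinity depends on both the geometry (x) and the function values (f),
    so diffusion smooths along regions where f is similar rather than across
    sharp transitions -- the spirit of modifying the geometry to make f smoother.
    """
    W = np.exp(-((x[:, None] - x[None, :]) ** 2) / sigma_x ** 2
               - ((f[:, None] - f[None, :]) ** 2) / sigma_f ** 2)
    P = W / W.sum(axis=1, keepdims=True)  # row-stochastic diffusion operator
    g = f.copy()
    for _ in range(steps):
        g = P @ g                          # one diffusion step
    return g
```

Applied to a noisy step function, the function-adapted weights suppress diffusion across the jump, so the step is preserved while the noise is averaged out; a purely geometric kernel would blur the edge as well.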
Polynomial operators and local smoothness classes on the unit interval
 Journal of Approximation Theory
Abstract

Cited by 10 (6 self)
We prove the existence of quadrature formulas exact for integrating high-degree polynomials with respect to Jacobi weights, based on scattered data on the unit interval. We also obtain a characterization of local Besov spaces using the coefficients of a tight frame expansion.
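A toy illustration of such a quadrature formula (a sketch under simplifying assumptions, not the paper's construction): for the Legendre weight, i.e. the Jacobi case α = β = 0 on [-1, 1], weights for scattered nodes can be obtained by matching polynomial moments through a Vandermonde system, giving a rule exact for polynomials up to the chosen degree.

```python
import numpy as np

def scattered_quadrature_weights(nodes, degree):
    """Weights w_i so that sum_i w_i * x_i**k == integral of x**k over [-1, 1]
    for k = 0..degree (Legendre weight, the Jacobi case alpha = beta = 0).

    Illustrative sketch only: the nodes are arbitrary scattered points,
    not a structured grid.
    """
    # Rows of V are the monomial powers evaluated at the scattered nodes.
    V = np.vander(nodes, degree + 1, increasing=True).T
    # Exact moments of x**k on [-1, 1]: 2/(k+1) for even k, 0 for odd k.
    moments = np.array([(1 - (-1) ** (k + 1)) / (k + 1)
                        for k in range(degree + 1)])
    # Min-norm solution; exactness holds whenever V has full row rank,
    # which is guaranteed for distinct nodes with enough points.
    w, *_ = np.linalg.lstsq(V, moments, rcond=None)
    return w
```

With more nodes than moment conditions the system is underdetermined, and the least-squares call returns the minimum-norm weight vector satisfying all moment conditions exactly (distinct nodes make the Vandermonde rows independent).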
Quadrature formulas for functions defined on Riemannian manifolds
Abstract
In many practical applications, for example document analysis, semi-supervised learning, and inverse problems, one is confronted with functions defined on a (Riemannian) manifold embedded in a high-dimensional ambient space. These functions have to be approximated using only sampled values. Due to several restrictions (experimental setup, etc.) we can hardly assume that the sampling nodes are located on a regular grid. This means we have to come up with an approximation process which, on the one hand, can work with scattered data and, on the other hand, has a sufficiently good approximation rate. In this talk we will address both problems. We will show under which conditions well-localized kernels on a manifold can be constructed and how such kernels can be used to solve the aforementioned problem. The talk is based on joint work with Hrushikesh N. Mhaskar.
References:
[1] F. Filbir, H. N. Mhaskar, A quadrature formula for diffusion polynomials corresponding to a generalized heat kernel, submitted 2009.
[2] F. Filbir, H. N. Mhaskar, J. Prestin, On a filter for exponentially localized kernels
Eignets for function approximation on manifolds
, 2009
Abstract
Let X be a compact, smooth, connected Riemannian manifold without boundary, and let G: X × X → ℝ be a kernel. Analogous to a radial basis function network, an eignet is an expression of the form ∑_{j=1}^{M} a_j G(∘, y_j), where a_j ∈ ℝ, y_j ∈ X, 1 ≤ j ≤ M. We describe a deterministic, universal algorithm for constructing an eignet for approximating functions in L^p(µ; X) for a general class of measures µ and kernels G. Our algorithm yields linear operators. Using the minimal separation amongst the centers y_j as the cost of approximation, we give modulus-of-smoothness estimates for the degree of approximation by our eignets, and show by means of a converse theorem that these are the best possible for every individual function. We also give estimates on the coefficients a_j in terms of the norm of the eignet. Finally, we demonstrate that if any sequence of eignets satisfies the optimal estimates for the degree of approximation of a smooth function, measured in terms of the minimal separation, then the derivatives of the eignets also approximate the corresponding derivatives of the target function in an optimal manner.
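The eignet form ∑_{j=1}^{M} a_j G(∘, y_j) can be made concrete with a short sketch. Note the hedge: the paper's algorithm is deterministic and quadrature-based, whereas this sketch simply fits the coefficients by least squares with a stand-in Gaussian kernel; all names and parameters here are illustrative assumptions.

```python
import numpy as np

def gaussian_kernel(x, y, eps=1.0):
    # Stand-in for the kernel G; the paper allows a general class of kernels.
    return np.exp(-eps * np.linalg.norm(x - y) ** 2)

def fit_eignet(centers, samples, values, eps=1.0):
    """Solve for coefficients a_j so that sum_j a_j G(x_i, y_j) ~= f(x_i)."""
    G = np.array([[gaussian_kernel(x, y, eps) for y in centers]
                  for x in samples])
    a, *_ = np.linalg.lstsq(G, values, rcond=None)
    return a

def eval_eignet(a, centers, x, eps=1.0):
    """Evaluate the eignet sum_j a_j G(x, y_j) at a point x."""
    return sum(aj * gaussian_kernel(x, yj, eps) for aj, yj in zip(a, centers))
```

As a manifold example, one can take scattered points on the unit circle in ℝ² as both samples and centers and fit the coordinate function f(x, y) = x; the resulting eignet reproduces the sampled values to numerical precision.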
A quadrature formula for diffusion polynomials corresponding to a generalized heat kernel
Applied and Computational Harmonic Analysis (www.elsevier.com/locate/acha)