Results 1–10 of 73
Locally weighted learning
Artificial Intelligence Review, 1997
Abstract

Cited by 594 (53 self)
This paper surveys locally weighted learning, a form of lazy learning and memory-based learning, and focuses on locally weighted linear regression. The survey discusses distance functions, smoothing parameters, weighting functions, local model structures, regularization of the estimates and bias, assessing predictions, handling noisy data and outliers, improving the quality of predictions by tuning fit parameters, interference between old and new data, implementing locally weighted learning efficiently, and applications of locally weighted learning. A companion paper surveys how locally weighted learning can be used in robot learning and control.
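To make the surveyed idea concrete (this is an editorial illustration, not the authors' implementation), a minimal locally weighted linear regression sketch, assuming a Gaussian weighting function with a fixed bandwidth `tau`:

```python
import numpy as np

def loess_point(x0, X, y, tau=0.3):
    """Locally weighted linear regression prediction at a single query x0.

    Each training point gets a Gaussian kernel weight (the smoothing
    parameter tau plays the role of the bandwidth discussed in the survey),
    and a weighted least-squares line is fit through the data.
    """
    w = np.exp(-(X - x0) ** 2 / (2 * tau ** 2))   # kernel weights
    A = np.stack([np.ones_like(X), X], axis=1)    # design matrix [1, x]
    W = np.diag(w)
    # solve the weighted normal equations  A' W A beta = A' W y
    beta = np.linalg.solve(A.T @ W @ A, A.T @ W @ y)
    return beta[0] + beta[1] * x0

# on noiseless linear data the local fit recovers the line exactly
X = np.linspace(0.0, 1.0, 50)
y = 2 * X + 1
print(round(float(loess_point(0.5, X, y)), 6))  # -> 2.0
```

Because the model is refit at every query point, this is "lazy": all work happens at prediction time, which is exactly the efficiency concern the survey addresses.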
Regularization Theory and Neural Networks Architectures
Neural Computation, 1995
Abstract

Cited by 396 (33 self)
We had previously shown that regularization principles lead to approximation schemes which are equivalent to networks with one layer of hidden units, called Regularization Networks. In particular, standard smoothness functionals lead to a subclass of regularization networks, the well known Radial Basis Functions approximation schemes. This paper shows that regularization networks encompass a much broader range of approximation schemes, including many of the popular general additive models and some of the neural networks. In particular, we introduce new classes of smoothness functionals that lead to different classes of basis functions. Additive splines as well as some tensor product splines can be obtained from appropriate classes of smoothness functionals. Furthermore, the same generalization that extends Radial Basis Functions (RBF) to Hyper Basis Functions (HBF) also leads from additive models to ridge approximation models, containing as special cases Breiman's hinge functions, som...
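A minimal sketch of the one-hidden-layer scheme described here (a Gaussian radial-basis-function regularization network with a small ridge term; the function names and parameters are illustrative, not from the paper):

```python
import numpy as np

def rbf_fit_predict(X, y, Xq, sigma=0.3, lam=1e-9):
    """Gaussian RBF network sketch: one basis function centered at each
    data point; coefficients come from a ridge-regularized linear solve,
    mirroring the regularization-network view of the paper."""
    G = np.exp(-(X[:, None] - X[None, :]) ** 2 / (2 * sigma ** 2))
    c = np.linalg.solve(G + lam * np.eye(len(X)), y)  # regularized coefficients
    Gq = np.exp(-(Xq[:, None] - X[None, :]) ** 2 / (2 * sigma ** 2))
    return Gq @ c

X = np.linspace(-1.0, 1.0, 9)
y = np.sin(np.pi * X)
pred = rbf_fit_predict(X, y, X)
print(bool(np.allclose(pred, y, atol=1e-3)))  # near-interpolation of the data
```

With the regularization parameter `lam` near zero this interpolates; increasing it trades fidelity for smoothness, which is the core of the regularization viewpoint.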
Wavelet Methods For Curve Estimation
1994
Abstract

Cited by 52 (8 self)
The theory of wavelets is a developing branch of mathematics with a wide range of potential applications. Compactly supported wavelets are particularly interesting because of their natural ability to represent data with intrinsically local properties. They are useful for the detection of edges and singularities in image and sound analysis, and for data compression. However, most of the wavelet-based procedures currently available do not explicitly account for the presence of noise in the data. A discussion of how this can be done in the setting of some simple nonparametric curve estimation problems is given. Wavelet analogues of some familiar kernel and orthogonal series estimators are introduced and their finite-sample and asymptotic properties are studied. We discover that there is a fundamental instability in the asymptotic variance of wavelet estimators caused by the lack of translation invariance of the wavelet transform. This is related to the properties of certain lacunary seq...
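A toy sketch of the noise-handling idea the abstract raises (one-level Haar transform with soft thresholding of detail coefficients; a drastically simplified stand-in for the paper's estimators):

```python
import numpy as np

def haar_forward(x):
    """One-level Haar transform: (approximation, detail) coefficients."""
    a = (x[0::2] + x[1::2]) / np.sqrt(2)
    d = (x[0::2] - x[1::2]) / np.sqrt(2)
    return a, d

def haar_inverse(a, d):
    x = np.empty(2 * len(a))
    x[0::2] = (a + d) / np.sqrt(2)
    x[1::2] = (a - d) / np.sqrt(2)
    return x

def denoise(x, thresh):
    """Soft-threshold the detail coefficients, keep the approximation."""
    a, d = haar_forward(x)
    d = np.sign(d) * np.maximum(np.abs(d) - thresh, 0.0)
    return haar_inverse(a, d)

# a piecewise-constant signal passes through almost unchanged, while
# small high-frequency wiggles (treated as noise) are suppressed
x = np.array([1.0, 1.0, 1.0, 1.0, 5.0, 5.0, 5.0, 5.0])
print(denoise(x, 0.5))
```

The compact support of the Haar wavelet is what localizes the edge at the midpoint into a single detail coefficient, illustrating the "intrinsically local" representation the abstract describes.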
Piecewise-polynomial regression trees
Statistica Sinica, 1994
Abstract

Cited by 48 (8 self)
A nonparametric function estimation method called SUPPORT (“Smoothed and Unsmoothed Piecewise-Polynomial Regression Trees”) is described. The estimate is typically made up of several pieces, each piece being obtained by fitting a polynomial regression to the observations in a subregion of the data space. Partitioning is carried out recursively as in a tree-structured method. If the estimate is required to be smooth, the polynomial pieces may be glued together by means of weighted averaging. The smoothed estimate is thus obtained in three steps. In the first step, the regressor space is recursively partitioned until the data in each piece are adequately fitted by a polynomial of a fixed order. Partitioning is guided by analysis of the distributions of residuals and cross-validation estimates of prediction mean square error. In the second step, the data within a neighborhood of each partition are fitted by a polynomial. The final estimate of the regression function is obtained by averaging the polynomial pieces, using smooth weight functions each of which diminishes rapidly to zero outside its associated partition. Estimates of derivatives of the regression function may be ...
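A toy sketch of the first, unsmoothed step (recursive partitioning with a linear fit per region; the split rule and names are illustrative, and the paper's smoothing-by-weighted-averaging step is omitted):

```python
import numpy as np

def fit_tree(X, y, tol=1e-3, min_size=5):
    """Toy SUPPORT-style tree: fit a line; if the residuals are too
    large, split the region at the median and recurse on each half."""
    coef = np.polyfit(X, y, 1)
    resid = y - np.polyval(coef, X)
    if np.max(np.abs(resid)) < tol or len(X) < 2 * min_size:
        return ('leaf', coef)
    cut = float(np.median(X))
    left = X <= cut
    return ('split', cut,
            fit_tree(X[left], y[left], tol, min_size),
            fit_tree(X[~left], y[~left], tol, min_size))

def predict(tree, x):
    if tree[0] == 'leaf':
        return float(np.polyval(tree[1], x))
    _, cut, lo, hi = tree
    return predict(lo, x) if x <= cut else predict(hi, x)

X = np.linspace(0.0, 2.0, 200)
y = np.where(X < 1, X, 2 - X)       # a "tent" function, piecewise linear
tree = fit_tree(X, y)
print(round(predict(tree, 0.25), 3), round(predict(tree, 1.75), 3))
```

One split at the kink suffices here; the unsmoothed estimate is exact on each half but discontinuous in its derivative at the cut, which is what the paper's weighted-averaging step is designed to repair.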
A general asymptotic scheme for inference under order restrictions
2000
Abstract

Cited by 19 (4 self)
Limit distributions for the greatest convex minorant and its derivative are considered for a general class of stochastic processes including partial sum processes and empirical processes, for independent, weakly dependent and long-range dependent data. The results are applied to isotonic regression, isotonic regression after kernel smoothing, estimation of convex regression functions, and estimation of monotone and convex density functions. Various pointwise limit distributions are obtained, and the rate of convergence depends on the self-similarity properties and on the rate of convergence of the processes considered.
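For readers unfamiliar with isotonic regression, a minimal pool-adjacent-violators sketch (the estimator whose asymptotics this paper studies; the implementation details are illustrative):

```python
def pava(y):
    """Pool-adjacent-violators: the isotonic (monotone nondecreasing)
    least-squares fit, equivalently the left derivative of the greatest
    convex minorant of the cumulative-sum diagram."""
    # maintain blocks of (value, weight); pool whenever monotonicity fails
    vals, wts = [], []
    for v in y:
        vals.append(float(v)); wts.append(1.0)
        while len(vals) > 1 and vals[-2] > vals[-1]:
            w = wts[-2] + wts[-1]
            merged = (wts[-2] * vals[-2] + wts[-1] * vals[-1]) / w
            vals[-2:] = [merged]; wts[-2:] = [w]
    out = []
    for v, w in zip(vals, wts):
        out.extend([v] * int(w))
    return out

print(pava([1.0, 3.0, 2.0, 4.0]))  # -> [1.0, 2.5, 2.5, 4.0]
```

The violating pair (3, 2) is pooled to its mean 2.5, which is exactly the greatest-convex-minorant geometry the abstract refers to.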
Local Polynomial Fitting: A Standard for Nonparametric Regression
1993
Abstract

Cited by 17 (4 self)
Among the various nonparametric regression methods, weighted local polynomial fitting is the one which is gaining increasing popularity. This is due to the attractive minimax efficiency of the method and to some further desirable properties such as the automatic incorporation of boundary treatment. In this paper previous results are extended in two directions: in the one-dimensional case, not only local linear fitting is considered but also polynomials of other orders and estimating derivatives. In addition to deriving minimax properties, optimal weighting schemes are derived and the solution obtained at the boundary is discussed in some detail. An equivalent kernel formulation serves as a tool to derive many of these properties. In the higher-dimensional case local linear fitting is considered. Properties in terms of minimax efficiency are derived and optimal weighting ...
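A minimal sketch of local polynomial fitting, including the derivative estimation the paper extends it to (Gaussian weights and bandwidth are illustrative choices, not the paper's optimal scheme):

```python
import numpy as np

def local_poly(x0, X, y, degree=2, h=0.2):
    """Local polynomial fit at x0 via kernel-weighted least squares.

    Because the design is centered at x0, fitted coefficient j estimates
    f^(j)(x0)/j!; in particular beta[1] estimates the first derivative.
    """
    sw = np.sqrt(np.exp(-((X - x0) / h) ** 2 / 2))  # sqrt of kernel weights
    A = np.vander(X - x0, degree + 1, increasing=True)
    beta, *_ = np.linalg.lstsq(A * sw[:, None], y * sw, rcond=None)
    return float(beta[0]), float(beta[1])

X = np.linspace(0.0, 1.0, 200)
y = X ** 2
est, deriv = local_poly(0.5, X, y)
print(round(est, 6), round(deriv, 6))  # exact quadratic: f(0.5)=0.25, f'(0.5)=1
```

Scaling both the design matrix and the response by the square-root weights turns the weighted least-squares problem into an ordinary one, a standard trick for kernel-weighted fits.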
Kernel-based regression and objective nonlinear measures to assess brain functioning
2001
Abstract

Cited by 13 (1 self)
Two different problems of reflecting brain functioning are addressed. This involves human performance monitoring during the signal detection task and depth of anaesthesia monitoring. The common aspect of both problems is to monitor brain activity through the electroencephalogram recordings on the scalp. Although these two problems create only a fractional part of the tasks associated with physiological data analysis the results and the methodology proposed have wider applicability. A theoretical and practical investigation of the different forms of kernel-based nonlinear regression models and efficient kernel-based algorithms for appropriate features extraction is undertaken. The main focus is on solving the problem of providing reduced-variance estimates of the regression coefficients when a linear regression in some kernel-function-defined feature space is assumed. To that end Kernel Principal Component Regression and Kernel Partial Least Squares Regression techniques are proposed. These kernel-based techniques were found to be very efficient when observed data are mapped to a high-dimensional feature space where usually algorithms as simple as their ...
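A bare-bones sketch of Kernel Principal Component Regression, the first of the two techniques named above (in-sample only, with kernel-matrix centering omitted for brevity; names and parameters are illustrative):

```python
import numpy as np

def kpcr_fit(X, y, n_comp=4, sigma=0.5):
    """Kernel PCR sketch: eigendecompose the Gaussian kernel matrix,
    regress y on the leading principal components only, and return the
    fitted values. Truncating to a few components is what reduces the
    variance of the coefficient estimates."""
    K = np.exp(-(X[:, None] - X[None, :]) ** 2 / (2 * sigma ** 2))
    evals, evecs = np.linalg.eigh(K)
    V = evecs[:, ::-1][:, :n_comp]            # leading eigenvectors
    Z = K @ V                                  # component scores
    coef, *_ = np.linalg.lstsq(Z, y, rcond=None)
    return Z @ coef

X = np.linspace(-1.0, 1.0, 40)
y = np.sin(2 * X)
fit = kpcr_fit(X, y)
print(float(np.mean((fit - y) ** 2)) < 1e-2)  # small in-sample error
```

Four components already capture a smooth curve here; the point of the paper's approach is that the truncation controls variance while the kernel supplies the nonlinearity.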
Estimating derivatives for samples of sparsely observed functions, with application to online auction dynamics
Journal of the American Statistical Association, 2009
Abstract

Cited by 11 (4 self)
It is often of interest to recover derivatives of a sample of random functions from sparse and noise-contaminated measurements, especially when the dynamics of the underlying processes is of interest. We propose a novel approach based on estimating derivatives of eigenfunctions and expansions of random functions into their eigenfunctions to obtain a representation for derivatives. In combination with estimates for functional principal component scores for sparse data, this leads to a viable solution of the challenging problem of recovering derivatives for sparsely observed functions. We establish consistency results and demonstrate in simulations that the method is superior to alternative approaches (derivative estimation with random-effects models based on B-spline bases, kernel smoothing, smoothing splines, or P-splines). Our study is motivated by an analysis of bidding histories for eBay auctions, where bids are typically very sparse in the middle and somewhat more frequent near the beginning and end of an auction. We demonstrate the estimation of derivatives of price curves for individual auctions from the sparsely observed bidding histories and also derive a model-free first-order differential equation that applies in the case of Gaussian processes. This provides a data-driven dynamic model which we employ to elucidate auction dynamics.
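A drastically simplified stand-in for the problem setting (smooth one sparse, noisy curve with a low-order polynomial and differentiate the fit; the paper instead differentiates estimated eigenfunctions, which pools information across the whole sample of curves):

```python
import numpy as np

def derivative_from_sparse(t, x, degree=2):
    """Smooth sparse, noisy observations of one curve with a low-order
    polynomial fit, then differentiate the fitted polynomial. This is a
    single-curve simplification, not the paper's eigenfunction method."""
    dcoef = np.polyder(np.polyfit(t, x, degree))
    return lambda s: float(np.polyval(dcoef, s))

rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0.0, 1.0, 8))     # sparse, irregular design points
x = t ** 2 + rng.normal(0.0, 0.01, 8)     # noisy observations of f(t) = t^2
fprime = derivative_from_sparse(t, x)
print(fprime(0.5))                        # true derivative is f'(0.5) = 1.0
```

With only eight noisy points per curve, per-curve smoothing like this is fragile; the paper's contribution is precisely that borrowing strength across curves via functional principal components makes derivative recovery viable in this sparse regime.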