Results 1 – 9 of 9
Locally weighted learning
Artificial Intelligence Review, 1997
Cited by 594 (53 self)
Abstract:
This paper surveys locally weighted learning, a form of lazy learning and memory-based learning, and focuses on locally weighted linear regression. The survey discusses distance functions, smoothing parameters, weighting functions, local model structures, regularization of the estimates and bias, assessing predictions, handling noisy data and outliers, improving the quality of predictions by tuning fit parameters, interference between old and new data, implementing locally weighted learning efficiently, and applications of locally weighted learning. A companion paper surveys how locally weighted learning can be used in robot learning and control.
Memory-Based Neural Networks For Robot Learning
Neurocomputing, 1995
Cited by 31 (8 self)
Abstract:
This paper explores a memory-based approach to robot learning, using memory-based neural networks to learn models of the task to be performed. Steinbuch and Taylor presented neural network designs to explicitly store training data and do nearest neighbor lookup in the early 1960s. In this paper their nearest neighbor network is augmented with a local model network, which fits a local model to a set of nearest neighbors. This network design is equivalent to a statistical approach known as locally weighted regression, in which a local model is formed to answer each query, using a weighted regression in which nearby points (similar experiences) are weighted more than distant points (less relevant experiences). We illustrate this approach by describing how it has been used to enable a robot to learn a difficult juggling task. Keywords: memory-based, robot learning, locally weighted regression, nearest neighbor, local models. 1 Introduction: An important problem in motor learning is approxim...
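The query-answering procedure this abstract describes can be sketched in a few lines. This is an illustrative reconstruction of generic locally weighted linear regression, not code from the paper; the Gaussian kernel and the bandwidth value are assumptions.

```python
import numpy as np

def locally_weighted_regression(X, y, query, bandwidth=1.0):
    """Answer one query by fitting a linear model in which nearby
    training points receive larger weights than distant ones."""
    # Gaussian kernel weights from each point's distance to the query.
    dists = np.linalg.norm(X - query, axis=1)
    w = np.exp(-dists ** 2 / (2 * bandwidth ** 2))
    # Augment with a bias column so the local model has an intercept.
    Xa = np.hstack([X, np.ones((X.shape[0], 1))])
    qa = np.append(query, 1.0)
    # Weighted least squares: solve (Xa^T W Xa) beta = Xa^T W y.
    W = np.diag(w)
    beta = np.linalg.solve(Xa.T @ W @ Xa, Xa.T @ W @ y)
    return qa @ beta

# Noisy samples of y = 2x; the local fit near x = 0.5 should be near 1.0.
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(50, 1))
y = 2 * X[:, 0] + rng.normal(0, 0.01, size=50)
pred = locally_weighted_regression(X, y, np.array([0.5]), bandwidth=0.3)
```

A fresh weighted fit is solved for every query, which is what makes the method "lazy": no global model is ever built.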
On a Universal Strong Law of Large Numbers for Conditional Expectations
Bernoulli, 1998
Cited by 3 (0 self)
Abstract:
A number of generalizations of the Kolmogorov Strong Law of Large Numbers (SLLN) are known, including convex combinations of r.v.'s with random coefficients. In the case of pairs of i.i.d. r.v.'s (X 1 ...
k. Results indicate that this procedure is very effective in estimating good feature weights (Table 4.8). Particularly the results obtained in the
1994
Abstract:
...distance-based algorithms to compute distances that may not reflect the optimal distance between two data points. For example, two input features may be identical. The effect of these two identical input features is equivalent to a single feature with twice the weight during distance calculations. The feature's larger weight is only justified if it contains more information with respect to the desired outputs than the other features. Otherwise the larger weight will result in a degradation in classification accuracy. Decorrelation of input features may therefore improve the classification accuracy of distance-based ... Table 4.8. The performance of the weighted vote kNN algorithm without feature weights (kNNwv), with computed feature weights (kNNwv FWMI), or learned feature weights (kNNwv FWVSM). ...
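The point about duplicated features can be made concrete: duplicating a feature in a Euclidean distance is exactly equivalent to giving that feature weight 2. This is a minimal illustration with made-up vectors, not the thesis's kNNwv implementation.

```python
import numpy as np

def weighted_distance(a, b, w):
    """Euclidean distance with a per-feature weight vector w."""
    return np.sqrt(np.sum(w * (a - b) ** 2))

a = np.array([1.0, 2.0])
b = np.array([3.0, 5.0])

# Duplicating feature 0 ...
a_dup = np.array([1.0, 1.0, 2.0])
b_dup = np.array([3.0, 3.0, 5.0])
d_dup = weighted_distance(a_dup, b_dup, np.ones(3))

# ... gives the same distance as weighting feature 0 by 2.
d_w = weighted_distance(a, b, np.array([2.0, 1.0]))
```

Both calls return sqrt(17), which is why decorrelating or explicitly reweighting features can change, and potentially improve, a distance-based classifier's behavior.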
AND
Abstract:
Using two definitions of the conditional empirical processes we obtain some approximations for these processes. We also prove the functional law of the iterated logarithm for the conditional processes. Our results say that the asymptotic behaviors of the conditional and unconditional empirical processes are very similar. © 1988 Academic Press, Inc.
Investigating the Use of Nearest-Neighbor Interpolation for Cancer Research
1997
Abstract:
We investigate how far interpolation mechanisms based on the nearest-neighbor rule (NNR) can support cancer research. The main objective is to use the NNR to predict the likelihood of tumorigenesis based on given risk factors. By using a genetic algorithm to optimize the parameters of the nearest-neighbor prediction, the performance of this interpolation method can be improved substantially. Furthermore, it is possible to detect risk factors which are of little or no relevance to tumorigenesis. Our preliminary studies demonstrate that NNR-based interpolation is a simple tool that nevertheless has enough potential to be seriously considered for cancer research or related research. 1 Introduction: In spite of tremendous progress in evaluating the underlying parameters of bioscientific mechanisms, a lot of work remains to be done to understand the complexity of the functional background. In this context, one of the biggest challenges in biological research is to find the key for tumo...
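The nearest-neighbor interpolation the abstract describes can be sketched as follows. This is a generic k-NN averaging predictor, not the authors' system; the `feature_weights` parameter stands in for the kind of per-feature parameters a genetic algorithm could tune (a weight near zero would flag a risk factor as barely relevant).

```python
import numpy as np

def nnr_predict(X, y, query, k=3, feature_weights=None):
    """k-nearest-neighbor interpolation: predict the mean target of
    the k training points closest to the query.  feature_weights are
    the tunable per-feature parameters an optimizer could adjust."""
    w = np.ones(X.shape[1]) if feature_weights is None else feature_weights
    dists = np.sqrt(((X - query) ** 2 * w).sum(axis=1))
    nearest = np.argsort(dists)[:k]
    return y[nearest].mean()

# Toy 1-D data: the 3 nearest points to x = 1 are x = 0, 1, 2.
X = np.array([[0.0], [1.0], [2.0], [10.0]])
y = np.array([0.0, 1.0, 2.0, 10.0])
pred = nnr_predict(X, y, np.array([1.0]), k=3)
```

With this structure, "optimizing the parameters of the nearest-neighbor prediction" amounts to searching over `k` and `feature_weights` for the combination that minimizes prediction error on held-out cases.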
SMOOTHING SPATIAL DATA BY ESTIMATING MEAN LOCAL VARIANCE
Abstract: Approved for public release; distribution is unlimited.