Results 1–7 of 7
Locally weighted learning
 ARTIFICIAL INTELLIGENCE REVIEW, 1997
Abstract

Cited by 499 (53 self)
This paper surveys locally weighted learning, a form of lazy learning and memory-based learning, and focuses on locally weighted linear regression. The survey discusses distance functions, smoothing parameters, weighting functions, local model structures, regularization of the estimates and bias, assessing predictions, handling noisy data and outliers, improving the quality of predictions by tuning fit parameters, interference between old and new data, implementing locally weighted learning efficiently, and applications of locally weighted learning. A companion paper surveys how locally weighted learning can be used in robot learning and control.
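The core computation this survey covers can be illustrated with a minimal sketch of locally weighted linear regression in one dimension, assuming a Gaussian weighting function with bandwidth h (the data, query point, and bandwidth value below are illustrative, not from the paper):

```python
import math

def lwr_predict(xs, ys, x_query, h=0.5):
    """Locally weighted linear regression at a single query point.

    Each training point receives a Gaussian weight based on its distance
    to the query; a weighted least-squares line is then fit and evaluated
    at the query point.
    """
    w = [math.exp(-((x - x_query) ** 2) / (2 * h * h)) for x in xs]
    S = sum(w)
    Sx = sum(wi * x for wi, x in zip(w, xs))
    Sy = sum(wi * y for wi, y in zip(w, ys))
    Sxx = sum(wi * x * x for wi, x in zip(w, xs))
    Sxy = sum(wi * x * y for wi, x, y in zip(w, xs, ys))
    denom = S * Sxx - Sx * Sx
    if abs(denom) < 1e-12:          # degenerate spread: weighted mean
        return Sy / S
    slope = (S * Sxy - Sx * Sy) / denom
    intercept = (Sy - slope * Sx) / S
    return intercept + slope * x_query

# Noise-free linear data y = 2x + 1: the local fit recovers it exactly.
xs = [0.0, 0.5, 1.0, 1.5, 2.0]
ys = [2 * x + 1 for x in xs]
print(lwr_predict(xs, ys, 1.25))   # ≈ 3.5
```

Because a fresh local model is fit per query, there is no global training phase; the smoothing parameter h plays the role the survey discusses under "smoothing parameters".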
Scattered Data Interpolation with Multilevel Splines
 IEEE TRANSACTIONS ON VISUALIZATION AND COMPUTER GRAPHICS, 1997
Abstract

Cited by 117 (10 self)
This paper describes a fast algorithm for scattered data interpolation and approximation. Multilevel B-splines are introduced to compute a C²-continuous surface through a set of irregularly spaced points. The algorithm makes use of a coarse-to-fine hierarchy of control lattices to generate a sequence of bicubic B-spline functions whose sum approaches the desired interpolation function. Large performance gains are realized by using B-spline refinement to reduce the sum of these functions into one equivalent B-spline function. Experimental results demonstrate that high-fidelity reconstruction is possible from a selected set of sparse and irregular samples.
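Implementing the full multilevel B-spline hierarchy is beyond a short sketch, but the scattered-data interpolation problem it solves can be illustrated with a much simpler classical method, Shepard's inverse-distance weighting. This is a stand-in to show the problem setting, not the paper's algorithm, and the sample points below are illustrative:

```python
def idw_interpolate(points, values, query, power=2.0):
    """Shepard inverse-distance-weighted interpolation in 2-D.

    Returns the exact data value at a sample site, and a
    distance-weighted average of all samples elsewhere.
    """
    num, den = 0.0, 0.0
    for (px, py), v in zip(points, values):
        d2 = (px - query[0]) ** 2 + (py - query[1]) ** 2
        if d2 == 0.0:              # query coincides with a sample site
            return v
        w = d2 ** (-power / 2.0)   # weight = 1 / distance**power
        num += w * v
        den += w
    return num / den

pts = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
vals = [0.0, 2.0, 2.0, 4.0]
print(idw_interpolate(pts, vals, (0.5, 0.5)))  # equidistant query → 2.0
print(idw_interpolate(pts, vals, (1.0, 0.0)))  # at a sample → 2.0
```

Unlike the multilevel B-spline approach, this global average costs O(n) per query and yields only C⁰ behavior at the sample sites, which is part of what motivates the hierarchical spline construction.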
Memory-Based Neural Networks for Robot Learning
 Neurocomputing, 1995
Abstract

Cited by 27 (8 self)
This paper explores a memory-based approach to robot learning, using memory-based neural networks to learn models of the task to be performed. Steinbuch and Taylor presented neural network designs to explicitly store training data and do nearest-neighbor lookup in the early 1960s. In this paper their nearest-neighbor network is augmented with a local model network, which fits a local model to a set of nearest neighbors. This network design is equivalent to a statistical approach known as locally weighted regression, in which a local model is formed to answer each query, using a weighted regression in which nearby points (similar experiences) are weighted more than distant points (less relevant experiences). We illustrate this approach by describing how it has been used to enable a robot to learn a difficult juggling task. Keywords: memory-based, robot learning, locally weighted regression, nearest neighbor, local models.
Nonparametric Regression for Learning Nonlinear Transformations
 PRERATIONAL INTELLIGENCE IN STRATEGIES, HIGH-LEVEL PROCESSES AND COLLECTIVE BEHAVIOR
Abstract

Cited by 8 (1 self)
Information processing in animals and artificial movement systems consists of a series of transformations that map sensory signals to intermediate representations, and finally to motor commands. Given the physical and neuroanatomical differences between individuals and the need for plasticity during development, it is highly likely that such transformations are learned rather than preprogrammed by evolution. Such self-organizing processes, capable of discovering nonlinear dependencies between different groups of signals, are one essential part of prerational intelligence. While neural network algorithms seem to be the natural choice when searching for solutions for learning transformations, this paper takes a more careful look at which types of neural networks are actually suited to the requirements of an autonomous learning system. The approach that we pursue is guided by recent developments in learning theory that have linked neural network learning to well-established statistical theories. In particular, this new statistical understanding has given rise to the development of neural network systems that are directly based on statistical methods. One family of such methods stems from nonparametric regression. This paper compares nonparametric learning with its more widely used parametric counterparts in a non-technical fashion, and investigates how these two families differ in their properties and their applicability. We argue that nonparametric neural networks offer a set of characteristics that makes them a very promising candidate for online learning in autonomous systems.
Nonparametric regression for learning
 CENTER FOR INTERDISCIPLINARY RESEARCH, UNIVERSITY OF BIELEFELD, 1994
Abstract

Cited by 4 (0 self)
In recent years, learning theory has been increasingly influenced by the fact that many learning algorithms have, at least in part, a comprehensive interpretation in terms of well-established statistical theories. Furthermore, with little modification, several statistical methods can be cast directly as learning algorithms. One family of such methods stems from nonparametric regression. This paper compares nonparametric learning with its more widely used parametric counterparts and investigates how these two families differ in their properties and their applicability.
Acoustic to Articulatory Mapping using Memory Based Regression and Trajectory Smoothing
Abstract
This paper investigates memory-based regression for the task of acoustic-to-articulatory inversion. In memory-based regression, a local model is built for each test sample using its k nearest neighbors, and an articulatory vector is estimated. We investigate different regression models, from a basic average estimate to a polynomial estimate. The paper also investigates the importance of optimizing the number of nearest neighbors (k) with respect to the error function; for this purpose, a neural network is used to estimate the number of neighbors used for regression. This method is used to estimate the articulatory values and their first two derivatives, which are then used at the utterance level to temporally smooth the estimated articulatory parameters using the Trajectory Likelihood Maximization method. We apply this method to the MOCHA database, which consists of 167,000 acoustic-articulatory frames. These steps combined, using 15-fold cross-validation on MOCHA, show that memory-based regression with an optimized number of neighbors, followed by trajectory smoothing, can decrease the mean square error below that of state-of-the-art methods.
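The basic estimator described here — store all training frames, find the k nearest neighbors of a test sample, and average their targets — can be sketched as follows. The 1-D features, tiny dataset, and grid-search choice of k are illustrative; the paper's polynomial local models and neural-network k-predictor are not reproduced:

```python
def knn_regress(train_x, train_y, query, k):
    """Memory-based regression: average the targets of the k nearest
    stored samples (the 'basic average estimate')."""
    ranked = sorted(zip(train_x, train_y), key=lambda p: abs(p[0] - query))
    return sum(y for _, y in ranked[:k]) / k

def best_k(train_x, train_y, val_x, val_y, k_candidates):
    """Pick the k that minimizes mean squared error on held-out data."""
    def mse(k):
        errs = [(knn_regress(train_x, train_y, x, k) - y) ** 2
                for x, y in zip(val_x, val_y)]
        return sum(errs) / len(errs)
    return min(k_candidates, key=mse)

train_x = [0.0, 1.0, 2.0, 3.0, 4.0]
train_y = [0.0, 1.0, 2.0, 3.0, 4.0]      # identity mapping as toy data
val_x, val_y = [0.5, 2.5], [0.5, 2.5]
k = best_k(train_x, train_y, val_x, val_y, [1, 2, 3])
print(k, knn_regress(train_x, train_y, 2.5, k))
```

On this toy identity mapping the validation search settles on k = 2, since averaging the two bracketing neighbors reconstructs the held-out targets exactly.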
The Local Geoid Model of Cameroon: CGM05
Abstract
Abstract: This paper deals with geoid determination in Cameroon by a gravimetric solution. A number of data files were compiled for this work, containing about 62,000 points on land and ocean areas and also including data derived from satellite altimetry. A hybrid global geopotential model (EGM-GGM) supplied the longer-wavelength components of this geoid model, CGM05. This global model is obtained by adjusting the GRACE model GGM02C to degree and order 360 using the harmonic coefficients of the model EGM96 beyond the maximal degree 200 of GGM02C. The medium-wavelength components were computed from the best gridded residual gravity anomalies by integration in Stokes' formula. The digital terrain model GLOBE contributed the short-wavelength components. The residual terrain model (RTM) approach was applied to first determine a quasi-geoid model; this intermediate surface was converted to the geoid using a grid of simple Bouguer gravity anomalies. The validation of CGM05 is based on comparisons to global and regional geoids. A GPS/levelling geometric geoid computed in a small part of the target area shows that the absolute accuracy of this local geoid model is 14 cm; after a four-parameter fitting to the GPS/levelled reference surface, this absolute accuracy is reduced to 11 cm.