Results 1-10 of 119
Locally weighted learning
Artificial Intelligence Review, 1997
Cited by 455 (52 self)
This paper surveys locally weighted learning, a form of lazy learning and memory-based learning, and focuses on locally weighted linear regression. The survey discusses distance functions, smoothing parameters, weighting functions, local model structures, regularization of the estimates and bias, assessing predictions, handling noisy data and outliers, improving the quality of predictions by tuning fit parameters, interference between old and new data, implementing locally weighted learning efficiently, and applications of locally weighted learning. A companion paper surveys how locally weighted learning can be used in robot learning and control.
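The core computation this survey describes, fitting a weighted linear model around each query point, can be sketched in a few lines. The function name `lwlr`, the Gaussian weighting function, and the smoothing parameter `tau` below are illustrative assumptions, not details taken from the paper:

```python
import numpy as np

def lwlr(x_query, X, y, tau=0.3):
    """Locally weighted linear regression: fit a weighted least-squares
    line around x_query and return its prediction there."""
    Xa = np.column_stack([np.ones_like(X), X])            # intercept + slope
    w = np.exp(-((X - x_query) ** 2) / (2.0 * tau ** 2))  # Gaussian weights
    WXa = Xa * w[:, None]
    # Weighted normal equations: (Xa' W Xa) beta = Xa' W y
    beta = np.linalg.solve(Xa.T @ WXa, WXa.T @ y)
    return beta[0] + beta[1] * x_query

# On exactly linear data the local fit recovers the line, so the
# prediction at any query point matches the true value.
X = np.linspace(0.0, 1.0, 21)
y = 2.0 * X + 1.0
pred = lwlr(0.5, X, y)  # true value: 2.0
```

Distance function, weighting function, and local model structure are exactly the design choices the survey enumerates; each line above would change under a different choice.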
Data-driven bandwidth selection in local polynomial fitting: variable bandwidth and spatial adaptation
1993
Local Regression: Automatic Kernel Carpentry
Statistical Science, 1993
Cited by 108 (2 self)
A kernel smoother is an intuitive estimate of a regression function or conditional expectation; at each point x_0 the estimate of E(Y | x_0) is a weighted mean of the sample Y_i, with observations close to x_0 receiving the largest weights. Unfortunately this simplicity has flaws. At the boundary of the predictor space, the kernel neighborhood is asymmetric and the estimate may have substantial bias. Bias can be a problem in the interior as well if the predictors are nonuniform or if the regression function has substantial curvature. These problems are particularly severe when the predictors are multidimensional. A variety of kernel modifications have been proposed to provide approximate and asymptotic adjustment for these biases. Such methods generally place substantial restrictions on the regression problems that can be considered; in unfavorable situations, they can perform very poorly. Moreover, the necessary modifications are very difficult to implement in the multidimensional...
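The weighted-mean estimate and its boundary-bias flaw can be demonstrated directly. This sketch assumes a Gaussian kernel (the article's discussion is kernel-generic) and compares the error of the smoother at an interior point against the error at the boundary of the predictor space:

```python
import numpy as np

def kernel_smooth(x0, X, y, h=0.2):
    """Kernel smoother: estimate E(Y | x0) as a Gaussian-weighted mean of
    the sample y_i, with observations close to x0 weighted most heavily."""
    w = np.exp(-0.5 * ((X - x0) / h) ** 2)
    return np.sum(w * y) / np.sum(w)

# Linear truth m(x) = x on [0, 1]: the interior estimate is essentially
# exact (symmetric neighborhood), while at the boundary the one-sided
# neighborhood pulls the weighted mean inward, producing bias.
X = np.linspace(0.0, 1.0, 101)
y = X.copy()
interior_err = abs(kernel_smooth(0.5, X, y) - 0.5)
boundary_err = abs(kernel_smooth(0.0, X, y) - 0.0)
```

The boundary error here is roughly the mean of a half-Gaussian of scale h, which is the asymmetric-neighborhood bias the abstract refers to; a local linear fit (as in the entries above and below) removes it.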
Smoothing by Local Regression: Principles and Methods
Cited by 90 (1 self)
In this paper we describe two adaptive procedures, one based on C_p and the other based on cross-validation. Still, when we have a final adaptive fit in hand, it is critical to subject it to graphical diagnostics to study its performance. The important implication of these statements is that the above choices must be tailored to each data set in practice; that is, the choices represent a modeling of the data. It is widely accepted that in global parametric regression there are a variety of choices that must be made (for example, the parametric family to be fitted and the form of the distribution of the response) and that we must rely on our knowledge of the mechanism generating the data, on model selection diagnostics, and on graphical diagnostic methods to make the choices. The same is true for smoothing. Cleveland (1993) presents many examples of this modeling process. For example, in one application, oxides of nitrogen from an automobile engine are fitted to the equivalence ratio, E, of the fuel and the compression ratio, C, of the engine. Coplots show that it is reasonable to use quadratics as the local parametric family but with the added assumption that given E the fitted f...
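Of the two adaptive procedures mentioned, cross-validation is the simpler to sketch. The leave-one-out selector below is a generic illustration, not the paper's procedure: it scores a few candidate smoothing parameters for a kernel smoother and keeps the one with the smallest prediction error:

```python
import numpy as np

def nw(x0, X, y, h):
    """Nadaraya-Watson estimate at x0 with a Gaussian kernel of bandwidth h."""
    w = np.exp(-0.5 * ((X - x0) / h) ** 2)
    return np.sum(w * y) / np.sum(w)

def loo_cv_score(X, y, h):
    """Leave-one-out cross-validation: predict each y_i from the other
    points and return the mean squared prediction error."""
    n = len(X)
    errs = []
    for i in range(n):
        mask = np.arange(n) != i
        errs.append((y[i] - nw(X[i], X[mask], y[mask], h)) ** 2)
    return float(np.mean(errs))

rng = np.random.default_rng(0)
X = np.linspace(0.0, 1.0, 60)
y = np.sin(2 * np.pi * X) + 0.1 * rng.standard_normal(60)

bandwidths = [0.01, 0.05, 0.1, 0.3]          # candidate smoothing parameters
scores = [loo_cv_score(X, y, h) for h in bandwidths]
best_h = bandwidths[int(np.argmin(scores))]  # data-driven choice
```

As the abstract stresses, such an automatic choice is a starting point, not a substitute for graphical diagnostics of the final fit.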
Generalized Likelihood Ratio Statistics And Wilks Phenomenon
2000
Cited by 78 (20 self)
this paper. We introduce the generalized likelihood statistics to overcome the drawbacks of nonparametric maximum likelihood ratio statistics. A new Wilks phenomenon is unveiled. We demonstrate that a class of the generalized likelihood statistics based on some appropriate nonparametric estimators are asymptotically distribution-free and follow...
Efficient Estimation of Conditional Variance Functions in Stochastic Regression
Biometrika, 1998
Cited by 77 (7 self)
this paper is to derive an efficient fully adaptive procedure for estimating...
Multivariate Local Polynomial Regression For Time Series: Uniform Strong Consistency And Rates
J. Time Ser. Anal., 1996
Cited by 65 (2 self)
Local high-order polynomial fitting is employed for the estimation of the multivariate regression function m(x_1, ..., x_d) = E[ψ(Y_d) | X_1 = x_1, ..., X_d = x_d], and of its partial derivatives, for stationary random processes {Y_i, X_i}. The function ψ may be selected to yield estimates of the conditional mean, conditional moments and conditional distributions. Uniform strong consistency over compact subsets of R^d, along with rates, are established for the regression function and its partial derivatives for strongly mixing processes. Short Title: Multivariate Regression Estimation. Key Words: multivariate regression estimation, local polynomial fitting, mixing processes, uniform strong consistency, rates of convergence. AMS (1991) Subject Classification: 62G07, 62H12, 62M09. This work was supported by the Office of Naval Research under Grant N00014-90-J-1175. 1. Introduction. Let {Y_i, X_i} be jointly stationary processes on...
Local polynomial kernel regression for generalized linear models and quasi-likelihood functions
Journal of the American Statistical Association, 90, 1995
Cited by 57 (7 self)
were introduced as a means of extending the techniques of ordinary parametric regression to several commonly used regression models arising from non-normal likelihoods. Typically these models have a variance that depends on the mean function. However, in many cases the likelihood is unknown, but the relationship between mean and variance can be specified. This has led to the consideration of quasi-likelihood methods, where the conditional log-likelihood is replaced by a quasi-likelihood function. In this article we investigate the extension of the nonparametric regression technique of local polynomial fitting with a kernel weight to these more general contexts. In the ordinary regression case local polynomial fitting has been seen to possess several appealing features in terms of intuitive and mathematical simplicity. One noteworthy feature is the better performance near the boundaries compared to the traditional kernel regression estimators. These properties are shown to carry over to the generalized linear model and quasi-likelihood model. The end result is a class of kernel-type estimators for smoothing in quasi-likelihood models. These estimators can be viewed as a straightforward generalization of the usual parametric estimators. In addition, their simple asymptotic distributions allow for simple interpretation...
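The kernel-weighted likelihood idea can be sketched for the Bernoulli case: at each point, a local-linear logistic fit is obtained by Newton (IRLS) steps on a kernel-weighted log-likelihood. The names, the Gaussian kernel, and the fixed iteration count are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def local_logistic(x0, X, y, h=0.8, n_iter=25):
    """Local-linear logistic regression: maximize a kernel-weighted
    Bernoulli log-likelihood at x0 via Newton (IRLS) steps and return
    the fitted probability at x0."""
    Z = np.column_stack([np.ones_like(X), X - x0])       # local linear basis
    k = np.exp(-0.5 * ((X - x0) / h) ** 2)               # kernel weights
    beta = np.zeros(2)
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-Z @ beta))
        grad = Z.T @ (k * (y - p))                       # weighted score
        hess = Z.T @ (Z * (k * p * (1.0 - p))[:, None])  # weighted information
        beta += np.linalg.solve(hess + 1e-8 * np.eye(2), grad)
    return 1.0 / (1.0 + np.exp(-beta[0]))                # fitted p(x0)

rng = np.random.default_rng(1)
X = np.linspace(-2.0, 2.0, 200)
p_true = 1.0 / (1.0 + np.exp(-2.0 * X))   # true success probability
y = (rng.random(200) < p_true).astype(float)
p_hat = local_logistic(0.0, X, y)         # true value at x0 = 0 is 0.5
```

Replacing the Bernoulli variance p(1 - p) with any specified mean-variance relationship turns the same loop into a local quasi-likelihood fit, which is the generalization the abstract describes.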
Functional-coefficient Regression Models for Nonlinear Time Series
Journal of the American Statistical Association, 1998
Cited by 42 (10 self)
We apply the local linear regression technique for estimation of functional-coefficient regression models for time series data. The models include threshold autoregressive models (Tong 1990) and functional-coefficient autoregressive models (Chen and Tsay 1993) as special cases, but with added advantages such as depicting finer structure of the underlying dynamics and better post-sample forecasting performance. We have also proposed a new bootstrap test for the goodness of fit of models and a bandwidth selector based on newly defined cross-validatory estimation for the expected forecasting errors. The proposed methodology is data-analytic and is of appreciable flexibility to analyze complex and multivariate nonlinear structures without suffering from the "curse of dimensionality". The asymptotic properties of the proposed estimators are investigated under the α-mixing condition. Both simulated and real data examples are used for illustration. Key Words: α-mixing; asymptotic normali...
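The local linear estimation scheme for functional coefficients can be sketched as follows: at each value u of the smoothing variable, regress y on the regressors and on their interactions with (U - u), with kernel weights in U. This is a generic illustration under assumed names, not the authors' code:

```python
import numpy as np

def func_coef_fit(u, U, X, y, h=0.1):
    """Local linear fit of a functional-coefficient model
    y_t = a_1(U_t) x_{t1} + ... + a_p(U_t) x_{tp} + e_t.
    Regress y on [X, X * (U - u)] with Gaussian kernel weights in U;
    the first p coefficients estimate (a_1(u), ..., a_p(u))."""
    k = np.exp(-0.5 * ((U - u) / h) ** 2)     # kernel weights in U
    D = np.hstack([X, X * (U - u)[:, None]])  # local linear design at u
    DW = D * k[:, None]
    theta = np.linalg.solve(D.T @ DW, DW.T @ y)
    return theta[: X.shape[1]]

rng = np.random.default_rng(2)
n = 400
U = rng.uniform(0.0, 1.0, n)                  # smoothing variable
X = rng.standard_normal((n, 2))               # regressors
a1, a2 = np.sin(np.pi * U), U ** 2            # true coefficient functions
y = a1 * X[:, 0] + a2 * X[:, 1] + 0.1 * rng.standard_normal(n)

a_hat = func_coef_fit(0.5, U, X, y)           # true a(0.5) = (1.0, 0.25)
```

Because only the single variable U is smoothed over while the coefficients enter linearly, the fit avoids the "curse of dimensionality" mentioned in the abstract.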