Results 1–10 of 29
Ideal spatial adaptation by wavelet shrinkage
 Biometrika
, 1994
"... With ideal spatial adaptation, an oracle furnishes information about how best to adapt a spatially variable estimator, whether piecewise constant, piecewise polynomial, variable knot spline, or variable bandwidth kernel, to the unknown function. Estimation with the aid of an oracle o ers dramatic ad ..."
Abstract

Cited by 838 (4 self)
With ideal spatial adaptation, an oracle furnishes information about how best to adapt a spatially variable estimator, whether piecewise constant, piecewise polynomial, variable-knot spline, or variable-bandwidth kernel, to the unknown function. Estimation with the aid of an oracle offers dramatic advantages over traditional linear estimation by nonadaptive kernels; however, it is a priori unclear whether such performance can be obtained by a procedure relying on the data alone. We describe a new principle for spatially adaptive estimation: selective wavelet reconstruction. We show that variable-knot spline fits and piecewise-polynomial fits, when equipped with an oracle to select the knots, are not dramatically more powerful than selective wavelet reconstruction with an oracle. We develop a practical spatially adaptive method, RiskShrink, which works by shrinkage of empirical wavelet coefficients. RiskShrink mimics the performance of an oracle for selective wavelet reconstruction as well as it is possible to do so. A new inequality in multivariate normal decision theory, which we call the oracle inequality, shows that attained performance differs from ideal performance by at most a factor of 2 log n, where n is the sample size. Moreover, no estimator can give a better guarantee than this. Within the class of spatially adaptive procedures, RiskShrink is essentially optimal. Relying only on the data, it comes within a factor log² n of the performance of piecewise-polynomial and variable-knot spline methods equipped with an oracle. In contrast, it is unknown how or if piecewise-polynomial methods could be made to function this well when denied access to an oracle and forced to rely on data alone.
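The shrinkage rule the abstract refers to can be sketched in a few lines. Below is a minimal pure-Python illustration of soft-thresholding empirical wavelet coefficients at the universal threshold σ√(2 log n), the quantity behind the 2 log n factor in the oracle inequality. Note that RiskShrink proper uses minimax-optimal thresholds, so this is the simpler universal-threshold rule, and the coefficient values are made up for illustration:

```python
import math

def soft_threshold(coeffs, sigma):
    """Soft-threshold empirical wavelet coefficients at the universal
    threshold t = sigma * sqrt(2 log n): shrink each coefficient toward
    zero by t, killing those whose magnitude is below t."""
    n = len(coeffs)
    t = sigma * math.sqrt(2.0 * math.log(n))
    return [math.copysign(max(abs(c) - t, 0.0), c) for c in coeffs]

# coefficients mostly near zero (noise) plus a few large "signal" ones
noisy = [0.3, -0.2, 5.0, 0.1, -4.0, 0.05, 0.4, -0.25]
shrunk = soft_threshold(noisy, sigma=0.5)
```

Small coefficients are set exactly to zero, while large ones survive with their magnitude reduced by the threshold, which is what makes the reconstruction spatially adaptive: smooth regions are denoised aggressively while sharp features are kept.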
Locally weighted learning
 ARTIFICIAL INTELLIGENCE REVIEW
, 1997
"... This paper surveys locally weighted learning, a form of lazy learning and memorybased learning, and focuses on locally weighted linear regression. The survey discusses distance functions, smoothing parameters, weighting functions, local model structures, regularization of the estimates and bias, ass ..."
Abstract

Cited by 448 (52 self)
This paper surveys locally weighted learning, a form of lazy learning and memory-based learning, and focuses on locally weighted linear regression. The survey discusses distance functions, smoothing parameters, weighting functions, local model structures, regularization of the estimates and bias, assessing predictions, handling noisy data and outliers, improving the quality of predictions by tuning fit parameters, interference between old and new data, implementing locally weighted learning efficiently, and applications of locally weighted learning. A companion paper surveys how locally weighted learning can be used in robot learning and control.
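The core computation the survey is built around, locally weighted linear regression at a query point, is compact enough to sketch directly. This is a minimal pure-Python version assuming a Gaussian kernel weight; the survey itself covers many alternative distance and weighting functions:

```python
import math

def lwlr_predict(x_query, xs, ys, bandwidth):
    """Locally weighted linear regression at a single query point:
    fit y = a + b*x by weighted least squares, with Gaussian kernel
    weights centred at x_query, then evaluate the local line there."""
    w = [math.exp(-0.5 * ((x - x_query) / bandwidth) ** 2) for x in xs]
    # accumulate the weighted normal equations for the 2x2 system (a, b)
    sw = sum(w)
    swx = sum(wi * xi for wi, xi in zip(w, xs))
    swxx = sum(wi * xi * xi for wi, xi in zip(w, xs))
    swy = sum(wi * yi for wi, yi in zip(w, ys))
    swxy = sum(wi * xi * yi for wi, xi, yi in zip(w, xs, ys))
    det = sw * swxx - swx * swx
    a = (swxx * swy - swx * swxy) / det
    b = (sw * swxy - swx * swy) / det
    return a + b * x_query

# on exactly linear data the local fit recovers y = 2x + 1 at any point
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [2.0 * x + 1.0 for x in xs]
y_hat = lwlr_predict(1.5, xs, ys, bandwidth=1.0)
```

Because the fit is redone at every query point, this is "lazy" learning: no global model is trained, and the training data must be kept around at prediction time.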
Data-driven bandwidth selection in local polynomial fitting: variable bandwidth and spatial adaptation
, 1993
"... ..."
Optimal pointwise adaptive methods in nonparametric estimation
 ANN. STATIST
, 1997
"... The problem of optimal adaptive estimation of a function at a given point from noisy data is considered. Two procedures are proved to be asymptotically optimal for different settings. First we study the problem of bandwidth selection for nonparametric pointwise kernel estimation with a given kernel. ..."
Abstract

Cited by 38 (9 self)
The problem of optimal adaptive estimation of a function at a given point from noisy data is considered. Two procedures are proved to be asymptotically optimal for different settings. First we study the problem of bandwidth selection for nonparametric pointwise kernel estimation with a given kernel. We propose a bandwidth selection procedure and prove its optimality in the asymptotic sense. Moreover, this optimality holds not only among kernel estimators with a variable bandwidth: the resulting estimator is asymptotically optimal among all feasible estimators. The important feature of this procedure is that it is fully adaptive and it "works" for a very wide class of functions obeying a mild regularity restriction. With it, the attainable accuracy of estimation depends on the function itself and is expressed in terms of the "ideal adaptive bandwidth" corresponding to this function and a given kernel. The second procedure can be considered as a specialization of the first one under the qualitative assumption that the function to be estimated belongs to some Hölder class Σ(β, L) with unknown parameters β, L. This assumption allows us to choose a family of kernels in an optimal way, and the resulting procedure appears to be asymptotically optimal in the adaptive sense in any range of adaptation with β ≤ 2.
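Pointwise bandwidth selection of this kind is usually formalized as a Lepski-type comparison scheme: accept ever-larger bandwidths while the estimate stays within a noise-level band of every estimate built with a smaller bandwidth. The sketch below is a deliberately simplified version (Gaussian kernel, a crude noise-level proxy, a factor-2 threshold), not the paper's exact construction or constants:

```python
import math

def lepski_bandwidth(x0, xs, ys, bandwidths, sigma):
    """Lepski-style pointwise bandwidth selection: keep the largest
    bandwidth h whose kernel estimate at x0 stays within a noise-level
    band of every estimate built with a smaller bandwidth."""
    def nw_estimate(h):
        # Nadaraya-Watson estimate at x0 with a Gaussian kernel
        w = [math.exp(-0.5 * ((x - x0) / h) ** 2) for x in xs]
        return sum(wi * yi for wi, yi in zip(w, ys)) / sum(w)

    def noise_level(h):
        # rough stochastic-error proxy: sigma / sqrt(effective sample size)
        w = [math.exp(-0.5 * ((x - x0) / h) ** 2) for x in xs]
        return sigma / math.sqrt(sum(w))

    hs = sorted(bandwidths)
    estimates = {h: nw_estimate(h) for h in hs}
    chosen = hs[0]
    for i, h in enumerate(hs[1:], start=1):
        compatible = all(
            abs(estimates[h] - estimates[hp]) <= 2.0 * noise_level(hp)
            for hp in hs[:i])
        if not compatible:
            break  # bias has become visible; stop enlarging the bandwidth
        chosen = h
    return chosen, estimates[chosen]

# flat signal: every bandwidth agrees, so the largest one is chosen
xs = [i / 10.0 for i in range(21)]
ys = [1.0] * len(xs)
h_star, f_hat = lepski_bandwidth(1.0, xs, ys, [0.1, 0.2, 0.4, 0.8], sigma=0.1)
```

On a smooth stretch the comparisons all pass and a wide bandwidth is selected; near a kink the wide-bandwidth estimate drifts away from the narrow-bandwidth ones and the loop stops early, which is exactly the pointwise adaptation the abstract describes.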
Local Maximum Likelihood Estimation and Inference
 J. Royal Statist. Soc. B
, 1998
"... Local maximum likelihood estimation is a nonparametric counterpart of the widelyused parametric maximum likelihood technique. It extends the scope of the parametric maximum likelihood method to a much wider class of parametric spaces. Associated with this nonparametric estimation scheme is the issu ..."
Abstract

Cited by 16 (4 self)
Local maximum likelihood estimation is a nonparametric counterpart of the widely used parametric maximum likelihood technique. It extends the scope of the parametric maximum likelihood method to a much wider class of parametric spaces. Associated with this nonparametric estimation scheme are the issues of bandwidth selection and bias and variance assessment. This article provides a unified approach to selecting a bandwidth and constructing confidence intervals in local maximum likelihood estimation. The approach is then applied to least-squares nonparametric regression and to nonparametric logistic regression. Our experience in these two settings shows that the general idea outlined here is powerful and encouraging.
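The local-likelihood idea is easiest to see in the simplest case: a local-constant Bernoulli model, where the kernel-weighted log-likelihood Σᵢ wᵢ[yᵢ log p + (1−yᵢ) log(1−p)] has the weighted success rate as its closed-form maximiser. This toy sketch is only an illustration of the principle; the paper treats general parametric families and local polynomial fits:

```python
import math

def local_bernoulli_mle(x0, xs, ys, h):
    """Local maximum likelihood for a local-constant Bernoulli model:
    the maximiser of the kernel-weighted log-likelihood at x0 is the
    kernel-weighted average of the 0/1 responses."""
    w = [math.exp(-0.5 * ((x - x0) / h) ** 2) for x in xs]
    return sum(wi * yi for wi, yi in zip(w, ys)) / sum(w)

# failures cluster on the left, successes on the right
xs = list(range(10))
ys = [0, 0, 0, 0, 0, 1, 1, 1, 1, 1]
p_left = local_bernoulli_mle(2.0, xs, ys, h=1.0)
p_right = local_bernoulli_mle(7.0, xs, ys, h=1.0)
```

The estimated success probability tracks the local behaviour of the data rather than a single global parameter, which is the point of localizing the likelihood.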
Time Inhomogeneous Multiple Volatility Modelling
, 2001
"... Price variations observed at speculative markets exhibit positive autocorrelation and cross correlation among a set of assets, stock market indices, exchange rates etc. A particular problem in investigating multivariate volatility processes arises from the high dimensionality implied by a simultaneo ..."
Abstract

Cited by 12 (2 self)
Price variations observed at speculative markets exhibit positive autocorrelation and cross-correlation among a set of assets, stock market indices, exchange rates, etc. A particular problem in investigating multivariate volatility processes arises from the high dimensionality implied by a simultaneous analysis of variances and covariances. Parametric volatility models, such as the multivariate version of the prominent GARCH model, easily become intractable for empirical work. We propose an adaptive procedure that aims to identify periods of second-order homogeneity for each moment in time. As in principal component analysis, the dimensionality problem is solved by transforming a multivariate series into a set of univariate processes. We discuss thoroughly the implementation issues which naturally arise in the framework of adaptive modelling. Theoretical and Monte Carlo results are given. The empirical performance of the new method is illustrated by an application to a bivariate exchange rate series and a 23-dimensional system of asset returns. Empirical results of the FX analysis are compared to a parametric approach, namely the multivariate GARCH model.
A Study of Variable Bandwidth Selection for Local Polynomial Regression
 Statistica Sinica
, 1996
"... A decisive question in nonparametric smoothing techniques is the choice of the bandwidth or smoothing parameter. The present paper addresses this question when using local polynomial approximations for estimating the regression function and its derivatives. A fullyautomatic bandwidth selection proc ..."
Abstract

Cited by 12 (2 self)
A decisive question in nonparametric smoothing techniques is the choice of the bandwidth or smoothing parameter. The present paper addresses this question when using local polynomial approximations for estimating the regression function and its derivatives. A fully automatic bandwidth selection procedure was proposed by Fan and Gijbels (1995), and its empirical performance was tested in detail via a variety of examples. Those experiences supported the methodology to a great extent. In this paper we establish asymptotic results for the proposed variable bandwidth selector. We provide the rate of convergence of the bandwidth estimate and obtain the asymptotic distribution of its error relative to the theoretical optimal variable bandwidth. These asymptotic properties give extra support to the developed bandwidth selection procedure. It is also demonstrated how the proposed selection method can be applied in the density estimation setup. Some examples illustrate this ap...
Dynamics of Implied Volatility Surfaces.
, 2001
"... The prices of index options at a given date are usually represented via the corresponding implied volatility surface, presenting skew/smile features and term structure which several models have attempted to reproduce. ..."
Abstract

Cited by 11 (0 self)
The prices of index options at a given date are usually represented via the corresponding implied volatility surface, presenting skew/smile features and a term structure which several models have attempted to reproduce.
On Local Smoothing Of Nonparametric Curve Estimators
 Journal of the American Statistical Association
, 1993
"... . We begin by analyzing the local adaptation properties of waveletbased curve estimators. It is argued that while wavelet methods enjoy outstanding adaptability in terms of the manner in which they capture irregular episodes in a curve, they are not nearly as adaptive when considered from the viewp ..."
Abstract

Cited by 8 (1 self)
We begin by analyzing the local adaptation properties of wavelet-based curve estimators. It is argued that while wavelet methods enjoy outstanding adaptability in terms of the manner in which they capture irregular episodes in a curve, they are not nearly as adaptive when considered from the viewpoint of tracking more subtle changes in a smooth function. We point out that while this problem may be remedied by modifying wavelet estimators, simple modifications are typically not sufficient to properly achieve adaptive smoothing of a relatively highly differentiable function. In that case, local changes to the primary level of resolution of the wavelet transform are required. While such an approach is feasible, it is not an attractive proposition on either practical or aesthetic grounds. Motivated by this difficulty, we develop local versions of familiar smoothing methods, such as cross-validation and smoothed cross-validation, in the contexts of density estimation and regression. It is...
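One way to localize cross-validation, in the spirit of this abstract, is to score each candidate bandwidth by leave-one-out residuals of a kernel regression fit, but weight each residual by a kernel centred at the point of interest so the score reflects fit quality near that point only. The sketch below is an illustrative construction, not the paper's procedure; the `locality` width is an assumption of this example:

```python
import math

def local_cv_bandwidth(x0, xs, ys, bandwidths, locality=0.5):
    """Localised leave-one-out cross-validation for a Nadaraya-Watson
    regression bandwidth: squared leave-one-out residuals are weighted
    by a Gaussian kernel centred at x0, so the selected bandwidth is
    tuned to the behaviour of the curve near x0."""
    n = len(xs)

    def loo_fit(i, h):
        # leave-one-out Nadaraya-Watson prediction at xs[i]
        w = [math.exp(-0.5 * ((xs[j] - xs[i]) / h) ** 2)
             for j in range(n) if j != i]
        y = [ys[j] for j in range(n) if j != i]
        return sum(wj * yj for wj, yj in zip(w, y)) / sum(w)

    best_h, best_score = None, float("inf")
    for h in bandwidths:
        score = 0.0
        for i in range(n):
            loc_w = math.exp(-0.5 * ((xs[i] - x0) / locality) ** 2)
            score += loc_w * (ys[i] - loo_fit(i, h)) ** 2
        if score < best_score:
            best_h, best_score = h, score
    return best_h

xs = [i / 10.0 for i in range(20)]
ys = [x * x for x in xs]
h_local = local_cv_bandwidth(0.5, xs, ys, [0.05, 0.1, 0.2])
```

Running the same selector at different x0 can return different bandwidths, which is precisely the local smoothing behaviour the authors argue wavelet methods struggle to deliver for smooth functions.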
Variable Bandwidth and One-step Local M-Estimator
 Science in China, Series A
, 1997
"... We study a robust version of local linear regression smoothers augmented with variable bandwidth. The proposed method inherits the advantages of local polynomial regression and overcomes lack of robustness of leastsquares techniques. The use of variable bandwidth enhances the flexibility of the res ..."
Abstract

Cited by 7 (3 self)
We study a robust version of local linear regression smoothers augmented with variable bandwidth. The proposed method inherits the advantages of local polynomial regression and overcomes the lack of robustness of least-squares techniques. The use of variable bandwidth enhances the flexibility of the resulting local M-estimators and makes it possible for them to cope well with spatially inhomogeneous curves, heteroscedastic errors and non-uniform design densities. Under appropriate regularity conditions, it is shown that the proposed estimators exist and are asymptotically normal. Based on the robust estimation equation, we introduce one-step local M-estimators to reduce the computational burden. It is demonstrated that the one-step local M-estimators share the same asymptotic distributions as the fully iterative M-estimators, as long as the initial estimators are good enough. In other words, the one-step local M-estimators significantly reduce the computation cost of the fully iterative M-estimators wi...
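The one-step idea can be sketched in the simplest setting: a local-constant fit with a Huber ψ-function, starting from a robust pilot estimate (the "good enough" initial estimator the abstract requires) and taking a single Newton-Raphson step on the kernel-weighted estimating equation Σᵢ wᵢ ψ(yᵢ − θ) = 0. This is only an illustration; the paper works with local linear fits and variable bandwidths:

```python
import math
from statistics import median

def one_step_local_m(x0, xs, ys, h, c=1.345):
    """One-step local M-estimator with Gaussian kernel weights and
    Huber psi: one Newton-Raphson step from a robust pilot estimate."""
    def psi(r):        # Huber psi: linear in the middle, clipped tails
        return max(-c, min(c, r))

    def dpsi(r):       # derivative of psi (zero where psi is clipped)
        return 1.0 if abs(r) <= c else 0.0

    w = [math.exp(-0.5 * ((x - x0) / h) ** 2) for x in xs]
    theta0 = median(ys)  # robust pilot ("good enough" initial estimator)
    num = sum(wi * psi(yi - theta0) for wi, yi in zip(w, ys))
    den = sum(wi * dpsi(yi - theta0) for wi, yi in zip(w, ys))
    return theta0 + num / den if den > 0 else theta0

# a gross outlier would pull a weighted mean far from the true level 1.0;
# one Huber step from the median stays close to it
xs = [0.0, 0.5, 1.0, 1.5, 2.0]
ys = [1.0, 1.0, 1.0, 1.0, 50.0]
theta = one_step_local_m(1.0, xs, ys, h=1.0)
```

A single step suffices here because, as the abstract notes, the one-step estimator inherits the asymptotic behaviour of the fully iterated M-estimator whenever the pilot is sufficiently accurate; iterating to convergence buys no first-order improvement at a much higher computational cost.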