## A REMEDY TO REGRESSION ESTIMATORS AND NONPARAMETRIC MINIMAX EFFICIENCY (1990)

Citations: 3 (3 self)

### BibTeX

```bibtex
@MISC{Fan90aremedy,
  author = {Jianqing Fan},
  title  = {A REMEDY TO REGRESSION ESTIMATORS AND NONPARAMETRIC MINIMAX EFFICIENCY},
  year   = {1990}
}
```

### Abstract

It is known that both the Nadaraya-Watson and Gasser-Müller types of regression estimators have some disadvantages. A smoothed version of local polynomial regression estimators is proposed to remedy these disadvantages. The mean squared error and mean integrated squared error are computed explicitly. It turns out that, by suitably selecting a kernel and a bandwidth, the proposed estimator has asymptotic minimax efficiency of at least 89.6%: the proposed estimator is efficient in rates and nearly efficient in constant factors. In the nonparametric regression context, the asymptotic minimax lower bound is developed via the heuristic of the "hardest one-dimensional subproblem". Explicit connections between minimax risks and the modulus of continuity are made. Normal submodels are used to avoid the technical difficulty of Le Cam's theory of convergence of experiments. The lower bound is applicable to estimating the conditional mean (regression) and conditional quantiles (including the median) for both fixed-design and random-design regression problems.
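To make the setting concrete, the following is a minimal sketch of a plain local linear (degree-1 local polynomial) regression estimator, the family the abstract refers to. It is an illustrative implementation under assumed choices (Gaussian kernel, global bandwidth `h`), not the paper's exact smoothed estimator, kernel, or bandwidth selection.

```python
import numpy as np

def local_linear(x0, x, y, h):
    """Local linear regression estimate of E[Y | X = x0].

    Solves the kernel-weighted least-squares problem
        min_{a,b} sum_i K((x_i - x0)/h) * (y_i - a - b*(x_i - x0))^2
    and returns a, the fitted value at x0. The Gaussian kernel and a
    fixed global bandwidth h are illustrative assumptions.
    """
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)           # Gaussian kernel weights
    X = np.column_stack([np.ones_like(x), x - x0])   # local design matrix
    WX = w[:, None] * X
    # weighted least squares: solve (X^T W X) beta = X^T W y
    beta = np.linalg.solve(X.T @ WX, WX.T @ y)
    return beta[0]

# usage: recover a smooth trend from noisy samples
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0.0, 1.0, 200))
y = np.sin(2 * np.pi * x) + 0.1 * rng.normal(size=200)
fit = np.array([local_linear(t, x, y, h=0.05) for t in x])
```

Unlike the Nadaraya-Watson estimator (the degree-0 case, a pure weighted average), the local linear fit also estimates a slope at each point, which reduces the boundary and design bias the abstract alludes to.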