Results 1 - 6 of 6
New Support Vector Algorithms, 2000
"... this article with the regression case. To explain this, we will introduce a suitable definition of a margin that is maximized in both cases ..."
Abstract

Cited by 322 (45 self)
... this article with the regression case. To explain this, we will introduce a suitable definition of a margin that is maximized in both cases ...
Variance Estimation in Nonparametric Regression via the Difference Sequence Method, Ann. Statist., 2006
"... Consider a Gaussian nonparametric regression problem having both an unknown mean function and unknown variance function. This article presents a class of differencebased kernel estimators for the variance function. Optimal convergence rates that are uniform over broad functional classes and bandwid ..."
Abstract

Cited by 14 (5 self)
Consider a Gaussian nonparametric regression problem having both an unknown mean function and an unknown variance function. This article presents a class of difference-based kernel estimators for the variance function. Optimal convergence rates that are uniform over broad functional classes and bandwidths are fully characterized, and asymptotic normality is also established. We also show that for suitable asymptotic formulations our estimators achieve the minimax rate.
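The difference-sequence idea can be illustrated with its simplest member: since (Y_{i+1} - Y_i)^2 / 2 has expectation close to sigma^2(x_i) when the mean function is smooth, kernel-smoothing these pseudo-observations recovers the variance function. The sketch below is a minimal first-order illustration under assumed Gaussian-kernel smoothing, not the paper's general class of estimators; the bandwidth and kernel are illustrative choices.

```python
import numpy as np

def difference_based_variance(x, y, h):
    """Estimate sigma^2(x) by kernel-smoothing squared first differences.

    A first-order sketch of the difference-sequence method: the mean
    function largely cancels in Y_{i+1} - Y_i, so the squared half-
    differences are noisy observations of the variance function.
    """
    order = np.argsort(x)
    xs, ys = x[order], y[order]
    d = 0.5 * (ys[1:] - ys[:-1]) ** 2          # pseudo-observations of sigma^2
    xm = 0.5 * (xs[1:] + xs[:-1])              # their locations
    # Nadaraya-Watson smooth with a Gaussian kernel of bandwidth h
    w = np.exp(-0.5 * ((x[:, None] - xm[None, :]) / h) ** 2)
    return (w * d).sum(axis=1) / w.sum(axis=1)

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 400)
sigma = 0.2 + 0.3 * x                          # true standard-deviation function
y = np.sin(2 * np.pi * x) + sigma * rng.standard_normal(x.size)
est = difference_based_variance(x, y, h=0.1)   # estimates sigma(x)**2
```

With 400 points the estimate tracks the true variance function (0.2 + 0.3x)^2 reasonably well away from the boundaries.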
Estimating residual variance in nonparametric regression using least squares, Biometrika 92: 821-830, 2005
"... We propose a new estimator for the error variance in a nonparametric regression model. We estimate the error variance as the intercept in a simple linear regression model with squared differences of paired observations as the dependent variable and squared distances between the paired covariates as ..."
Abstract

Cited by 8 (4 self)
We propose a new estimator for the error variance in a nonparametric regression model. We estimate the error variance as the intercept in a simple linear regression model with squared differences of paired observations as the dependent variable and squared distances between the paired covariates as the regressor. Our method can be applied to nonparametric regression models with multivariate functions defined on arbitrary subsets of normed spaces, possibly observed at unequally spaced or clustered design points. No ordering is required for our method. We develop methods for selecting the bandwidth. For the special case of a one-dimensional domain with equally spaced design points, we show that our method achieves an asymptotically optimal rate that is not achieved by some existing methods. We conduct extensive simulations to evaluate the finite-sample performance of our method and compare it with existing methods. We illustrate our method using a real data set.
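The intercept trick described above is easy to sketch: for a pair (i, j), E[(Y_i - Y_j)^2 / 2] is approximately sigma^2 plus a term proportional to |x_i - x_j|^2 when the mean is smooth, so a least-squares fit of the squared half-differences on squared covariate distances has intercept approximately sigma^2. The pair-selection cutoff below is an illustrative choice, not the paper's bandwidth-selection procedure.

```python
import numpy as np

def ls_variance(x, y, max_pair_dist=0.1):
    """Error-variance estimate as the intercept of a pairwise regression.

    Regress s_ij = (Y_i - Y_j)^2 / 2 on d_ij = (x_i - x_j)^2 over
    nearby pairs; the fitted intercept estimates sigma^2 because the
    mean-function contribution vanishes as d_ij -> 0.
    """
    i, j = np.triu_indices(len(x), k=1)        # all unordered pairs
    d2 = (x[i] - x[j]) ** 2
    keep = d2 < max_pair_dist ** 2             # only nearby pairs
    s = 0.5 * (y[i][keep] - y[j][keep]) ** 2
    X = np.column_stack([np.ones(keep.sum()), d2[keep]])
    beta, *_ = np.linalg.lstsq(X, s, rcond=None)
    return beta[0]                             # intercept = sigma^2 estimate

rng = np.random.default_rng(1)
x = rng.uniform(0, 1, 300)                     # unordered design, as the paper allows
y = np.cos(3 * x) + 0.5 * rng.standard_normal(300)   # true sigma^2 = 0.25
sigma2_hat = ls_variance(x, y)
```

Note that no sorting of the design points is needed, which is the feature the abstract emphasizes for clustered or multivariate designs.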
Testing For Monotonicity Of A Regression Mean Without Selecting A Bandwidth, 1998
"... . A new approach to testing for monotonicity of a regression mean, not requiring computation of a curve estimator or a bandwidth, is suggested. It is based on the notion of `running gradients' over short intervals, although from some viewpoints it may be regarded as an analogue for monotonicity test ..."
Abstract

Cited by 4 (3 self)
A new approach to testing for monotonicity of a regression mean, not requiring computation of a curve estimator or a bandwidth, is suggested. It is based on the notion of "running gradients" over short intervals, although from some viewpoints it may be regarded as an analogue, for monotonicity testing, of the dip/excess-mass approach to testing modality hypotheses about densities. Like the latter methods, the new technique does not suffer difficulties caused by almost-flat parts of the target function. In fact, it is calibrated so as to work well for flat response curves, and as a result it has relatively good power properties in boundary cases where the curve exhibits shoulders. In this respect, as well as in its construction, the "running gradients" approach differs from alternative techniques based on the notion of a critical bandwidth. KEYWORDS: Bootstrap, calibration, curve estimation, Monte Carlo, response curve, running gradient. SHORT TITLE: Testing for monotonicity. ...
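A minimal sketch of the running-gradients idea: fit a least-squares slope on every window of m consecutive observations and take the worst (most negative) standardized slope as the test statistic, calibrated by Monte Carlo against a flat response curve with matched noise level. The window length, standardization, and calibration below are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

def running_gradient_stat(y, m):
    """Max standardized downward slope over all windows of m consecutive points."""
    n = len(y)
    t = np.arange(m, dtype=float)
    t -= t.mean()                              # centered time index
    denom = (t ** 2).sum()
    slopes = np.array([t @ y[i:i + m] / denom for i in range(n - m + 1)])
    return -slopes.min() * np.sqrt(denom)      # large => strong local descent

def monotonicity_pvalue(y, m=20, nsim=500, seed=0):
    """Calibrate against a flat mean with matched noise (Monte Carlo)."""
    rng = np.random.default_rng(seed)
    sigma = np.sqrt(0.5 * np.mean(np.diff(y) ** 2))   # difference-based sigma
    stat = running_gradient_stat(y, m)
    null = np.array([
        running_gradient_stat(sigma * rng.standard_normal(len(y)), m)
        for _ in range(nsim)
    ])
    return np.mean(null >= stat)

rng = np.random.default_rng(2)
x = np.linspace(0, 1, 200)
p_mono = monotonicity_pvalue(x + 0.1 * rng.standard_normal(200))   # increasing mean
p_dip = monotonicity_pvalue(np.sin(2 * np.pi * x)
                            + 0.1 * rng.standard_normal(200))      # non-monotone mean
```

Calibrating against a flat curve is exactly the boundary case the abstract highlights: a flat mean is the hardest monotone curve to distinguish from a slightly non-monotone one.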
Construction of Automatic Confidence Intervals in Nonparametric Heteroscedastic Regression by a Moment-Oriented Bootstrap, 1997
"... We construct pointwise confidence intervals for regression functions. The method uses nonparametric kernel estimates and the "momentoriented" bootstrap method of Bunke which is a wild bootstrap based on smoothed local estimators of higher order error moments. We show that our bootstrap consistently ..."
Abstract

Cited by 1 (1 self)
We construct pointwise confidence intervals for regression functions. The method uses nonparametric kernel estimates and the "moment-oriented" bootstrap method of Bunke, which is a wild bootstrap based on smoothed local estimators of higher-order error moments. We show that our bootstrap consistently estimates the distribution of m̂_h(x₀) − m(x₀). In the present paper we focus on fully data-driven procedures and prove that the confidence intervals give asymptotically correct coverage probabilities. 1 Introduction. We consider the nonparametric regression model Y_i = m(x_i) + ε_i, 1 ≤ i ≤ n, (1.1) where the errors ε_i are independent, but not necessarily identically distributed, random variables with zero mean and finite central moments μ₂(x_i), μ₃(x_i) and μ₄(x_i). The nonrandom design points x₁ < ⋯ < x_n are assumed to be equally spaced on the unit interval [0, 1]. We aim at defining a confidence interval for the value m(x₀) of the regression fun...
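For context, a pointwise wild-bootstrap confidence interval for m(x₀) under model (1.1) can be sketched as follows. This is the classical Wu-style wild bootstrap with sign-flip multipliers (which preserve the heteroscedastic second moments), not Bunke's moment-oriented variant; the kernel, bandwidth, and percentile construction are illustrative choices.

```python
import numpy as np

def nw(x_eval, x, y, h):
    """Nadaraya-Watson kernel regression estimate with a Gaussian kernel."""
    w = np.exp(-0.5 * ((x_eval[:, None] - x[None, :]) / h) ** 2)
    return (w * y).sum(axis=1) / w.sum(axis=1)

def wild_bootstrap_ci(x, y, x0, h, B=500, alpha=0.1, seed=0):
    """Pointwise wild-bootstrap confidence interval for m(x0).

    Resample residuals by random sign flips (valid under
    heteroscedasticity), refit the kernel estimate, and take
    percentiles of the bootstrap estimates at x0.
    """
    rng = np.random.default_rng(seed)
    mhat = nw(x, x, y, h)                      # pilot fit on the design points
    resid = y - mhat
    m0 = nw(np.array([x0]), x, y, h)[0]
    boot = np.empty(B)
    for b in range(B):
        ystar = mhat + resid * rng.choice([-1.0, 1.0], size=len(y))
        boot[b] = nw(np.array([x0]), x, ystar, h)[0]
    lo, hi = np.quantile(boot, [alpha / 2, 1 - alpha / 2])
    return m0, lo, hi

rng = np.random.default_rng(3)
x = np.linspace(0, 1, 200)                     # equally spaced design on [0, 1]
y = np.sin(2 * np.pi * x) + (0.1 + 0.2 * x) * rng.standard_normal(200)
m0, lo, hi = wild_bootstrap_ci(x, y, x0=0.5, h=0.05)
```

The moment-oriented refinement discussed in these two papers replaces the sign-flip multipliers with draws matched to smoothed local estimates of μ₂, μ₃ and μ₄.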
Wild Bootstrap Versus Moment-Oriented Bootstrap, 1997
"... We investigate the relative merits of a "momentoriented" bootstrap method of Bunke (1997) in comparison with the classical wild bootstrap of Wu (1986) in nonparametric heteroscedastic regression situations. The "momentoriented" bootstrap is a wild bootstrap based on local estimators of higher orde ..."
Abstract
We investigate the relative merits of the "moment-oriented" bootstrap method of Bunke (1997) in comparison with the classical wild bootstrap of Wu (1986) in nonparametric heteroscedastic regression situations. The "moment-oriented" bootstrap is a wild bootstrap based on local estimators of higher-order error moments that are smoothed by kernel smoothers. In this paper we perform an asymptotic comparison of these two bootstrap procedures. We show that the moment-oriented bootstrap is in no case worse than the wild bootstrap. We consider the cases of bandwidths with MISE-optimal rates and of bandwidths with rates that yield an optimal bootstrap approximation. When the regression function has the same amount of smoothness as the second- and third-order error moments, it turns out that, in the former case, our method better approximates the distribution of the pivotal statistic than the usual wild bootstrap does. The reason for this behavior is the unavoidable bias in nonp...