Results 1 - 10 of 33
The existence and asymptotic properties of a backfitting projection algorithm under weak conditions - Annals of Statistics, 1999
Nonparametric Estimation of an Additive Model with a Link Function - Annals of Statistics, 2004
Cited by 36 (9 self)
Abstract: This paper describes an estimator of the additive components of a nonparametric additive model with a known link function. When the additive components are twice continuously differentiable, the estimator is asymptotically normally distributed with a rate of convergence in probability of n^{-2/5}. This is true regardless of the (finite) dimension of the explanatory variable. Thus, in contrast to the existing asymptotically normal estimator, the new estimator has no curse of dimensionality. Moreover, the estimator has an oracle property: the asymptotic distribution of each additive component is the same as it would be if the other components were known with certainty.
Bandwidth selection for smooth backfitting in additive models - Annals of Statistics, 2005
Cited by 17 (9 self)
Abstract: The smooth backfitting introduced by Mammen, Linton and Nielsen [Ann. Statist. 27 (1999) 1443-1490] is a promising technique to fit additive regression models and is known to achieve the oracle efficiency bound. In this paper, we propose and discuss three fully automated bandwidth selection methods for smooth backfitting in additive models. The first one is a penalized least squares approach which is based on higher-order stochastic expansions for the residual sums of squares of the smooth backfitting estimates. The other two are plug-in bandwidth selectors which rely on approximations of the average squared errors and whose utility is restricted to local linear fitting. The large sample properties of these bandwidth selection methods are given, and their finite sample properties are compared through simulation experiments.
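The selectors this paper studies (penalized least squares and plug-in) are specific to smooth backfitting. As a generic illustration of what "fully automated bandwidth selection" means, here is a minimal leave-one-out cross-validation selector for a Nadaraya-Watson smoother; the Gaussian kernel, candidate grid, and all names are illustrative assumptions, not the paper's methods:

```python
import numpy as np

def loo_cv_bandwidth(x, y, bandwidths):
    """Pick the bandwidth minimizing the leave-one-out CV score
    of a Nadaraya-Watson smoother with a Gaussian kernel (illustrative)."""
    scores = []
    for h in bandwidths:
        w = np.exp(-0.5 * ((x[:, None] - x[None, :]) / h) ** 2)
        np.fill_diagonal(w, 0.0)            # leave-one-out: drop self-weight
        pred = (w @ y) / w.sum(axis=1)      # prediction at x_i without obs i
        scores.append(np.mean((y - pred) ** 2))
    return bandwidths[int(np.argmin(scores))]

# illustrative data: smooth signal plus noise
rng = np.random.default_rng(1)
x = rng.uniform(0.0, 1.0, 200)
y = np.sin(2 * np.pi * x) + 0.2 * rng.standard_normal(200)
h = loo_cv_bandwidth(x, y, [0.005, 0.05, 1.0])  # a moderate h should score best
```

The CV score penalizes both undersmoothing (tiny h, high variance) and oversmoothing (huge h, high bias), which is the same bias-variance trade-off the paper's penalized least squares criterion targets asymptotically.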
Rate-Optimal Estimation for a General Class of Nonparametric Regression Models with Unknown Link Functions - Annals of Statistics, 2007
Cited by 13 (2 self)
Abstract: This paper discusses a nonparametric regression model that naturally generalizes neural network models. The model is based on a finite number of one-dimensional transformations and can be estimated with a one-dimensional rate of convergence. The model contains the generalized additive model with unknown link function as a special case. For this case, it is shown that the additive components and the link function can be estimated with the optimal rate by a smoothing spline that is the solution of a penalized least squares criterion.
A Root-N Consistent Backfitting Estimator for Semiparametric Additive Modelling, 1999
Cited by 13 (0 self)
Abstract: We explore additive models that combine both parametric and nonparametric terms and propose a √n-consistent backfitting estimator for the parametric component of the model. The theoretical properties of the estimator are developed for the case with a single nonparametric term and extended to an arbitrary number of nonparametric additive terms. An estimator for the optimal bandwidth making minimal use of asymptotic expressions for bias and variance is proposed, and a fast implementation algorithm for model fitting and bandwidth selection is developed. The practical behavior of the estimator and bandwidth selection is illustrated by simulation experiments. Key Words: local polynomial regression, bandwidth selection, EBBS, partially linear model. Additive models are a popular and flexible class of nonparametric regression methods (Hastie and Tibshirani (1990)), which assume that the conditional mean function can be represented as E(Y | Z_1, ..., Z_D) = m_1(Z_1) + ...
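Additive models of the form E(Y | Z_1, ..., Z_D) = m_1(Z_1) + ... + m_D(Z_D) are classically fit by backfitting: cycle over components, smoothing the partial residuals for each one. A minimal sketch using a Nadaraya-Watson smoother (the Gaussian kernel, bandwidth, and iteration count are illustrative assumptions; this is textbook backfitting, not the paper's √n-consistent semiparametric estimator):

```python
import numpy as np

def nw_smooth(x, y, grid, h):
    # Nadaraya-Watson smoother with a Gaussian kernel (illustrative choice)
    w = np.exp(-0.5 * ((grid[:, None] - x[None, :]) / h) ** 2)
    return (w @ y) / w.sum(axis=1)

def backfit(Z, y, h=0.2, n_iter=20):
    """Classical backfitting for E[Y|Z] = mean + sum_d m_d(Z_d).
    Each m_d is refit to the partial residuals and centered so the
    decomposition is identified (only the overall mean carries the level)."""
    n, D = Z.shape
    mean = y.mean()
    m = np.zeros((n, D))                  # component values at observed points
    for _ in range(n_iter):
        for d in range(D):
            # partial residuals: remove all components except the d-th
            resid = y - mean - m.sum(axis=1) + m[:, d]
            fit = nw_smooth(Z[:, d], resid, Z[:, d], h)
            m[:, d] = fit - fit.mean()    # centering for identifiability
    return mean, m

# simulated additive data: two smooth components plus noise
rng = np.random.default_rng(0)
Z = rng.uniform(-1.0, 1.0, size=(400, 2))
y = np.sin(np.pi * Z[:, 0]) + Z[:, 1] ** 2 + 0.1 * rng.standard_normal(400)
mean, m = backfit(Z, y)
```

Each inner loop solves a one-dimensional smoothing problem, which is why additive models escape the curse of dimensionality that plagues fully unrestricted multivariate regression.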
Additive isotone regression - In: Asymptotics: Particles, Processes and Inverse Problems: Festschrift for Piet Groeneboom. IMS, 2007
Cited by 11 (0 self)
This paper is dedicated to Piet Groeneboom on the occasion of his 65th birthday. Abstract: This paper is about optimal estimation of the additive components of a nonparametric, additive isotone regression model. It is shown that, asymptotically up to first order, each additive component can be estimated as well as it could be by a least squares estimator if the other components were known. The algorithm for the calculation of the estimator uses backfitting, and convergence of the algorithm is shown. Finite sample properties are also compared through simulation experiments.
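The backfitting cycle for an additive isotone model fits each component to partial residuals under a monotonicity constraint. That inner step is the classical isotonic least squares fit, computed by the pool-adjacent-violators algorithm (PAVA); the sketch below shows that textbook routine only, not the paper's full estimator:

```python
def pava(y):
    """Pool Adjacent Violators Algorithm: the nondecreasing sequence
    minimizing sum (y_i - f_i)^2, by merging violating blocks into
    their weighted mean."""
    vals, wts = [], []
    for v in map(float, y):
        vals.append(v)
        wts.append(1.0)
        # merge backwards while the monotonicity constraint is violated
        while len(vals) > 1 and vals[-2] > vals[-1]:
            w = wts[-2] + wts[-1]
            pooled = (wts[-2] * vals[-2] + wts[-1] * vals[-1]) / w
            vals[-2:] = [pooled]
            wts[-2:] = [w]
    # expand the blocks back to one fitted value per observation
    out = []
    for v, w in zip(vals, wts):
        out.extend([v] * int(w))
    return out

fit = pava([3, 1, 2, 4])  # -> [2.0, 2.0, 2.0, 4.0]
```

PAVA runs in linear time, so embedding it in a backfitting loop keeps the whole additive isotone fit cheap.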
Smooth backfitting in generalized additive models, 2007
Cited by 9 (6 self)
Abstract: Generalized additive models have been popular among statisticians and data analysts for multivariate nonparametric regression with non-Gaussian responses, including binary and count data. In this paper, a new likelihood approach for fitting generalized additive models is proposed. It aims to maximize a smoothed likelihood. The additive functions are estimated by solving a system of nonlinear integral equations. An iterative algorithm based on smooth backfitting is developed from the Newton-Kantorovich theorem. Asymptotic properties of the estimator and convergence of the algorithm are discussed. It is shown that our proposal based on local linear fit achieves the same bias and variance as the oracle estimator that uses knowledge of the other components. Numerical comparison with the recently proposed two-stage estimator [Ann. Statist. 32 (2004) 2412-2443] is also made.
A Simple Smooth Backfitting Method for Additive Models, 2007
Cited by 8 (5 self)
Abstract: In this paper a new smooth backfitting estimate is proposed for additive regression models. The estimate has the simple structure of Nadaraya-Watson smooth backfitting but at the same time achieves the oracle property of local linear smooth backfitting: each component is estimated with the same asymptotic accuracy as if the other components were known.