Results 1–10 of 17
Functional additive models
 J Am Stat Assoc
Abstract

Cited by 29 (8 self)
In commonly used functional regression models, the regression of a scalar or functional response on the functional predictor is assumed to be linear. This means the response is a linear function of the functional principal component scores of the predictor process. We relax the linearity assumption and propose to replace it by an additive structure. This leads to a more widely applicable and much more flexible framework for functional regression models. The proposed functional additive regression models are suitable for both scalar and functional responses. The regularization needed for effective estimation of the regression parameter function is implemented through a projection on the eigenbasis of the covariance operator of the functional components in the model. The utilization of functional principal components in an additive rather than linear way leads to substantial broadening of the scope of functional regression models and emerges as a natural approach, as the uncorrelatedness of the functional principal components is shown to lead to a straightforward implementation of the functional additive model, just based on a sequence of one-dimensional smoothing steps and without need for backfitting. This facilitates the theoretical analysis, and we establish asymptotic
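As an illustration of the strategy this abstract describes (projection on functional principal components followed by one one-dimensional smoothing step per component, with no backfitting), the following sketch fits a functional additive model to simulated data. The simulated curves, the Nadaraya–Watson smoother, and the bandwidths are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data: n predictor curves on a grid, scalar responses depending
# additively and nonlinearly on the first two FPC scores.
n, p = 400, 50
t = np.linspace(0, 1, p)
xi1 = rng.normal(0, 2, n)                      # score on 1st eigenfunction
xi2 = rng.normal(0, 1, n)                      # score on 2nd eigenfunction
X = (np.sqrt(2) * xi1[:, None] * np.sin(np.pi * t)
     + np.sqrt(2) * xi2[:, None] * np.sin(2 * np.pi * t))
y = np.sin(xi1) + xi2 ** 2 + rng.normal(0, 0.1, n)

# Step 1: estimated FPC scores via the eigenbasis of the sample covariance.
Xc = X - X.mean(axis=0)
_, vecs = np.linalg.eigh(Xc.T @ Xc / n)
scores = Xc @ vecs[:, ::-1][:, :2]             # n x 2, leading components

def nw_smooth(x, z, h):
    """Nadaraya-Watson smoother (Gaussian kernel), fitted at the data points."""
    K = np.exp(-0.5 * ((x[:, None] - x[None, :]) / h) ** 2)
    return (K @ z) / K.sum(axis=1)

# Step 2: one 1-d smoothing step per component; no backfitting is needed
# because the FPC scores are uncorrelated.
y_hat = y.mean() + sum(
    nw_smooth(scores[:, k], y - y.mean(), h=0.2 * scores[:, k].std())
    for k in range(2)
)
r2 = 1 - np.var(y - y_hat) / np.var(y)
```

The key point the sketch makes concrete: because the score components are uncorrelated, each additive component can be recovered by a single marginal smooth of the centered response, with no iteration between components.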
Rate-Optimal Estimation for a General Class of Nonparametric Regression Models with Unknown Link Functions
 ANNALS OF STATISTICS
, 2007
Abstract

Cited by 13 (2 self)
This paper discusses a nonparametric regression model that naturally generalizes neural network models. The model is based on a finite number of one-dimensional transformations and can be estimated with a one-dimensional rate of convergence. The model contains the generalized additive model with unknown link function as a special case. For this case, it is shown that the additive components and link function can be estimated with the optimal rate by a smoothing spline that is the solution of a penalized least squares criterion.
Additive isotone regression
 In: Asymptotics: Particles, Processes and Inverse Problems: Festschrift for Piet Groeneboom. IMS
, 2007
Abstract

Cited by 11 (0 self)
This paper is dedicated to Piet Groeneboom on the occasion of his 65th birthday. This paper is about optimal estimation of the additive components of a nonparametric, additive isotone regression model. It is shown that, asymptotically up to first order, each additive component can be estimated as well as it could be by a least squares estimator if the other components were known. The algorithm for the calculation of the estimator uses backfitting. Convergence of the algorithm is shown. Finite-sample properties are also compared through simulation experiments.
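The estimator the abstract describes combines backfitting with isotonic least squares for each component. A minimal sketch, using the standard pool-adjacent-violators algorithm (PAVA) for the component fits and simulated data with two monotone components (all choices here are illustrative, not taken from the paper):

```python
import numpy as np

def pava(z):
    """Pool-adjacent-violators: least-squares nondecreasing fit to z."""
    blocks = []                                   # (mean, count) pairs
    for v in np.asarray(z, dtype=float):
        mean, cnt = v, 1
        while blocks and blocks[-1][0] >= mean:   # pool adjacent violators
            m0, c0 = blocks.pop()
            mean = (m0 * c0 + mean * cnt) / (c0 + cnt)
            cnt += c0
        blocks.append((mean, cnt))
    return np.concatenate([np.full(c, m) for m, c in blocks])

rng = np.random.default_rng(1)
n = 300
x1, x2 = rng.uniform(-1, 1, n), rng.uniform(-1, 1, n)
y = x1 ** 3 + np.exp(x2) + rng.normal(0, 0.1, n)  # both components monotone

mu = y.mean()
f1, f2 = np.zeros(n), np.zeros(n)
for _ in range(25):                               # backfitting loop
    o = np.argsort(x1)
    f1[o] = pava((y - mu - f2)[o]); f1 -= f1.mean()
    o = np.argsort(x2)
    f2[o] = pava((y - mu - f1)[o]); f2 -= f2.mean()
```

Each backfitting sweep fits the partial residuals of one covariate, sorted by that covariate, with an isotonic least squares step; centering after each step fixes the identification of the additive components.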
Smooth backfitting in generalized additive models
, 2007
Abstract

Cited by 9 (6 self)
Generalized additive models have been popular among statisticians and data analysts in multivariate nonparametric regression with non-Gaussian responses including binary and count data. In this paper, a new likelihood approach for fitting generalized additive models is proposed. It aims to maximize a smoothed likelihood. The additive functions are estimated by solving a system of nonlinear integral equations. An iterative algorithm based on smooth backfitting is developed from the Newton–Kantorovich theorem. Asymptotic properties of the estimator and convergence of the algorithm are discussed. It is shown that our proposal based on local linear fit achieves the same bias and variance as the oracle estimator that uses knowledge of the other components. Numerical comparison with the recently proposed two-stage estimator [Ann. Statist. 32 (2004) 2412–2443] is also made.
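For a flavor of generalized additive fitting with binary responses, the following sketch uses classic local scoring (an outer IRLS loop around an inner weighted backfitting loop) rather than the smoothed-likelihood / Newton–Kantorovich scheme the paper itself develops. Data, bandwidths, and iteration counts are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 500
x1, x2 = rng.uniform(-2, 2, n), rng.uniform(-2, 2, n)
eta_true = np.sin(x1) + 0.5 * x2
y = rng.binomial(1, 1 / (1 + np.exp(-eta_true)))  # logistic additive model

def wnw(x, z, w, h=0.4):
    """Weighted Nadaraya-Watson fit of z on x at the data points."""
    K = np.exp(-0.5 * ((x[:, None] - x[None, :]) / h) ** 2) * w[None, :]
    return (K @ z) / K.sum(axis=1)

alpha, f1, f2 = 0.0, np.zeros(n), np.zeros(n)
for _ in range(10):                               # outer IRLS loop
    eta = alpha + f1 + f2
    prob = 1 / (1 + np.exp(-eta))
    w = np.maximum(prob * (1 - prob), 1e-6)       # IRLS weights
    z = eta + (y - prob) / w                      # working response
    for _ in range(5):                            # inner weighted backfitting
        alpha = np.average(z - f1 - f2, weights=w)
        f1 = wnw(x1, z - alpha - f2, w); f1 -= np.average(f1, weights=w)
        f2 = wnw(x2, z - alpha - f1, w); f2 -= np.average(f2, weights=w)
eta_hat = alpha + f1 + f2
```

The outer loop linearizes the likelihood around the current fit exactly as in parametric IRLS; only the inner weighted-least-squares step is replaced by smoothing.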
A SIMPLE SMOOTH BACKFITTING METHOD FOR ADDITIVE MODELS
, 2007
Abstract

Cited by 8 (5 self)
In this paper a new smooth backfitting estimate is proposed for additive regression models. The estimate has the simple structure of Nadaraya–Watson smooth backfitting but at the same time achieves the oracle property of local linear smooth backfitting. Each component is estimated with the same asymptotic accuracy as if the other components were known.
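The oracle property claimed in the abstract can be illustrated numerically. The sketch below uses ordinary backfitting with Nadaraya–Watson smoothers, a simplification of the smooth backfitting studied in the paper, and compares the backfitted first component with an "oracle" fit computed as if the second component were known exactly; the data and bandwidth are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 400
x1, x2 = rng.uniform(0, 1, n), rng.uniform(0, 1, n)
f1_true = np.sin(2 * np.pi * x1)
f2_true = (x2 - 0.5) ** 2
y = f1_true + f2_true + rng.normal(0, 0.2, n)

def nw(x, z, h=0.08):
    """Nadaraya-Watson smoother (Gaussian kernel) at the data points."""
    K = np.exp(-0.5 * ((x[:, None] - x[None, :]) / h) ** 2)
    return (K @ z) / K.sum(axis=1)

mu = y.mean()
f1, f2 = np.zeros(n), np.zeros(n)
for _ in range(20):                   # backfitting iterations
    f1 = nw(x1, y - mu - f2); f1 -= f1.mean()
    f2 = nw(x2, y - mu - f1); f2 -= f2.mean()

# Oracle estimate of the first component: smooth y with the *true* second
# component removed, as if it were known.
f1_oracle = nw(x1, y - mu - (f2_true - f2_true.mean()))
f1_oracle -= f1_oracle.mean()
gap = np.mean((f1 - f1_oracle) ** 2)  # small: backfit tracks the oracle
```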
Continuously additive models for nonlinear functional regression
 Biometrika
, 2012
Abstract

Cited by 5 (0 self)
We introduce continuously additive models, which can be motivated as extensions of additive regression models with vector predictors to the case of infinite-dimensional predictors. This approach provides a class of flexible functional nonlinear regression models, where random predictor curves are coupled with scalar responses. In continuously additive modeling, integrals taken over a smooth surface along graphs of predictor functions relate the predictors to the responses in a nonlinear fashion. We use tensor product basis expansions to fit the smooth regression surface that characterizes the model. In a theoretical investigation, we show that the predictions obtained from fitting continuously additive models are consistent and asymptotically normal. We also consider extensions to generalized responses. The proposed approach outperforms existing functional regression models in simulations and data illustrations.
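A continuously additive model relates a scalar response to a predictor curve through an integral of a smooth surface, E[Y | X] = ∫ g(t, X(t)) dt, with g expanded in a tensor product basis. The sketch below fits such a model by least squares on simulated data, using Gaussian-bump bases as a stand-in for the B-splines one would typically use; all data and basis choices are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(4)
n, p = 300, 40
t = np.linspace(0, 1, p)
coef = rng.normal(0, 1, (n, 3))
X = coef @ np.vstack([np.ones(p), np.sin(np.pi * t), np.cos(np.pi * t)])
# True regression surface g(t, x) = t * x^2; the response is its integral
# over t (approximated by the grid average) plus noise.
y = (t * X ** 2).mean(axis=1) + rng.normal(0, 0.05, n)

def bumps(u, centers, h):
    """Gaussian-bump basis evaluated at u (stand-in for B-splines)."""
    return np.exp(-0.5 * ((u[..., None] - centers) / h) ** 2)

Bt = bumps(t, np.linspace(0, 1, 5), 0.15)             # (p, 5) basis in t
Bx = bumps(X, np.linspace(X.min(), X.max(), 7), 0.8)  # (n, p, 7) basis in x
# Tensor-product features: (1/p) * sum_t B_j(t) * C_k(X_i(t)),
# i.e. the integral of each basis surface along the graph of each curve.
Z = np.einsum('pj,npk->njk', Bt, Bx).reshape(n, -1) / p
Z = np.column_stack([np.ones(n), Z])
beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
r2 = 1 - np.var(y - Z @ beta) / np.var(y)
```

Because the surface enters linearly through its basis coefficients, fitting reduces to ordinary least squares on the integrated tensor-product features.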
Nonparametric Models in Binary Choice Fixed Effects Panel Data
Abstract

Cited by 3 (1 self)
In this paper we extend the fixed effects approach for dealing with endogeneity arising from persistent unobserved heterogeneity to nonlinear panel data models with nonparametric components. Specifically, we propose a nonparametric procedure that generalizes Chamberlain’s (1984) conditional logit approach. We develop an estimator based on nonlinear stochastic integral equations, derive its asymptotic properties, and provide an iterative algorithm to implement it. We analyze the small-sample behavior of the estimator through a Monte Carlo study, and consider the decision to retire as an illustrative application. JEL Classification: C14; C23
Nonparametric additive . . . Measured Data
, 2009
Abstract
We develop an easily computed smooth backfitting algorithm for additive model fitting in repeated measures problems. Our methodology easily copes with various settings, such as when some covariates are the same over repeated response measurements. We allow for a working covariance matrix for the regression errors, showing that our method is most efficient when the correct covariance matrix is used. The component functions achieve the known asymptotic variance lower bound for the scalar argument case. Smooth backfitting also leads directly to design-independent biases in the local linear case. Simulations show our estimator has smaller variance than the usual kernel estimator. This is also illustrated by an example from nutritional epidemiology.
Additive Models: Extensions and Related Models
Abstract
We give an overview of smooth backfitting type estimators in additive models. Moreover, we illustrate their wide applicability in models closely related to additive models, such as nonparametric regression with dependent error variables where the errors can be transformed to white noise by a linear transformation, nonparametric regression with repeatedly measured data, nonparametric panels with fixed effects, simultaneous nonparametric equation models, and non- and semiparametric autoregression and GARCH models. We also discuss extensions to varying coefficient models, additive models with missing observations, and the case of nonstationary covariates.