Bandwidth selection for smooth backfitting in additive models. (2005)

by E Mammen, B U Park
Venue: Ann. Statist.
Results 1 - 10 of 17

Functional additive models

by Hans-Georg Müller - J Am Stat Assoc
Abstract - Cited by 29 (8 self)
In commonly used functional regression models, the regression of a scalar or functional response on the functional predictor is assumed to be linear. This means the response is a linear function of the functional principal component scores of the predictor process. We relax the linearity assumption and propose to replace it by an additive structure. This leads to a more widely applicable and much more flexible framework for functional regression models. The proposed functional additive regression models are suitable for both scalar and functional responses. The regularization needed for effective estimation of the regression parameter function is implemented through a projection on the eigenbasis of the covariance operator of the functional components in the model. The utilization of functional principal components in an additive rather than linear way leads to substantial broadening of the scope of functional regression models and emerges as a natural approach, as the uncorrelatedness of the functional principal components is shown to lead to a straightforward implementation of the functional additive model, just based on a sequence of one-dimensional smoothing steps and without need for backfitting. This facilitates the theoretical analysis, and we establish asymptotic
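Since the abstract reduces the functional additive model to one-dimensional smoothing against uncorrelated FPC scores, a minimal sketch of the score-extraction step may help. The function name and the discretized-covariance setup are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def fpc_scores(curves, n_comp=2):
    """First n_comp functional principal component scores of densely
    observed curves (rows of `curves`), via the discretized covariance."""
    centered = curves - curves.mean(axis=0)
    cov = centered.T @ centered / len(curves)   # discretized covariance operator
    _, vecs = np.linalg.eigh(cov)               # eigenvalues in ascending order
    basis = vecs[:, ::-1][:, :n_comp]           # leading eigenfunctions
    return centered @ basis                     # scores = projections onto basis
```

Each column of the returned score matrix would then be smoothed one-dimensionally against the response, with no backfitting loop required.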

Citation Context

...le implementations of the functional additive model. While complex iterative procedures are required to fit a regular additive model (backfitting and variants, see, e.g., Hastie and Tibshirani, 1990; Mammen and Park, 2005), representations (10), (11) motivate a straightforward estimation scheme to recover the component functions f_k, respectively f_km, by a series of one-dimensional smoothing steps. This not only leads ...
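The "complex iterative procedures" contrasted here are the classical backfitting cycle: each component is re-estimated by smoothing the partial residuals against its own covariate until the fits stabilize. A minimal sketch, assuming a Gaussian Nadaraya–Watson smoother and hypothetical names (not any cited paper's code):

```python
import numpy as np

def nw_smooth(x, y, grid, h):
    """Nadaraya-Watson kernel smoother with a Gaussian kernel."""
    w = np.exp(-0.5 * ((grid[:, None] - x[None, :]) / h) ** 2)
    return (w @ y) / w.sum(axis=1)

def backfit(X, y, h=0.3, n_iter=20):
    """Classical backfitting for an additive model y = c + sum_j f_j(X_j) + eps.

    Each f_j is updated by smoothing the partial residuals against the j-th
    covariate; centering keeps the components identifiable."""
    n, d = X.shape
    c = y.mean()
    f = np.zeros((n, d))                              # component values at the data
    for _ in range(n_iter):
        for j in range(d):
            resid = y - c - f.sum(axis=1) + f[:, j]   # partial residuals
            f[:, j] = nw_smooth(X[:, j], resid, X[:, j], h)
            f[:, j] -= f[:, j].mean()                 # identifiability: mean-zero f_j
    return c, f
```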

Rate-Optimal Estimation for a General Class of Nonparametric Regression Models with Unknown Link Functions

by Joel L. Horowitz, Enno Mammen - ANNALS OF STATISTICS, 2007
Abstract - Cited by 13 (2 self)
This paper discusses a nonparametric regression model that naturally generalizes neural network models. The model is based on a finite number of one-dimensional transformations and can be estimated with a one-dimensional rate of convergence. The model contains the generalized additive model with unknown link function as a special case. For this case, it is shown that the additive components and link function can be estimated with the optimal rate by a smoothing spline that is the solution of a penalized least squares criterion.

Citation Context

... a modification of backfitting that works more reliably in the case of many components and irregular design and that allows a complete asymptotic theory. Nielsen and Sperlich [31] and Mammen and Park [24, 25] discuss practical implementation of smooth backfitting. Tjøstheim and Auestad [38], Linton and Nielsen [21] and Fan, Härdle and Mammen [9] discuss marginal integration estimators. See Christopeit and...

Additive isotone regression

by Enno Mammen, Kyusang Yu, Universität Mannheim - In: Asymptotics: Particles, Processes and Inverse Problems: Festschrift for Piet Groeneboom. IMS, 2007
Abstract - Cited by 11 (0 self)
This paper is dedicated to Piet Groeneboom on the occasion of his 65th birthday. This paper is about optimal estimation of the additive components of a nonparametric, additive isotone regression model. It is shown that, asymptotically up to first order, each additive component can be estimated as well as it could be by a least squares estimator if the other components were known. The algorithm for the calculation of the estimator uses backfitting. Convergence of the algorithm is shown. Finite sample properties are also compared through simulation experiments.
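Inside such a backfitting loop, each monotone component update reduces to an isotonic least squares fit of the partial residuals, computable by the pool-adjacent-violators algorithm (PAVA). A minimal unweighted sketch, not the authors' implementation:

```python
def pava(y):
    """Pool-adjacent-violators: least-squares nondecreasing fit to the
    sequence y, merging adjacent blocks whose means violate monotonicity."""
    means, weights = [], []
    for v in y:
        means.append(float(v))
        weights.append(1)
        # merge backwards while the last two block means are out of order
        while len(means) > 1 and means[-2] > means[-1]:
            w = weights[-2] + weights[-1]
            m = (means[-2] * weights[-2] + means[-1] * weights[-1]) / w
            means.pop(); weights.pop()
            means[-1], weights[-1] = m, w
    fit = []
    for m, w in zip(means, weights):
        fit.extend([m] * w)
    return fit

# pava([1.0, 3.0, 2.0, 4.0]) pools the violating pair (3, 2) into 2.5
```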

Citation Context

... [34] and Opsomer [35] for the classical backfitting and in Mammen, Linton and Nielsen [28] for the smooth backfitting. Bandwidth choice and practical implementations are discussed in Mammen and Park [29, 30] and Nielsen and Sperlich [33]. The basic difference between smooth backfitting and backfitting lies in the fact that smooth backfitting is based on a smoothed least squares criterion whereas in the c...

Smooth backfitting in generalized additive models

by Kyusang Yu, Byeong U. Park, Enno Mammen , 2007
Abstract - Cited by 9 (6 self)
Generalized additive models have been popular among statisticians and data analysts in multivariate nonparametric regression with non-Gaussian responses including binary and count data. In this paper, a new likelihood approach for fitting generalized additive models is proposed. It aims to maximize a smoothed likelihood. The additive functions are estimated by solving a system of nonlinear integral equations. An iterative algorithm based on smooth backfitting is developed from the Newton–Kantorovich theorem. Asymptotic properties of the estimator and convergence of the algorithm are discussed. It is shown that our proposal based on local linear fit achieves the same bias and variance as the oracle estimator that uses knowledge of the other components. Numerical comparison with the recently proposed two-stage estimator [Ann. Statist. 32 (2004) 2412–2443] is also made.
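The smoothed-likelihood idea can be seen in its simplest one-dimensional, local-constant form: at each point, a kernel-weighted Bernoulli log-likelihood is maximized by Newton steps. This is only a sketch of the principle, not the paper's system of nonlinear integral equations; the names are hypothetical:

```python
import numpy as np

def local_logistic(x, y, g, h, n_newton=25):
    """Kernel-weighted (local constant) logistic fit at the point g:
    maximizes the smoothed Bernoulli log-likelihood over a scalar theta."""
    w = np.exp(-0.5 * ((x - g) / h) ** 2)   # Gaussian kernel weights
    theta = 0.0
    for _ in range(n_newton):
        p = 1.0 / (1.0 + np.exp(-theta))
        score = np.sum(w * (y - p))         # derivative of the weighted log-lik
        info = np.sum(w) * p * (1 - p)      # weighted Fisher information
        theta += score / info               # Newton step
    return 1.0 / (1.0 + np.exp(-theta))     # local probability estimate
```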

Citation Context

...ee [10] for a discussion on this. Mammen and Nielsen [17] considered a general class of nonlinear regression and discussed some estimation principles including the smooth backfitting. Mammen and Park [18] proposed several bandwidth selection methods for smooth backfitting, and Mammen and Park [19] provided a simplified version of the local linear smooth backfitting estimator in additive models. ...

A SIMPLE SMOOTH BACKFITTING METHOD FOR ADDITIVE MODELS

by Enno Mammen, Byeong U. Park , 2007
Abstract - Cited by 8 (5 self)
In this paper a new smooth backfitting estimate is proposed for additive regression models. The estimate has the simple structure of Nadaraya–Watson smooth backfitting but at the same time achieves the oracle property of local linear smooth backfitting. Each component is estimated with the same asymptotic accuracy as if the other components were known.
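The contrast drawn here between Nadaraya–Watson simplicity and local linear accuracy rests on a basic property: a local linear smoother reproduces linear functions exactly, which removes the design-dependent bias of the local constant fit. A minimal sketch of a one-dimensional local linear smoother (hypothetical names, not the paper's estimator):

```python
import numpy as np

def local_linear(x, y, grid, h):
    """Local linear smoother: weighted least-squares line at each grid point;
    the intercept is the fitted value."""
    est = np.empty(len(grid))
    for i, g in enumerate(grid):
        w = np.exp(-0.5 * ((x - g) / h) ** 2)        # Gaussian kernel weights
        D = np.column_stack([np.ones_like(x), x - g])
        A = D.T @ (w[:, None] * D)                   # weighted normal equations
        b = D.T @ (w * y)
        est[i] = np.linalg.solve(A, b)[0]            # intercept = fit at g
    return est
```

On exactly linear data the fit coincides with the line at every grid point, whereas a Nadaraya–Watson fit would be biased wherever the design density is non-uniform.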

Citation Context

...itting also achieves the oracle bounds. This has been shown for smooth backfitting estimates based on local linear fitting (see [9]). For practical implementations of smooth backfitting, see [12] and [11]. Some two-step procedures have been proposed for additive models. Christopeit and Hoderlein [2] use local quasi-differencing in the second step, an idea coming from effi...

Continuously additive models for nonlinear functional regression

by Hans-Georg Müller, Yichao Wu - Biometrika, 2012
Abstract - Cited by 5 (0 self)
We introduce continuously additive models, which can be motivated as extensions of additive regression models with vector predictors to the case of infinite-dimensional predictors. This approach provides a class of flexible functional nonlinear regression models, where random predictor curves are coupled with scalar responses. In continuously additive modeling, integrals taken over a smooth surface along graphs of predictor functions relate the predictors to the responses in a nonlinear fashion. We use tensor product basis expansions to fit the smooth regression surface that characterizes the model. In a theoretical investigation, we show that the predictions obtained from fitting continuously additive models are consistent and asymptotically normal. We also consider extensions to generalized responses. The proposed approach outperforms existing functional regression models in simulations and data illustrations.

Citation Context

...9). Additive models (Friedman & Stuetzle 1981; Stone 1985) have been successfully used for many regression situations that involve continuous predictors and both continuous and generalized responses (Mammen & Park 2005; Yu et al. 2008; Carroll et al. 2008). The functional predictors we consider in the proposed time-additive approach to functional regression are assumed to be observed on their entire domain, usually...

Nonparametric Models in Binary Choice Fixed Effects Panel Data

by Stefan Hoderlein, Enno Mammen, Kyusang Yu
Abstract - Cited by 3 (1 self)
In this paper we extend the fixed effects approach to deal with endogeneity arising from persistent unobserved heterogeneity to nonlinear panel data with nonparametric components. Specifically, we propose a nonparametric procedure that generalizes Chamberlain’s (1984) conditional logit approach. We develop an estimator based on nonlinear stochastic integral equations and provide the asymptotic property of the estimator and an iterative algorithm to implement the estimator. We analyze the small sample behavior of the estimator through a Monte Carlo study, and consider the decision to retire as an illustrative application. JEL Classification: C14; C23

Nonparametric additive . . . Measured Data

by Raymond J. Carroll, Arnab Maity, Enno Mammen, Kyusang Yu, 2009
Abstract
We develop an easily computed smooth backfitting algorithm for additive model fitting in repeated measures problems. Our methodology easily copes with various settings, such as when some covariates are the same over repeated response measurements. We allow for a working covariance matrix for the regression errors, showing that our method is most efficient when the correct covariance matrix is used. The component functions achieve the known asymptotic variance lower bound for the scalar argument case. Smooth backfitting also leads directly to design-independent biases in the local linear case. Simulations show our estimator has smaller variance than the usual kernel estimator. This is also illustrated by an example from nutritional epidemiology.


Additive Models: Extensions and Related Models

by Enno Mammen, Byeong U Park, Melanie Schienle
Abstract
We give an overview over smooth backfitting type estimators in additive models. Moreover we illustrate their wide applicability in models closely related to additive models such as nonparametric regression with dependent error variables where the errors can be transformed to white noise by a linear transformation, nonparametric regression with repeatedly measured data, nonparametric panels with fixed effects, simultaneous nonparametric equation models, and non- and semiparametric autoregression and GARCH models. We also discuss extensions to varying coefficient models, additive models with missing observations, and the case of nonstationary covariates.

Citation Context

... The following result holds for the asymptotic distribution of each component function $\hat f_j(x^j)$, $j = 1, \ldots, d$:

$$\sqrt{n h_j}\left(\hat f_j(x^j) - f_j(x^j) - \beta_j(x^j)\right) \xrightarrow{d} N\left(0,\ \int K^2(u)\,du\ \frac{\sigma_j^2(x^j)}{p_{X^j}(x^j)}\right). \qquad (2.7)$$

Here the asymptotic bias terms $\beta_j(x^j)$ are defined as the minimizers over $(\beta_1, \ldots, \beta_d)$ of

$$\int \left[\beta(x) - \beta_1(x^1) - \cdots - \beta_d(x^d)\right]^2 p_X(x)\,dx$$

under the constraint

$$\int \beta_j(x^j)\, p_{X^j}(x^j)\,dx^j = \frac{1}{2} h_j^2 \int \left[2 f_j'(x^j)\, p_{X^j}'(x^j) + f_j''(x^j)\, p_{X^j}(x^j)\right] dx^j \int u^2 K(u)\,du, \qquad (2.8)$$

where $p_X$ is the joint density of $X = (X^1, \ldots, X^d)$ and

$$\beta(x) = \frac{1}{2} \sum_{j=1}^d h_j^2 \left[2 f_j'(x^j)\, \frac{\partial \log p_X}{\partial x^j}(x) + f_j''(x^j)\right] \int u^2 K(u)\,du.$$

In [37] and [40] this asymptotic statement has been proved for the case that $f_j$ is estimated on a compact interval $I_j$. The conditions include a boundary modification of the kernel. Specifically, the convolution kernel $h_j^{-1} K(h_j^{-1}(X_i^j - x^j))$ is replaced by

$$K_{h_j}(X_i^j, x^j) = \frac{h_j^{-1} K(h_j^{-1}(X_i^j - x^j))}{\int_{I_j} h_j^{-1} K(h_j^{-1}(X_i^j - u^j))\,du^j}.$$

Then it holds that $\int_{I_j} K_{h_j}(X_i^j, x^j)\,dx^j = 1$. In particular, this implies $\int_{I_j} p_{X^j, X^k}(x^j, x^k)\,dx^j = p_{X^k}(x^k)$ and $\int_{I_j} p_{X^j}(x^j)\,dx^j = 1$ if one replaces $h_j^{-1} K(h_j^{-1}(X_i^j - x^j))$ by $K_{h_j}(X_i^j, x^j)$ in the definitions of the kernel density estimators. In f...
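The boundary modification of the kernel can be checked numerically: dividing the kernel by its integral over the interval restores total mass one even when the data point sits near an edge. A minimal sketch with a Gaussian kernel (the cited papers typically assume compactly supported kernels; the Gaussian here is an illustrative substitution):

```python
import numpy as np

def boundary_kernel(xi, xgrid, h):
    """Kernel weights K_h(xi, .) on a grid covering the interval I,
    renormalized so they integrate to one in x for every data point xi."""
    k = np.exp(-0.5 * ((xgrid - xi) / h) ** 2) / (h * np.sqrt(2 * np.pi))
    # trapezoid rule for the integral of k over the interval
    norm = np.sum((k[1:] + k[:-1]) * np.diff(xgrid)) / 2
    return k / norm
```

Without the renormalization, a data point at distance less than a few bandwidths from the boundary would lose kernel mass outside the interval, which is exactly the effect the modification removes.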


Developed at and hosted by The College of Information Sciences and Technology

© 2007-2019 The Pennsylvania State University