## Information and Posterior Probability Criteria for Model Selection in Local Likelihood Estimation (1998)

Venue: | Journal of the American Statistical Association |

Citations: | 2 (0 self) |

### BibTeX

```bibtex
@ARTICLE{Irizarry98informationand,
  author  = {Rafael A. Irizarry},
  title   = {Information and Posterior Probability Criteria for Model Selection in Local Likelihood Estimation},
  journal = {Journal of the American Statistical Association},
  year    = {1998},
  volume  = {96},
  pages   = {303--315}
}
```


### Abstract

In this paper we propose a modification to the methods used to motivate many information and posterior probability criteria for the weighted likelihood case. We derive weighted versions of two of the most widely known criteria, the AIC and BIC. Via a simple modification, the criteria are also made useful for window-span selection. The usefulness of the weighted versions of these criteria is demonstrated through a simulation study and an application to three data sets. KEY WORDS: Information Criteria; Posterior Probability Criteria; Model Selection; Local Likelihood. 1. INTRODUCTION Local regression has become a popular method for smoothing scatterplots and for nonparametric regression in general. It has proven to be a useful tool for finding structure in data sets (Cleveland and Devlin 1988). Local regression is a method for smoothing scatterplots (x_i, y_i), i = 1, ..., n, in which the fitted value at x_0 is the value of a polynomial fit to the data by weighted least squares, where the weight given to (x_i, y_i) is related to the distance between x_i and x_0. Stone (1977) shows that estimates obtained using local regression methods have desirable theoretical properties. More recently, Fan (1993) studied minimax properties of local linear regression. Tibshirani and Hastie (1987) extend the ideas of local regression to a local likelihood procedure, designed for nonparametric regression modeling in situations where weighted least squares is inappropriate as an estimation method, for example with binary data. Local regression may be viewed as a special case of local likelihood estimation. Tibshirani and Hastie (1987), Staniswalis (1989), and Loader (1999) apply local likelihood estimation to several types of data where local regression...
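The weighted least squares fit described in the introduction can be sketched in a few lines. This is a hypothetical illustration, not the paper's own code: the tricube kernel and nearest-neighbor span follow the LOESS convention of Cleveland and Devlin (1988), and all names are mine.

```python
import numpy as np

def local_linear_fit(x, y, x0, span=0.3):
    """Local linear regression at x0: weighted least squares with
    tricube weights over the span*n nearest neighbors (LOESS-style)."""
    n = len(x)
    k = max(2, int(np.ceil(span * n)))             # points in the local window
    d = np.abs(x - x0)
    h = np.sort(d)[k - 1]                          # bandwidth = k-th smallest distance
    w = np.clip(1 - (d / h) ** 3, 0, None) ** 3    # tricube kernel weights
    X = np.column_stack([np.ones(n), x - x0])      # degree-1 design, centered at x0
    WX = w[:, None] * X
    beta = np.linalg.solve(X.T @ WX, WX.T @ y)     # weighted normal equations
    return beta[0]                                 # intercept = fitted value at x0

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 200)
y = 2 + 3 * x + rng.normal(0, 0.05, 200)           # linear truth + small noise
print(round(local_linear_fit(x, y, 0.25), 2))      # ≈ 2 + 3 * 0.25 = 2.75
```

Because the truth here is exactly linear, any reasonable span recovers 2.75; the window-span selection problem the paper addresses arises when the truth is curved and the span trades bias against variance.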

### Citations

2307 | Estimating the dimension of a model - SCHWARZ - 1978 |

1235 | Information theory and an extension of the maximum likelihood principle - Akaike - 1973 |

1154 | Information theory and statistics - Kullback - 1959 |

Citation Context ...t is "nearest" to the true model, defined by (5), based on the observed data y. The principle behind information criteria is to define "nearest" using the Kullback-Leibler discrimination information (Kullback 1959): K{g_Y(y) : f_Y(y | X, β)} = ∫ g_Y(y) {log g_Y(y) − log f_Y(y | X, β)} dy. (7) As done by Sawa (1978), we say that M_q is the "nearest" or best approximating model amongst the models defined by... |
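The discrimination information in (7) can be checked numerically. A hedged sketch, assuming Gaussian densities, for which the closed form K{N(0,1) : N(μ,1)} = μ²/2 is known:

```python
import numpy as np

def normal_pdf(y, mu=0.0):
    return np.exp(-0.5 * (y - mu) ** 2) / np.sqrt(2 * np.pi)

# K{g : f} = ∫ g(y) {log g(y) − log f(y)} dy, approximated on a fine grid
y = np.linspace(-10.0, 10.0, 20001)
dy = y[1] - y[0]
g = normal_pdf(y)             # "true" density g_Y
f = normal_pdf(y, mu=1.0)     # approximating model f_Y
kl = np.sum(g * (np.log(g) - np.log(f))) * dy
print(round(kl, 4))           # closed form: 1**2 / 2 = 0.5
```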

255 | Locally weighted regression: An approach to regression analysis by local fitting - Cleveland, Devlin - 1988 |

Citation Context ...INTRODUCTION Local regression has become a popular method for smoothing scatterplots and for nonparametric regression in general. It has proven to be a useful tool in finding structure in datasets (Cleveland and Devlin 1988). Local regression estimation is a method for smoothing scatterplots (x_i, y_i), i = 1, ..., n, in which the fitted value at x_0 is the value of a polynomial fit to the data using weighted least... |

183 | Regression and time series model selection in small samples - Hurvich, Tsai - 1989 |

Citation Context ...l work is for independent identically distributed (IID) data; however, it is extended to a regression-type setting in a straightforward way (Hurvich and Tsai 1989). The approach is to choose the approximating model producing the estimates β̂_p that minimize E_Y[K{g_Y(y) : f_Y(y | X, β̂_p)}], with the expectation taken under the true distribution of the Y. Sinc... |
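In the Gaussian regression case, this "minimize expected Kullback-Leibler distance" recipe reduces to the familiar AIC, n·log(RSS/n) + 2·(number of parameters). A hypothetical sketch on toy polynomial data of my own (not from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
x = np.linspace(-1, 1, n)
y = 1 + 2 * x - 3 * x**2 + rng.normal(0, 0.2, n)      # true model: degree 2

def aic(degree):
    X = np.vander(x, degree + 1)                      # polynomial design matrix
    _, rss, *_ = np.linalg.lstsq(X, y, rcond=None)
    return n * np.log(rss[0] / n) + 2 * (degree + 2)  # params: coefficients + variance

aics = {d: aic(d) for d in range(1, 6)}
best = min(aics, key=aics.get)
print(best)   # degree 2 is typically selected
```

The +2 penalty per extra parameter is what stands in for the expectation over Y; the paper's contribution is the weighted analogue of this penalty for local estimation.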

176 | Model selection and Akaike’s Information Criterion (AIC): The general theory and its analytical extensions - Bozdogan - 1987 |

149 | Local Regression and Likelihood - Loader - 1999 |

129 | Local linear regression smoothers and their minimax efficiencies - Fan - 1993 |

57 | Musical Sound Signals Analysis/Synthesis: Sinusoidal+Residual and Elementary Waveform Models - Rodet - 1997 |

36 | Local likelihood density estimation - Loader - 1996 |

32 | On the kernel estimate of a regression function in likelihood based models - Staniswalis - 1989 |

30 | Generalized Additive Models: Some Applications - Hastie, Tibshirani - 1987 |

30 | Information Criteria for Discriminating Among Alternative Regression Models - Sawa - 1978 |

27 | Distribution of informational statistics and a criterion of model fitting. Suri-Kagaku (Mathematical Sciences) - Takeuchi - 1976 |

22 | Asymptotic normality of prediction error estimators for approximate system models - Ljung, Caines - 1979 |

19 | Optimal choice of AR and MA parts in autoregressive moving average models - Kashyap - 1982 |

18 | The information in Contingency Tables - Gokhale, Kullback - 1978 |

Citation Context ...x_0, it seems appropriate to consider a discrepancy measure that takes this into account. In this paper we propose the use of a weighted version of the Kullback-Leibler discrimination information (Gokhale and Kullback 1978) and use this to derive appropriate model selection information criteria. 3.2 Weighted Information Criteria Because in local estimation we are interested in estimating only θ_0, we say, as done by S... |
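The weighted-likelihood setting this context refers to can be illustrated with local likelihood for binary data, where weighted least squares is inappropriate: at each target point x_0, maximize the kernel-weighted log-likelihood Σ_i w_i(x_0) log f(y_i | θ). A minimal sketch, assuming a Gaussian kernel and a local-constant logistic model; all names are hypothetical:

```python
import numpy as np

def local_logistic(x, y, x0, h=0.2, steps=20):
    """Maximize sum_i w_i * [y_i*theta - log(1 + e^theta)] by Newton's method;
    theta is the local log-odds at x0."""
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)    # Gaussian kernel weights
    theta = 0.0
    for _ in range(steps):
        p = 1 / (1 + np.exp(-theta))
        theta += np.sum(w * (y - p)) / (np.sum(w) * p * (1 - p))  # score / information
    return 1 / (1 + np.exp(-theta))           # estimated P(Y = 1 | x0)

rng = np.random.default_rng(2)
x = rng.uniform(0, 1, 5000)
p_true = 0.2 + 0.6 * x                        # smoothly varying success probability
y = (rng.uniform(size=5000) < p_true).astype(float)
print(round(local_logistic(x, y, 0.5), 2))    # true value: 0.2 + 0.6*0.5 = 0.5
```

For a local-constant fit the maximizer is simply the kernel-weighted mean of y on the probability scale; the Newton loop is shown because local-polynomial versions of θ need it.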

16 | Regression and time series model selection using variants of the Schwarz information criterion - Neath, Cavanaugh - 1997 |

14 | A comparison of the information and posterior probability criteria for model selection - Chow - 1981 |

14 | Statistical aspects of model selection - Shibata - 1989 |

13 | Consistent nonparametric regression. The Annals of Statistics - Stone - 1977 |

3 | Mixture-model cluster analysis using a new informational complexity and model selection criteria. In: H. Bozdogan, (Eds.), multivariate statistical modeling, vol 2 - Bozdogan - 1994 |

2 | Information Theoretic Regression Methods - Soofi - 1997 |


1 | On the application of linear models to local regions in regression - Ohtaki - 1985 |