## A Large-Sample Model Selection Criterion Based on Kullback's Symmetric Divergence (1999)

Venue: Statistics & Probability Letters

Citations: 11 (1 self)

### BibTeX

@ARTICLE{Cavanaugh99alarge-sample,

author = {Joseph E. Cavanaugh},

title = {A Large-Sample Model Selection Criterion Based on Kullback's Symmetric Divergence},

journal = {Statistics \& Probability Letters},

year = {1999},

volume = {42},

pages = {333--343}

}

### Abstract

The Akaike information criterion, AIC, is a widely known and extensively used tool for statistical model selection. AIC serves as an asymptotically unbiased estimator of a variant of Kullback's directed divergence between the true model and a fitted approximating model. The directed divergence is an asymmetric measure of separation between two statistical models, meaning that an alternate directed divergence may be obtained by reversing the roles of the two models in the definition of the measure. The sum of the two directed divergences is Kullback's symmetric divergence. Since the symmetric divergence combines the information in two related though distinct measures, it functions as a gauge of model disparity which is arguably more sensitive than either of its individual components. With this motivation, we propose a model selection criterion which serves as an asymptotically unbiased estimator of a variant of the symmetric divergence between the true model and a fitted approximating model. We examine the performance of the criterion relative to other well-known criteria in a simulation study.

Keywords: AIC, Akaike information criterion, I-divergence, J-divergence, Kullback-Leibler information, relative entropy.

Correspondence: Joseph E. Cavanaugh, Department of Statistics, 222 Math Sciences Bldg., University of Missouri, Columbia, MO 65211.

† This research was supported by NSF grant DMS-9704436.
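To make the asymmetry concrete, the directed and symmetric divergences can be illustrated for two univariate normal densities, where the Kullback-Leibler divergence has a standard closed form. This is an illustrative sketch, not an example from the paper itself:

```python
import math

def kl_normal(mu1, s1, mu2, s2):
    """Directed (Kullback-Leibler) divergence I(p, q) between univariate
    normals p = N(mu1, s1^2) and q = N(mu2, s2^2), via the closed form
    log(s2/s1) + (s1^2 + (mu1 - mu2)^2) / (2 s2^2) - 1/2."""
    return math.log(s2 / s1) + (s1**2 + (mu1 - mu2)**2) / (2 * s2**2) - 0.5

def j_divergence(mu1, s1, mu2, s2):
    """Kullback's symmetric divergence J(p, q): the sum of the two
    directed divergences, I(p, q) + I(q, p)."""
    return kl_normal(mu1, s1, mu2, s2) + kl_normal(mu2, s2, mu1, s1)

# Reversing the roles of the two models changes the directed divergence,
# but the symmetric divergence is the same in either order.
p, q = (0.0, 1.0), (1.0, 2.0)
print(kl_normal(*p, *q))      # I(p, q)
print(kl_normal(*q, *p))      # I(q, p), generally different from I(p, q)
print(j_divergence(*p, *q))   # J(p, q) = I(p, q) + I(q, p)
```

Running the sketch shows the two directed divergences differ while the symmetric divergence is invariant to the ordering of the models, which is the property the abstract appeals to.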