Bootstrap Estimate of Kullback-Leibler Information for Model Selection

Report Number
424
Authors
Ritei SHIBATA
Citation
Probability Theory and Related Fields, 106, 299-329, 1996
Abstract

Estimation of Kullback-Leibler information is a crucial part of deriving a statistical model selection procedure which, like AIC, is based on the likelihood principle. To discriminate between nested models, we have to estimate the Kullback-Leibler information up to the order of a constant, while the Kullback-Leibler information itself is of the order of the number of observations. The correction term employed in AIC is one way to fulfill this requirement; it is in fact a simple-minded bias correction of the maximum log likelihood. However, there is no assurance that such a bias correction yields a good estimate of the Kullback-Leibler information. In this paper, bootstrap-type estimation is considered as an alternative. We first show that the bootstrap estimates proposed by Efron (1983, 1986), Efron and Tibshirani (1993), and Cavanaugh and Shumway (1994) are at least asymptotically equivalent, and that there exist many other equivalent bootstrap estimates. We also show that all such methods are asymptotically equivalent to a non-bootstrap method, known as TIC (Takeuchi's Information Criterion), which is a generalization of AIC. To examine the differences among these asymptotically equivalent methods for small samples, we give some simulation results in the last section.
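The kind of bias correction discussed in the abstract can be illustrated with a small sketch (not the paper's own procedure): the maximum log likelihood overestimates the expected log likelihood of the fitted model, AIC corrects it by subtracting the number of parameters k, and a bootstrap alternative estimates the same optimism by refitting on resamples. The Gaussian model, sample size, and number of resamples below are illustrative choices, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def loglik(x, mu, sigma):
    # Gaussian log likelihood of sample x under N(mu, sigma^2)
    return np.sum(-0.5 * np.log(2 * np.pi * sigma**2)
                  - (x - mu)**2 / (2 * sigma**2))

def mle(x):
    # MLE of (mu, sigma); np.std defaults to ddof=0, i.e. the MLE
    return x.mean(), x.std()

n = 200
x = rng.normal(1.0, 2.0, size=n)          # observed sample
mu_hat, sig_hat = mle(x)
max_ll = loglik(x, mu_hat, sig_hat)        # maximum log likelihood

# Bootstrap estimate of the optimism: refit on each resample x*,
# then compare in-resample fit against fit on the original sample,
# treating the sample as a stand-in for the true distribution.
B = 500
bias = 0.0
for _ in range(B):
    xb = rng.choice(x, size=n, replace=True)
    mu_b, sig_b = mle(xb)
    bias += loglik(xb, mu_b, sig_b) - loglik(x, mu_b, sig_b)
bias /= B

# AIC instead uses the fixed correction k (= 2 here: mu and sigma).
print(bias)
```

For a well-specified model like this one, the bootstrap bias estimate is typically close to the AIC correction k = 2, which is the asymptotic equivalence the abstract refers to; for misspecified models the two can differ, which is where TIC's more general correction enters.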
