## Bayes Estimate and Inference for Entropy and Information Index of Fit

Citations: 2 (1 self)

### BibTeX

    @MISC{Mazzuchi_bayesestimate,
        author = {Thomas A. Mazzuchi and Ehsan S. Soofi and Refik Soyer},
        title = {Bayes Estimate and Inference for Entropy and Information Index of Fit},
        year = {}
    }


### Abstract

Kullback-Leibler information is widely used for developing indices of distributional fit. The most celebrated of such indices is Akaike’s AIC, which is derived as an estimate of the minimum Kullback-Leibler information between the unknown data-generating distribution and a parametric model. In the derivation of AIC, the entropy of the data-generating distribution is bypassed because it is free from the parameters. Consequently, AIC-type measures provide criteria for model comparison purposes only and do not provide diagnostic information about the model fit. A nonparametric estimate of the entropy of the data-generating distribution is needed for assessing the model fit. Several entropy estimates are available and have been used for frequentist inference about information fit indices. A few entropy-based fit indices have been suggested for Bayesian inference. This paper develops a class of entropy estimates and provides a procedure for Bayesian inference on the entropy and a fit index. For the continuous case, we define a quantized entropy that approximates and converges to the entropy integral. The quantized entropy includes some well known measures of sample entropy and the existing Bayes entropy estimates as its special cases. For inference about the fit, we use the candidate model as the expected distribution in the Dirichlet process prior and derive the posterior mean of the quantized entropy as the Bayes estimate. The maximum entropy characterization of the candidate model is then used to derive the prior and posterior distributions for the Kullback-Leibler information index of fit. The consistency of the proposed Bayes estimates for the entropy and for the information index is shown. As by-products, the procedure also produces priors and posteriors for the model parameters and the moments.
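The estimator described in the abstract can be sketched concretely. Below is a minimal, illustrative Python sketch (not the authors' code; the function and argument names are ours) of a histogram-type quantized entropy whose cell probabilities are posterior means under a Dirichlet process prior D(B, F*): each cell increment mixes the prior guess F* with the empirical frequencies as (B·∆F*_k + n·∆F̂_k)/(B + n).

```python
import numpy as np


def quantized_entropy_bayes(x, bin_edges, prior_cdf, B=1.0):
    """Sketch of a Bayes estimate of a quantized (histogram-type) entropy
    under a Dirichlet process prior D(B, F*).

    The posterior mean of each cell probability mixes the prior guess with
    the empirical cell frequencies:
        E[dF_k | x] = (B * dF*_k + n * dF^_k) / (B + n),
    and the quantized entropy is -sum_k dF_k * log(dF_k / w_k),
    where w_k is the width of cell k.
    """
    x = np.asarray(x)
    n = len(x)
    widths = np.diff(bin_edges)
    # empirical cell probabilities
    counts, _ = np.histogram(x, bins=bin_edges)
    emp = counts / n
    # prior-guess cell probabilities from the candidate model's CDF
    prior = np.diff(prior_cdf(np.asarray(bin_edges)))
    post = (B * prior + n * emp) / (B + n)  # posterior-mean increments
    mask = post > 0
    return -np.sum(post[mask] * np.log(post[mask] / widths[mask]))
```

For a large standard-normal sample with a normal candidate model, the estimate approaches the differential entropy 0.5·log(2πe) ≈ 1.419, consistent with the convergence of the quantized entropy to the entropy integral.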

### Citations

9231 | Elements of Information Theory - Cover, Thomas - 1990 |

2423 | A new look at the statistical model identification - Akaike - 1974 |

787 | A Bayesian Analysis of Some Nonparametric Problems - Ferguson - 1973
Citation Context: ..., x_n) from unknown F, the Bayes entropy estimate is defined by the mean of a posterior distribution of the quantized entropy (11), H̃_{m,q}(F) ≡ E[H_{m,q}(F) | x]. We use the Dirichlet process prior (Ferguson 1973) for the unknown F, F(ξ_k) | B, F* ~ D(B, F*), where F* is a prior guess for F and B > 0 is the strength-of-belief parameter. For any partition (9) of ℝ, the increments ∆F_k, k = 1, ..., q, have the Diric...
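The Dirichlet-distributed increments described in this context also allow inference beyond the posterior mean: one can sample the posterior Dirichlet increments and evaluate the quantized entropy on each draw. The sketch below is illustrative only (names are ours), assuming a fixed partition with cell counts n_k, prior cell probabilities ∆F*_k, and posterior Dirichlet parameters B·∆F*_k + n_k.

```python
import numpy as np

rng = np.random.default_rng(0)


def posterior_entropy_draws(counts, prior_probs, widths, B=1.0, draws=2000):
    """Illustrative sketch: under a Dirichlet process prior D(B, F*), the
    cell increments over a fixed partition are posterior-Dirichlet with
    parameters B*dF*_k + n_k (Ferguson 1973). Sampling them yields a
    posterior sample of the quantized entropy."""
    alpha = B * np.asarray(prior_probs) + np.asarray(counts)
    dF = rng.dirichlet(alpha, size=draws)  # posterior increment draws
    with np.errstate(divide="ignore", invalid="ignore"):
        terms = np.where(dF > 0, dF * np.log(dF / widths), 0.0)
    return -terms.sum(axis=1)  # one quantized entropy per draw
```

The resulting draws can be summarized by their mean (the Bayes estimate) or by quantiles, giving a posterior credible interval for the entropy.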

203 | Bayesian measures of model complexity and fit - Spiegelhalter, Best, et al. - 2002 |

180 | Computing Dirichlet tessellations - Bowyer - 1981
Citation Context: ... Theorem 2: Let F be a distribution with an absolutely continuous density f. Then H̃_{m,q}(F) based on a histogram-type partition is consistent. Another useful partition is the Dirichlet tessellation (Bowyer 1981), defined by T_k = {x : |x − ξ_k| < |x − ξ_j| for all j ≠ k}. An advantage of the Dirichlet tessellation is that each data point x_i, i = 1, ..., n, is placed in an interval (ξ_i, ξ_{i+1}), i = 0, ..., n − 1. The ...
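In one dimension, the Dirichlet (Voronoi) tessellation T_k = {x : |x − ξ_k| < |x − ξ_j| for all j ≠ k} reduces to intervals bounded by the midpoints of consecutive sorted points. A minimal illustrative sketch (function name is ours):

```python
import numpy as np


def dirichlet_tessellation_1d(points, lo, hi):
    """One-dimensional Dirichlet (Voronoi) tessellation: the cell of point
    x_k is the set of x closer to x_k than to any other point, i.e. the
    interval bounded by the midpoints of consecutive sorted points,
    truncated to [lo, hi]."""
    p = np.sort(np.asarray(points, dtype=float))
    mids = (p[:-1] + p[1:]) / 2.0
    edges = np.concatenate(([lo], mids, [hi]))
    return edges  # cell k is (edges[k], edges[k+1]) and contains p[k]
```

For example, points {1, 3, 7} on [0, 10] give cell boundaries at the midpoints 2 and 5, so the cells are (0, 2), (2, 5), (5, 10).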

81 | Expected information as expected utility - Bernardo - 1979
Citation Context: ...ave used a histogram-type entropy estimate for model fitting in the generalized gamma family. Bayesian estimation of entropy is closely related to the notion of expected information in Bayesian analysis (Bernardo 1979 and Zellner 1991) and to the notion of average entropy in communication theory (Campbell 1995). This paper develops a class of entropy estimates and provides a procedure for Bayesian inference on...

73 | Nonparametric entropy estimation: an overview - Beirlant, Dudewicz, van der Meulen, et al. - 1997

68 | A test for normality based on sample entropy - Vasicek - 1976 |

32 | The direct use of likelihood for significance testing - Dempster - 1974 |

32 | On the Estimation of Entropy - Hall, Morton - 1993
Citation Context: ... Σ_{k=1}^{q} ∆F̃_k (log B̃ + log ∆F̃_{m,k} − log ξ_{m,k}). Proof of Theorem 2: For large n the sample dominates the prior in (13) and ∆F̃_{0,k} → ∆F̂_k as n → ∞. Then the consistency of the histogram entropy (Hall and Morton 1993) gives the result. Proof of Theorem 3: For large n the sample dominates the prior in (13), i.e., ∆F̃_{m,i} → ∆F̂_{m,i} as n → ∞. (i) Since for m = 0, ∆F_{m,i} = ∆F_i and n is large, the approximation (15) is ...

15 | Diagnostic Measures for Model Criticism - Carota, Parmigiani, et al. - 1996 |

12 | van der Meulen - 1971

12 | Information distinguishability with application to Analysis of failure data - Soofi, Ebrahimi, et al. - 1995 |

8 | Implications of reference priors for prior information and for sample size - Clarke - 1996 |

8 | On entropy-based goodness-of-fit tests - Gokhale - 1983 |

7 | Testing Exponentiality Based on Kullback-Leibler Information - Ebrahimi, Soofi, et al. - 1992 |

5 | On the overall sensitivity of the posterior distribution to its inputs - Clarke, Gustafson - 1998 |

5 | Computations of maximum entropy Dirichlet for modeling lifetime data, Computational Statistics and Data Analysis 32 - Mazzuchi, Soofi, et al. - 2000
Citation Context: ...opy that approximates and converges to the entropy integral. The quantized entropy includes some well known measures of sample entropy and the existing Bayes entropy estimates (Gill and Joanes 1979, Mazzuchi et al. 2000, Dadpay et al. 2006) as its special cases. We explore the large sample properties of the Bayes estimates of entropy and the fit index. For constructing information indices of fit, the parametric mod...

5 | An information criterion for likelihood selection - Yuan, Clarke - 1999 |

4 | Information Measures for Generalized Gamma Family - Dadpay, Soofi, et al. - 2006
Citation Context: ...nd converges to the entropy integral. The quantized entropy includes some well known measures of sample entropy and the existing Bayes entropy estimates (Gill and Joanes 1979, Mazzuchi et al. 2000, Dadpay et al. 2006) as its special cases. We explore the large sample properties of the Bayes estimates of entropy and the fit index. For constructing information indices of fit, the parametric model f*(x|θ) is selec...

3 | Information theoretic regression methods. Advances in Econometrics: Applying Maximum Entropy to Econometric Problems - Soofi - 1997 |

2 | Averaging Entropy - Campbell - 1995
Citation Context: ... estimation of entropy is closely related to the notion of expected information in Bayesian analysis (Bernardo 1979 and Zellner 1991) and to the notion of average entropy in communication theory (Campbell 1995). This paper develops a class of entropy estimates and provides a procedure for Bayesian inference on the entropy and a fit index. For the continuous case, we define a quantized entropy that approxim...

1 | A Test for Normality Based - Arizono, Ohta - 1989 |

1 | Bayesian Methods and Entropy - Zellner - 1991
Citation Context: ...am-type entropy estimate for model fitting in the generalized gamma family. Bayesian estimation of entropy is closely related to the notion of expected information in Bayesian analysis (Bernardo 1979 and Zellner 1991) and to the notion of average entropy in communication theory (Campbell 1995). This paper develops a class of entropy estimates and provides a procedure for Bayesian inference on the entropy and ...