Results 1 - 8 of 8
Diagnostic Measures for Model Criticism
Journal of the American Statistical Association, 1996
Abstract
Cited by 13 (1 self)
In this article we present the general outlook and discuss general families of elaborations for use in practice; the exponential connection elaboration plays a key role. We then describe model elaborations for use in diagnosing departures from normality, goodness of fit in generalized linear models, variable selection in regression, and outlier detection. We illustrate our approach with two applications.
A Minimally Informative Likelihood for Decision Analysis: Robustness and Illustration
Canadian Journal of Statistics, 1999
Abstract
Cited by 3 (2 self)
Here we use a class of likelihoods that makes weak assumptions on data-generating mechanisms. These likelihoods may be appropriate for data sets where it is difficult to propose physically motivated models. We give some properties of these likelihoods, showing how they can be computed numerically by use of the Blahut-Arimoto algorithm. Then, in the context of a data set for which no plausible physical model is apparent, we show how these likelihoods give useful inferences for the location of a distribution. The plausibility of the inferences is enhanced by the extensive robustness analysis these likelihoods permit.
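The Blahut-Arimoto algorithm mentioned in this abstract is, in its classical form, an alternating-maximization scheme for the capacity of a discrete memoryless channel. The sketch below shows that classical version only, not the paper's specific adaptation to minimally informative likelihoods; the channel matrix and tolerances are illustrative assumptions.

```python
import numpy as np

def blahut_arimoto(W, tol=1e-9, max_iter=10000):
    """Classical Blahut-Arimoto iteration for the capacity of a
    discrete memoryless channel with transition matrix W[x, y] = P(y|x).
    Returns (capacity in nats, capacity-achieving input distribution)."""
    n = W.shape[0]
    p = np.full(n, 1.0 / n)              # input distribution, initialized uniform
    for _ in range(max_iter):
        q = p @ W                        # induced output distribution
        # per-input Kullback-Leibler divergence D(W[x, :] || q)
        with np.errstate(divide="ignore", invalid="ignore"):
            log_ratio = np.where(W > 0, np.log(W / q), 0.0)
        D = np.sum(W * log_ratio, axis=1)
        lower, upper = p @ D, D.max()    # capacity is bracketed by these
        if upper - lower < tol:
            break
        p = p * np.exp(D)                # multiplicative reweighting step
        p /= p.sum()
    return p @ D, p

# Binary symmetric channel with crossover probability 0.1
cap, p_opt = blahut_arimoto(np.array([[0.9, 0.1], [0.1, 0.9]]))
```

For the binary symmetric channel the iteration recovers the known closed form, capacity log 2 - H(0.1) in nats with a uniform optimal input, which makes it a convenient correctness check.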
Bayes Estimate and Inference for Entropy and Information Index of Fit
Abstract
Cited by 2 (1 self)
Kullback-Leibler information is widely used for developing indices of distributional fit. The most celebrated of such indices is Akaike's AIC, which is derived as an estimate of the minimum Kullback-Leibler information between the unknown data-generating distribution and a parametric model. In the derivation of AIC, the entropy of the data-generating distribution is bypassed because it is free of the parameters. Consequently, AIC-type measures provide criteria for model comparison purposes only, and do not provide diagnostic information about the model fit. A nonparametric estimate of the entropy of the data-generating distribution is needed for assessing the model fit. Several entropy estimates are available and have been used for frequentist inference about information fit indices. A few entropy-based fit indices have been suggested for Bayesian inference. This paper develops a class of entropy estimates and provides a procedure for Bayesian inference on the entropy and a fit index. For the continuous case, we define a quantized entropy that approximates and converges to the entropy integral. The quantized entropy includes some well-known measures of sample entropy and the existing Bayes entropy estimates as its special cases. For inference about the fit, we use the candidate model as the expected distribution in the Dirichlet process prior and derive the posterior mean of the quantized entropy as the Bayes estimate. The maximum entropy characterization of the candidate model is then used to derive the prior and posterior distributions for the Kullback-Leibler information index of fit. The consistency of the proposed Bayes estimates for the entropy and for the information index is shown. As byproducts, the procedure also produces priors and posteriors for the model parameters and the moments.
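As a concrete anchor for the quantized-entropy idea, here is a minimal histogram-based (binned) approximation to the differential entropy of a continuous distribution, which converges to the entropy integral as bins refine and data grow. This is an illustrative sketch of the general quantization idea, not the paper's Dirichlet-process Bayes estimator; the bin count is an assumed tuning choice.

```python
import numpy as np

def quantized_entropy(x, n_bins=30):
    """Binned approximation to the differential entropy of the sample x:
    H ~= -sum_k p_k * log(p_k / w_k), where p_k is the empirical bin
    probability and w_k the bin width (empty bins contribute zero)."""
    counts, edges = np.histogram(x, bins=n_bins)
    widths = np.diff(edges)
    p = counts / counts.sum()
    mask = p > 0
    return -np.sum(p[mask] * np.log(p[mask] / widths[mask]))

# A standard normal has differential entropy 0.5 * log(2*pi*e) ~= 1.4189 nats,
# so the binned estimate on a large sample should land close to that value.
rng = np.random.default_rng(0)
h_hat = quantized_entropy(rng.standard_normal(200000), n_bins=60)
```

The known closed-form entropy of the normal distribution gives a quick sanity check on the estimator's bias at a given sample size and bin resolution.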
Information Optimality and Bayesian Modeling
Abstract
Cited by 2 (0 self)
The general approach of treating a statistical problem as one of information processing led to the Bayesian method of moments, reference priors, minimal information likelihoods, and stochastic complexity. These techniques rest on quantities that have physical interpretations from information theory. Current work includes: the role of prediction, the emergence of data-dependent priors, the role of information measures in model selection, and the use of conditional mutual information to incorporate partial information. Key words: entropy, Bayesian method of moments, reference priors, stochastic complexity, data-dependent priors
Bayesian Hypothesis Testing in Latent Variable Models
, 2010
Abstract
Cited by 1 (1 self)
Hypothesis testing using Bayes factors (BFs) is known to suffer from several problems in the context of latent variable models. The first problem is computational. Another problem is that BFs are not well defined under improper priors. In this paper, a new Bayesian method, based on decision theory and the EM algorithm, is introduced to test a point hypothesis in latent variable models. The new statistic is a byproduct of the Bayesian MCMC output and, hence, easy to compute. It is shown that the new statistic is appropriately defined under improper priors because the method employs a continuous loss function. The finite-sample properties are examined using simulated data. The method is also illustrated in the context of a one-factor asset pricing model and a stochastic volatility model with jumps using real data.
Information Measures in Perspective
, 2010
Abstract
Cited by 1 (0 self)
Information-theoretic methodologies are increasingly being used in various disciplines. Frequently an information measure is adapted for a problem, yet the perspective of information as the unifying notion is overlooked. We set forth this perspective through presenting information-theoretic methodologies for a set of problems in probability and statistics. Our focal measures are Shannon entropy and Kullback-Leibler information. The background topics for these measures include notions of uncertainty and information, their axiomatic foundation, interpretations, properties, and generalizations. Topics with broad methodological applications include discrepancy between distributions, derivation of probability models, dependence between variables, and Bayesian analysis. More specific methodological topics include model selection, limiting distributions, optimal prior distribution and design of experiments, modeling duration variables, order statistics, data disclosure, and relative importance of predictors. Illustrations range from very basic to highly technical ones that draw attention to subtle points.
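The two focal measures named in this abstract have short concrete forms in the discrete case. The following sketch states them directly; the discrete setting and the natural-log (nats) convention are assumptions for illustration:

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_i p_i * log(p_i), in nats.
    Zero-probability outcomes contribute nothing (0 * log 0 := 0)."""
    p = np.asarray(p, dtype=float)
    nz = p > 0
    return -np.sum(p[nz] * np.log(p[nz]))

def kl_information(p, q):
    """Kullback-Leibler information K(p:q) = sum_i p_i * log(p_i / q_i),
    defined when q_i > 0 wherever p_i > 0; zero iff p == q."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    nz = p > 0
    return np.sum(p[nz] * np.log(p[nz] / q[nz]))
```

For example, a uniform distribution over four outcomes has entropy log 4, and the Kullback-Leibler information of any distribution from itself is zero, which matches the role of K(p:q) as a discrepancy between distributions.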
Theoretic Approach
, 2006
Abstract
The author(s) shown below used Federal funds provided by the U.S. Department of Justice and prepared the following final report: