GENERALIZED FIDUCIAL INFERENCE FOR NORMAL LINEAR MIXED MODELS
Abstract

Cited by 4 (0 self)
While linear mixed modeling methods are foundational concepts introduced in any statistical education, adequate general methods for interval estimation involving models with more than a few variance components are lacking, especially in the unbalanced setting. Generalized fiducial inference provides a possible framework that accommodates this absence of methodology. Within the framework of generalized fiducial inference, combined with sequential Monte Carlo methods, we present an approach to interval estimation for both balanced and unbalanced Gaussian linear mixed models. We compare the proposed method to classical and Bayesian results from the literature in a simulation study of two-fold nested models and two-factor crossed designs with an interaction term. The proposed method is found to be competitive or better when evaluated on the frequentist criteria of empirical coverage and average length of confidence intervals for small sample sizes. A MATLAB implementation of the proposed algorithm is available from the authors. 1. Introduction. Inference ...
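The abstract evaluates interval methods by empirical coverage. The authors' fiducial algorithm is not reproduced here; as a hedged, stdlib-only sketch of that evaluation criterion, the following checks the coverage of the classical exact interval for the error variance in a balanced one-way random-effects model. All parameter values and function names are illustrative assumptions, not the authors' MATLAB implementation.

```python
import random

random.seed(0)

def chi2_quantiles(df, lo=0.025, hi=0.975, nsim=20000):
    # Monte Carlo chi-square quantiles, to stay dependency-free (no scipy)
    draws = sorted(sum(random.gauss(0, 1) ** 2 for _ in range(df))
                   for _ in range(nsim))
    return draws[int(lo * (nsim - 1))], draws[int(hi * (nsim - 1))]

def coverage(a=5, n=4, sig_e=1.0, reps=1000):
    # Balanced one-way model y_ij = mu + alpha_i + e_ij.  The random group
    # effects alpha_i cancel out of within-group deviations, so only the
    # errors are simulated.  SSE / sig_e^2 ~ chi2 with a*(n-1) degrees of
    # freedom, so [SSE/q_hi, SSE/q_lo] is an exact 95% interval for sig_e^2.
    q_lo, q_hi = chi2_quantiles(a * (n - 1))
    hits = 0
    for _ in range(reps):
        sse = 0.0
        for _ in range(a):
            e = [random.gauss(0, sig_e) for _ in range(n)]
            ebar = sum(e) / n
            sse += sum((ej - ebar) ** 2 for ej in e)
        if sse / q_hi <= sig_e ** 2 <= sse / q_lo:
            hits += 1
    return hits / reps

print("empirical coverage:", coverage())
```

The empirical coverage should sit near the nominal 0.95; the abstract's point is that matching this nominal level becomes hard once several variance components enter, which is where the fiducial approach is proposed.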
Estimation in Dirichlet Random Effects Models
 Center for Applied Statistics, Washington University
, 2008
Sensitivity of the fractional Bayes factor to prior distributions
, 2000
Abstract

Cited by 2 (1 self)
The authors derive a measure of the sensitivity of the fractional Bayes factor, an index which is used to compare models when the priors for their respective parameters are improper, or when there is concern about the robustness of the prior specification. They prove that in a large class of problems, this measure is a decreasing function of the fraction of the sample used to update the prior distribution before the models are compared. 1. INTRODUCTION. Suppose we are comparing two models, M_1 and M_2, and let f_i(x | θ_i) and π_i^0 be respectively ...
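For orientation, the fractional Bayes factor referred to here is, in O'Hagan's (1995) standard formulation (a textbook statement included for context, not taken from this paper): with m_i(b) the fractional marginal likelihood under model M_i,

```latex
m_i(b) \;=\; \int f_i(x \mid \theta_i)^{\,b}\, \pi_i^{0}(\theta_i)\, d\theta_i ,
\qquad
B_{12}^{F}(b) \;=\; \frac{m_1(1)/m_1(b)}{m_2(1)/m_2(b)} ,
\qquad 0 < b < 1 .
```

The fraction b of the likelihood is used to update the (possibly improper) priors π_i^0 before comparison; the sensitivity measure studied in this paper is a function of that fraction b.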
Empirical Bayesian Test of the Smoothness
, 2007
Abstract

Cited by 2 (0 self)
Abstract—In the context of adaptive nonparametric curve estimation, a common assumption is that the function (signal) to be estimated belongs to a nested family of functional classes. These classes are often parametrized by a quantity representing the smoothness of the signal. It has long been recognized that the problem of estimating this smoothness is not well posed. What, then, can be inferred about the smoothness? This paper attempts to answer that question. We consider the implications of our results for hypothesis testing about the smoothness and for the smoothness classification problem. The test statistic is based on the empirical Bayes approach, i.e., it is the marginalized maximum likelihood estimator of the smoothness parameter for an appropriate prior distribution on the unknown signal. Key words: empirical Bayes approach, hypothesis testing, smoothness parameter, white noise model.
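As a hedged sketch of the marginalized-maximum-likelihood idea (not this paper's construction), one can work in a Gaussian sequence model: place a prior on the coefficients whose variance decays at a rate governed by a smoothness parameter alpha, and maximize the marginal likelihood of the data over alpha. All names and parameter values below are illustrative assumptions.

```python
import math
import random

random.seed(2)
n = 400           # number of sequence-model coefficients (illustrative)
alpha_true = 1.0  # smoothness of the simulated truth (illustrative)
sigma = 0.05      # noise level (illustrative)

# Gaussian sequence model x_j = theta_j + sigma * z_j, with a truth whose
# coefficients decay like j^{-(alpha_true + 1/2)}
theta = [j ** -(alpha_true + 0.5) for j in range(1, n + 1)]
x = [t + sigma * random.gauss(0, 1) for t in theta]

def neg_marginal_loglik(alpha):
    # Under the prior theta_j ~ N(0, j^{-(2*alpha + 1)}), marginally
    # x_j ~ N(0, j^{-(2*alpha + 1)} + sigma^2)
    ll = 0.0
    for j, xj in enumerate(x, start=1):
        v = j ** -(2 * alpha + 1) + sigma ** 2
        ll += -0.5 * (math.log(2 * math.pi * v) + xj * xj / v)
    return -ll

# Marginalized (empirical Bayes) maximum likelihood over a grid of alphas
grid = [0.1 * k for k in range(1, 31)]
alpha_hat = min(grid, key=neg_marginal_loglik)
print("empirical Bayes smoothness estimate:", round(alpha_hat, 2))
```

The abstract's caution applies: such an estimator is useful for testing and classification of smoothness, even though point estimation of the smoothness itself is not a well-posed problem.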
Convergence of Posterior Distribution in the Mixture of Regressions
, 2006
Abstract

Cited by 2 (0 self)
Mixture models provide a method of modeling a complex probability distribution in terms of simpler structures. In particular, the method of mixtures of regressions has received considerable attention due to its modeling flexibility and the availability of convenient computational algorithms. While the theoretical justification has been successfully worked out from the frequentist point of view, its Bayesian counterpart has not been fully investigated. This paper aims to contribute to the theoretical justification for the mixture of regressions model from the Bayesian perspective. In particular, we establish strong consistency of the posterior distribution and determine how fast the posterior distribution converges to the true value of the parameter in the context of mixtures of binary regressions, Poisson regressions, and Gaussian regressions.
A REVIEW OF CONSISTENCY AND CONVERGENCE OF POSTERIOR DISTRIBUTION
Abstract

Cited by 2 (0 self)
In this article, we review two important issues, namely consistency and convergence of the posterior distribution, that arise in Bayesian inference with large samples. Both parametric and nonparametric cases are considered. This review is aimed at nonspecialists: technical conditions (such as measurability) and mathematical expressions are avoided as far as possible, and no proofs of the results mentioned are given here; readers wanting more detail are encouraged to consult the references. The list of references is certainly not exhaustive, but further references may be found from the ones mentioned. A general reference on the topic of Bayesian asymptotics is Ghosh and Ramamoorthi [36]. In Bayesian analysis, one starts with prior knowledge (sometimes imprecise) expressed as a distribution on the parameter space and updates that knowledge, given the data, through the posterior distribution. It is therefore of utmost importance to know whether the updated knowledge becomes more and more accurate and precise as data ...
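The informal idea of the posterior becoming "more and more accurate and precise" as data accumulate can be seen in the simplest conjugate example. A hedged, stdlib-only sketch with illustrative numbers (Bernoulli data with a uniform Beta(1, 1) prior; not drawn from this review):

```python
import math
import random

random.seed(1)
p_true = 0.3  # illustrative true success probability

def posterior_summary(n):
    # Beta(1, 1) prior + n Bernoulli(p_true) observations gives the
    # conjugate posterior Beta(1 + s, 1 + n - s), where s = #successes
    s = sum(random.random() < p_true for _ in range(n))
    a, b = 1 + s, 1 + n - s
    mean = a / (a + b)
    sd = math.sqrt(a * b / ((a + b) ** 2 * (a + b + 1)))
    return mean, sd

for n in (10, 100, 10_000):
    mean, sd = posterior_summary(n)
    print(f"n={n:6d}  posterior mean={mean:.3f}  posterior sd={sd:.4f}")
```

As n grows, the posterior mean settles near p_true and the posterior standard deviation shrinks toward zero; consistency results of the kind this review surveys make that behavior precise, including for nonparametric models where it can fail.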
Proportional mean regression models for censored data
, 2003
Abstract

Cited by 1 (0 self)
A novel semiparametric regression model for censored data is proposed as an alternative to the widely used proportional hazards survival model. The proposed regression model for censored data turns out to be flexible and practically meaningful. Its features include a physical interpretation of the regression coefficients through the mean response time rather than the hazard function. It is shown that the regression model, obtained as a mixture of parametric families, has a proportional mean structure (as in accelerated failure time models). The statistical inference is based on a nonparametric Bayesian approach that uses a Dirichlet process prior for the mixing distribution. Consistency of the posterior distribution of the regression parameters in the Euclidean metric is established under certain conditions. Finite-sample parameter estimates, along with associated measures of uncertainty, can be computed by an MCMC method. Simulation studies are presented to provide empirical validation of the new method. Some real data examples are provided to show the easy applicability of the proposed method.
Stability and Approximation of Nonlinear Filters: an Information Theoretic Approach
, 2000
Abstract

Cited by 1 (0 self)
It has recently been proved by Clark, Ocone and Coumarbatch that the relative entropy (or Kullback–Leibler information distance) between two nonlinear filters with different initial conditions is a supermartingale, hence its expectation can only decrease with time. This result was obtained for a very general model, in which the unknown state and observation processes jointly form a continuous-time Markov process. The purpose of this paper is (i) to extend this result to a large class of f-divergences, including the total variation distance and the Hellinger distance in addition to the Kullback–Leibler information distance, and (ii) to consider robustness not only with respect to the initial condition of the filter, but also with respect to perturbation of the state generator. On the other hand, the model considered here is much less general, and consists of a diffusion process observed in discrete time. Keywords: nonlinear filtering, stability, relative entropy, Kullback–Leibler information, Hellinger ...
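For context, the f-divergences referred to share a common form (standard definitions, not specific to this paper): for a convex f with f(1) = 0,

```latex
D_f(\mu \,\|\, \nu) \;=\; \int f\!\left(\frac{d\mu}{d\nu}\right) d\nu ,
\qquad
f_{\mathrm{KL}}(t) = t \log t, \quad
f_{\mathrm{TV}}(t) = \tfrac{1}{2}\,|t - 1|, \quad
f_{\mathrm{H}}(t) = \bigl(\sqrt{t} - 1\bigr)^{2} ,
```

recovering the Kullback–Leibler distance, the total variation distance, and the squared Hellinger distance respectively. The supermartingale property extends across this class because f-divergences are jointly decreased by Markov transition kernels (the data-processing inequality).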
Sensitivity measures of the fractional Bayes factor
, 1996
Abstract

Cited by 1 (1 self)
Bayesian model comparison typically requires calculation of the Bayes factor. In recent years, several alternative Bayes factors have been introduced to address the problem of sensitivity to prior assumptions. Among these alternatives, the fractional Bayes factor makes an important contribution on the grounds of consistency, robustness and coherence. Sensitivity of the fractional Bayes factor is easy to assess when the prior distributions are proper. On the other hand, when the priors are improper, most methods lead to trivial answers. In this paper we derive a measure of the sensitivity of the fractional Bayes factor with respect to improper priors, and we illustrate a possible use of this measure for the selection of the fraction of the data to be used for training. 1. Introduction. Suppose we are comparing two models, M_1 and M_2, and let f_i(x | θ_i) and π_i^0 be respectively the distribution of the data and the prior distribution of the parameters θ_i under model M_i. ...