Results 1–10 of 12
Locally Efficient Semiparametric Estimators for Generalized Skew-Elliptical Distributions
 J. Am. Statist. Assoc.
, 2005
Cited by 8 (7 self)
We consider a class of generalized skew-normal distributions that is useful for selection modeling and robustness analysis and derive a class of semiparametric estimators for the location and scale parameters of the central part of the model. We show that these estimators are consistent and asymptotically normal. We present the semiparametric efficiency bound and derive the locally efficient estimator that achieves this bound if the model for the skewing function is correctly specified. The estimators that we propose are consistent and asymptotically normal even if the model for the skewing function is misspecified, and we compute the loss of efficiency in such cases. We conduct a simulation study and provide an illustrative example. Our method is applicable to generalized skew-elliptical distributions.
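The generalized skew-normal family above extends Azzalini's skew-normal, whose selection representation is easy to simulate. The sketch below (an illustration of the selection mechanism, not the authors' semiparametric estimator) samples the density 2φ(x)Φ(αx) by keeping a normal draw only when a second, independent normal falls below αx; the skewness parameter α is an arbitrary choice here.

```python
import numpy as np

rng = np.random.default_rng(0)
alpha = 3.0  # skewness parameter (hypothetical choice)

# Selection representation: keep a standard-normal draw X only when an
# independent standard-normal U satisfies U < alpha * X.  The retained
# draws have the skew-normal density 2 * phi(x) * Phi(alpha * x).
x = rng.standard_normal(200_000)
u = rng.standard_normal(200_000)
samples = x[u < alpha * x]

# Compare the sample mean with the known skew-normal mean,
# delta * sqrt(2 / pi) where delta = alpha / sqrt(1 + alpha^2).
delta = alpha / np.sqrt(1 + alpha**2)
print(samples.mean(), delta * np.sqrt(2 / np.pi))
```

About half the draws are rejected, which is the price of the selection construction; the retained half matches the theoretical skew-normal moments closely.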
Nonparametric Classes of Weight Functions to Model Publication Bias
, 1995
Cited by 5 (0 self)
This paper addresses the use of weight functions to model publication bias in meta-analysis. Since this bias is hard to gauge, we introduce a nonparametric "contamination" class of weight functions. We then illustrate how to explore sensitivity of conclusions to the specification of the weight function by examining the range of results for the entire class. We find lower bounds on the coverage of confidence intervals. If no publication bias is present, results are robust even when considered over the entire "contamination" class. However, if publication bias is present, then the coverage provided by the usual interval estimator is not robust. In this case, an alternative interval estimator is suggested. We also illustrate how both upper and lower bounds on posterior quantities of interest may be found for the case in which prior information is available. Some key words: Weight functions; Selection bias; Meta-analysis.
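A simple step weight function, a hypothetical choice rather than the paper's contamination class, is enough to see how publication bias distorts the published record: significant studies are always published, non-significant ones only sometimes.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical setup: 10,000 studies each estimate a true effect of 0
# with standard error 1.  A step weight function models publication:
# studies significant at the one-sided 5% level (z > 1.645) are always
# published; the rest are published with probability beta = 0.2.
z = rng.standard_normal(10_000)
beta = 0.2
published = (z > 1.645) | (rng.random(10_000) < beta)

# The true mean effect is 0, but the published studies overstate it.
print("published mean effect:", z[published].mean())
```

Averaging only the published studies yields a clearly positive estimate even though every study measures a null effect, which is exactly the sensitivity the weight-function framework is built to probe.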
Discovery Sampling And Selection Models
 In Decision Theory and Related Topics
, 1994
Cited by 5 (2 self)
Various aspects of Bayesian inference in selection and size-biased sampling problems are presented, beginning with discussion of general problems of inference in infinite and finite populations subject to selection sampling. Estimation of the size of finite populations and inference about superpopulation distributions when sampling is apparently informative is then developed in two specific problems. The first is a simple example of truncated data analysis, and some details of simulation-based Bayesian analysis are presented. The second concerns discovery sampling, in which units of a finite population are selected with probabilities proportional to some measure of size. A well-known area of application is in the discovery of oil reserves, and some recently published data from this area are analysed here. Solutions to the computational problems arising are developed using iterative simulation methods. Finally, some comments are made on extensions, including multiparameter superpopulation...
Bayesian Computational Approaches to Model Selection
, 2000
Cited by 4 (1 self)
The aim of this paper was to provide a summary of the state-of-the-art theory on Bayesian model selection and the application of MCMC algorithms. It has been shown how applications of considerable complexity can be handled successfully within this framework. Several methods for dealing with the use of default, improper priors in the Bayesian model selection framework have been shown. Special care has been taken to pinpoint the subtleties of jumping from one parameter space to another, and in general, to show the construction of MCMC samplers in such scenarios. The focus in the paper was on the reversible jump MCMC algorithm, as this is the most widely used of all existing methods; it is easy to use, flexible, and has nice properties. Many references have been cited, with the emphasis being given to articles with signal processing applications.
Nonparametric Bayesian Data Analysis
Cited by 3 (0 self)
We review the current state of nonparametric Bayesian inference. The discussion follows a list of important statistical inference problems, including density estimation, regression, survival analysis, hierarchical models and model validation. For each inference problem we review relevant nonparametric Bayesian models and approaches including Dirichlet process (DP) models and variations, Polya trees, wavelet-based models, neural network models, spline regression, CART, dependent DP models, and model validation with DP and Polya tree extensions of parametric models.
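One workhorse in the review above is the Dirichlet process. Its random weights have a simple stick-breaking construction that can be sketched directly; the concentration parameter and truncation level below are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(2)

def stick_breaking_weights(alpha, n_atoms, rng):
    """Weights of a truncated Dirichlet process draw via stick-breaking:
    v_k ~ Beta(1, alpha),  w_k = v_k * prod_{j<k} (1 - v_j)."""
    v = rng.beta(1.0, alpha, size=n_atoms)
    # Length of stick remaining before each break: 1, (1-v_1), ...
    remaining = np.concatenate([[1.0], np.cumprod(1.0 - v[:-1])])
    return v * remaining

w = stick_breaking_weights(alpha=2.0, n_atoms=500, rng=rng)
print(w.sum())  # approaches 1 as the truncation level grows
```

Pairing each weight with an atom drawn from a base measure gives a draw from the (truncated) DP; larger `alpha` spreads mass over more atoms.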
Inference in Successive Sampling Discovery Models
, 1996
Cited by 1 (1 self)
A variety of practical problems of finite population inference can be addressed in the framework of successive sampling discovery models: population units are assumed drawn from a superpopulation distribution and then successively sampled according to a specified `size-biased' selection mechanism. Formal statistical analysis of discovery data under such models is technically challenging, as exemplified by the likelihood analyses of Nair and Wang (1989). Assessment of uncertainties about superpopulation parameters and, more critically, appropriate forms of predictive inference for the unsampled units in the finite population, are open issues that are addressed here from a Bayesian perspective. Motivated by the likelihood analysis of Nair and Wang (1989), we develop a formal Bayesian approach to analysis in the same class of models; we show how simulation methods provide for the computation of required posterior and predictive distributions of relevance. We further develop model extens...
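The size-biased successive sampling mechanism itself is straightforward to simulate. This sketch (hypothetical lognormal sizes, not the Nair and Wang data) draws units one at a time without replacement with probability proportional to size, reproducing the familiar pattern that the largest units tend to be discovered first.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical finite population of 1,000 unit sizes (e.g. field volumes).
sizes = rng.lognormal(mean=0.0, sigma=1.0, size=1_000)

def successive_sample(sizes, n_draws, rng):
    """Draw units one at a time, without replacement, each time with
    probability proportional to size: the size-biased discovery model."""
    remaining = np.arange(len(sizes))
    order = []
    for _ in range(n_draws):
        p = sizes[remaining] / sizes[remaining].sum()
        pick = rng.choice(len(remaining), p=p)
        order.append(remaining[pick])
        remaining = np.delete(remaining, pick)
    return np.array(order)

order = successive_sample(sizes, n_draws=100, rng=rng)
# Early discoveries are larger on average than the population as a whole.
print(sizes[order[:50]].mean(), sizes.mean())
```

This forward simulation is the easy direction; the inferential problem the paper addresses is the reverse one, learning about the undiscovered units from the discovery record.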
Fisher Information in Weighted Distributions
, 1998
Cited by 1 (0 self)
Standard inference procedures assume a random sample from a population with density f_θ(x) for estimating the parameter θ. However, there are many applications in which the available data are a biased sample instead. Fisher modeled biased sampling using a weight function w(x) ≥ 0, and constructed a weighted distribution with a density f_θ^w(x) that is proportional to w(x) f_θ(x). In this paper, we assume that f_θ(x) belongs to an exponential family, and study the Fisher information about θ in observations obtained from some commonly arising weighted distributions: (i) the k-th order statistic of a random sample of size m, (ii) observations from the stationary distribution of the residual lifetime of a renewal process, and (iii) truncated distributions. We give general conditions under which the weighted distribution has greater Fisher information than the original distribution, and specialize to the normal, gamma, and Weibull distributions. These conditions involve the distribution...
Incorporating prior beliefs about selection bias into the analysis of randomized trials with missing outcomes
Bayesian Inference
The Bayesian interpretation of probability is one of two broad categories of interpretations. Bayesian inference updates knowledge about unknowns, parameters, with information from data. The LaplacesDemon package in R enables Bayesian inference, and this vignette provides an introduction to the topic. This article introduces Bayes' theorem, model-based Bayesian inference, components of Bayesian inference, prior distributions, hierarchical Bayes, conjugacy, likelihood, numerical approximation, prediction, Bayes factors, model fit, posterior predictive checks, and ends by comparing advantages and disadvantages of Bayesian inference.
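The conjugate beta-binomial update is the textbook instance of the Bayes' theorem machinery this vignette introduces, shown here in plain Python rather than the LaplacesDemon R package the vignette actually uses.

```python
# Bayes' theorem with a conjugate pair: a Beta(a, b) prior on a coin's
# heads probability, updated with binomial data, gives the posterior
# Beta(a + heads, b + tails).
a, b = 1.0, 1.0          # uniform prior
heads, tails = 7, 3      # hypothetical observed data

post_a, post_b = a + heads, b + tails
posterior_mean = post_a / (post_a + post_b)
print(posterior_mean)    # posterior mean = 8 / 12
```

Conjugacy makes the update a closed-form bookkeeping step; the numerical-approximation methods the vignette covers (MCMC and friends) exist for the many models where no such closed form is available.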
Presented at the
, 1999
It is human nature for “the affirmative or active to effect more than the negative or privative. So that a few times hitting, or presence, countervails ofttimes failing or absence.”