Results 1–10 of 48
Assessment and Propagation of Model Uncertainty
, 1995
Abstract

Cited by 108 (0 self)
In this paper I discuss a Bayesian approach to solving this problem that has long been available in principle but is only now becoming routinely feasible, by virtue of recent computational advances, and examine its implementation in examples that involve forecasting the price of oil and estimating the chance of catastrophic failure of the U.S. Space Shuttle.
The Consistency of Posterior Distributions in Nonparametric Problems
 Ann. Statist.
, 1996
Abstract

Cited by 79 (4 self)
We give conditions that guarantee that the posterior probability of every Hellinger...
Multiscale Modeling and Estimation of Poisson Processes with Application to Photon-Limited Imaging
 IEEE TRANS. ON INFO. THEORY
, 1999
Abstract

Cited by 56 (10 self)
Many important problems in engineering and science are well modeled by Poisson processes. In many applications it is of great interest to accurately estimate the intensities underlying observed Poisson data. In particular, this work is motivated by photon-limited imaging problems. This paper studies a new Bayesian approach to Poisson intensity estimation based on the Haar wavelet transform. It is shown that the Haar transform provides a very natural and powerful framework for this problem. Using this framework, a novel multiscale Bayesian prior to model intensity functions is devised. The new prior leads to a simple Bayesian intensity estimation procedure. Furthermore, we characterize the correlation behavior of the new prior and show that it has 1/f spectral characteristics. The new framework is applied to photon-limited image estimation and its potential to improve nuclear medicine imaging is examined.
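The multiscale idea behind this line of work can be illustrated with a toy sketch (not the authors' code): on a dyadic grid, a parent count splits into two child counts, the child given the parent is Binomial, and a symmetric Beta(a, a) prior on the split probability shrinks each estimated split toward 1/2, smoothing the intensity where counts are sparse. The function name and the choice a = 1 are illustrative assumptions.

```python
# Illustrative sketch of multiscale Bayesian Poisson intensity estimation:
# recursively aggregate counts to coarser scales, then redistribute the
# coarse estimate using Beta-Binomial posterior-mean split probabilities.

def multiscale_poisson_estimate(counts, a=1.0):
    """Posterior-mean intensity estimate on a dyadic grid (length 2^J)."""
    n = len(counts)
    assert n & (n - 1) == 0, "length must be a power of two"
    if n == 1:
        return [float(counts[0])]
    # Aggregate pairs of bins to the next coarser scale.
    parents = [counts[2 * i] + counts[2 * i + 1] for i in range(n // 2)]
    coarse = multiscale_poisson_estimate(parents, a)
    est = []
    for i, parent in enumerate(parents):
        left = counts[2 * i]
        # Beta(a, a)-Binomial posterior mean of the left-split probability;
        # shrinks toward 1/2 when the parent count is small.
        p = (left + a) / (parent + 2 * a)
        est.append(coarse[i] * p)
        est.append(coarse[i] * (1.0 - p))
    return est
```

The estimate preserves the total count at every scale; a uniform observation is returned unchanged, while an isolated spike is pulled toward its neighbors.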
A Statistical Multiscale Framework for Poisson Inverse Problems
, 2000
Abstract

Cited by 40 (4 self)
This paper describes a statistical modeling and analysis method for linear inverse problems involving Poisson data based on a novel multiscale framework. The framework itself is founded upon a multiscale analysis associated with recursive partitioning of the underlying intensity, a corresponding multiscale factorization of the likelihood (induced by this analysis), and a choice of prior probability distribution made to match this factorization by modeling the "splits" in the underlying partition. The class of priors used here has the interesting feature that the "noninformative" member yields the traditional maximum likelihood solution; other choices are made to reflect prior belief as to the smoothness of the unknown intensity. Adopting the expectation-maximization (EM) algorithm for use in computing the MAP estimate corresponding to our model, we find that our model permits remarkably simple, closed-form expressions for the EM update equations. The behavior of our EM algorithm ...
Modeling Regression Error with a Mixture of Polya Trees
 Journal of the American Statistical Association
, 2001
Abstract

Cited by 17 (1 self)
We model the error distribution in the standard linear model as a mixture of absolutely continuous Polya trees constrained to have median zero. By considering a mixture, we smooth out the partitioning effects of a simple Polya tree, and the predictive error density has a derivative everywhere except zero. The error distribution is centered around a standard parametric family of distributions and may therefore be viewed as a generalization of standard models in which important, data-driven features, such as skewness and multimodality, are allowed. By marginalizing the Polya tree, exact inference is possible up to MCMC error.
Consistency issues in Bayesian Nonparametrics
 IN ASYMPTOTICS, NONPARAMETRICS AND TIME SERIES: A TRIBUTE
, 1998
Statistical notions of data disclosure avoidance and their relationship to traditional statistical methodology: Data swapping and log-linear models
 Proc. Bureau of the Census
, 1996
Abstract

Cited by 12 (3 self)
For most data releases, especially those from censuses, the U.S. Bureau of the Census has either released data at high levels of aggregation or applied a data disclosure avoidance procedure such as data swapping or cell suppression before preparing microdata or tables for release. In this paper, we present a general statistical characterization of the goal of a statistical agency in releasing confidential data subject to the application of disclosure avoidance procedures. We use this characterization to provide a framework for the study of data disclosure avoidance procedures for categorical variables. Consider a sample of n observations on p variables, which may be discrete or continuous. Our general characterization is in terms of the smoothing of a multidimensional empirical distribution function (an ordered version of the data) and sampling from it using bootstrap-like selection. Both the smoothing and the sampling introduce alterations to the data, and thus a bootstrap sample will not necessarily be the same as the original sample; this works to preserve the confidentiality of individuals providing the original data. Two obvious questions are: How well is confidentiality preserved by such a process? Have the smoothing and sampling disguised fundamental relationships among the p variables of interest to others who will work only with the altered data? Rubin (1993) has provided a closely related characterization and approach based on multiple imputation. We explain some of these ideas in greater detail in the context of categorical random variables and compare them to methods in current use for data disclosure avoidance, such as data swapping and cell suppression. We also relate this approach to data disclosure avoidance to the statistical analysis associated with the use of log-linear models for cross-classified categorical data.
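Data swapping, one of the disclosure avoidance procedures discussed above, can be sketched in a few lines. This is a toy illustration under simplifying assumptions, not the Census Bureau's production procedure; the function name and its parameters are invented for the example. Within groups of records that agree on the matching variables, a sensitive attribute is exchanged between random pairs, so every group's marginal distribution is preserved exactly while record-level linkage is disturbed.

```python
import random

# Toy data-swapping sketch for disclosure avoidance: records that agree on
# `match_keys` are paired at random, and each pair swaps its `swap_key`
# value with probability `rate`.

def swap_attribute(records, match_keys, swap_key, rate=0.5, seed=0):
    """Return a copy of `records` with `swap_key` values swapped in pairs."""
    rng = random.Random(seed)
    out = [dict(rec) for rec in records]
    groups = {}
    for i, rec in enumerate(records):
        groups.setdefault(tuple(rec[k] for k in match_keys), []).append(i)
    for indices in groups.values():
        rng.shuffle(indices)
        # Walk disjoint pairs within the group; swap with probability `rate`.
        for a, b in zip(indices[::2], indices[1::2]):
            if rng.random() < rate:
                out[a][swap_key], out[b][swap_key] = out[b][swap_key], out[a][swap_key]
    return out
```

Because swaps stay inside a matching group, any table built from the matching variables plus the swapped variable is unchanged; only joint distributions involving other variables are perturbed, which is exactly the trade-off the paper's framework characterizes.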
Consistent semiparametric Bayesian inference about a location parameter
, 1995
Abstract

Cited by 11 (5 self)
We consider the problem of Bayesian inference about the centre of symmetry of a symmetric density on the real line based on independent identically distributed observations. A result of Diaconis and Freedman shows that the posterior distribution of the location parameter may be inconsistent if a (symmetrized) Dirichlet process prior is used for the unknown distribution function. We choose a symmetrized Polya tree prior for the unknown density and independently choose θ according to a continuous and positive prior density on the real line. Suppose that the parameters of the Polya tree depend only on the level m of the tree and the common values r_m are such that ∑_{m=1}^∞ r_m^{-1/2} < ∞. Then it is shown that for a large class of true symmetric densities, including the trimodal distribution of Diaconis and Freedman, the marginal posterior of θ is consistent. AMS subject classification: Primary 62G20, 62F15. Key words: Consistency, Kullback-Leibler number, location parameter, Polya ...
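The summability condition on the Polya tree parameters can be written out with one concrete choice that satisfies it; the choice r_m = 8^m below is an illustrative example, not one taken from the paper, and it works because the resulting series is geometric:

```latex
\sum_{m=1}^{\infty} r_m^{-1/2} < \infty,
\qquad \text{e.g. } r_m = 8^{m}
\;\Rightarrow\;
\sum_{m=1}^{\infty} 8^{-m/2}
= \sum_{m=1}^{\infty} \Bigl(\tfrac{1}{\sqrt{8}}\Bigr)^{m} < \infty .
```

Rapidly growing r_m concentrate the Polya tree on smoother densities, which is what drives the consistency result.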
Bayesian semiparametric dynamic frailty models for multiple event time data
 Biometrics
, 2006
Abstract

Cited by 9 (6 self)
Many biomedical studies collect data on times of occurrence for a health event that can occur repeatedly, such as infection, hospitalization, recurrence of disease, or tumor onset. To analyze such data, it is necessary to account for within-subject dependency in the multiple event times. Motivated by data from studies of palpable tumors, this article proposes a dynamic frailty model and a Bayesian semiparametric approach to inference. The widely used shared frailty proportional hazards model is generalized to allow subject-specific frailties to change dynamically with age while also accommodating non-proportional hazards. Parametric assumptions on the frailty distribution are avoided by using Dirichlet process priors for a shared frailty and for multiplicative innovations on this frailty. By centering the semiparametric model on a conditionally conjugate dynamic gamma model, we facilitate posterior computation and lack-of-fit assessments of the parametric model. Our proposed method is demonstrated using data from a cancer chemoprevention study.
Nonparametric Modelling of Hierarchically Exchangeable Data
, 2003
Abstract

Cited by 9 (0 self)
Hierarchically exchangeable data are characterized by the exchangeability of a population of units and the exchangeability of observations from each individual unit. A flexible model for such data is the hierarchical logistic-normal model, which provides unconstrained sampling distributions at the within-unit level and an unconstrained covariance structure at the between-unit level. Also, the sampling distribution at the between-unit level is unimodal in a weak sense. Parameter estimation and inference for the hierarchical logistic-normal model is relatively straightforward via Markov chain Monte Carlo or an approximate EM algorithm. These and other features of the hierarchical logistic-normal model are explored, and the model is applied to the analysis of tumor locations in a mammalian population. A comparison is made to a similar data analysis based on Dirichlet distributions.