Results 1–10 of 51
Probabilistic Sensitivity Analysis of Complex Models: A Bayesian Approach
 Journal of the Royal Statistical Society, Series B
, 2002
Abstract

Cited by 45 (2 self)
this paper, we use the weak form of this prior, p( ; . This implies an infinite prior variance of (x), whereas in practice we expect there to be cases when the model developer can provide some proper prior knowledge about the function (·). We would not expect them to propose values for a, d, z and V in (14) directly, but suitable values can be found by asking the developer to estimate various percentiles of (x), and then finding a, d, z and V such that the implied percentiles through the Gaussian process model are similar. This process is described in detail in Oakley (2002).
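The percentile-matching idea described above can be illustrated in miniature. The sketch below is not the paper's actual procedure for choosing a, d, z and V; it shows only the simplest assumed setup: at a fixed input, a Gaussian process prior implies a normal distribution for the output, so an elicited median and 95th percentile determine its mean and standard deviation. All names and numbers are hypothetical.

```python
from statistics import NormalDist

def match_normal_to_percentiles(q50, q95):
    # Under a GP prior, the output at a fixed input is normally distributed;
    # choose its mean and sd so the implied percentiles match the elicited ones.
    z95 = NormalDist().inv_cdf(0.95)  # ~1.6449
    mean = q50                        # the median of a normal is its mean
    sd = (q95 - q50) / z95
    return mean, sd

# Hypothetical elicited values: median 10, 95th percentile 15
mean, sd = match_normal_to_percentiles(10.0, 15.0)
check = NormalDist(mean, sd).inv_cdf(0.95)  # recovers the elicited 95th percentile
```

In the full procedure several percentiles would be elicited at several inputs, and the hyperparameters chosen so that all implied percentiles are approximately reproduced.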
Bayesian Inference for Nonstationary Spatial Covariance Structure via Spatial Deformations
 Journal of the Royal Statistical Society, Series B
, 2000
Abstract

Cited by 44 (0 self)
In geostatistics it is common practice to assume that the underlying spatial process is stationary and isotropic, that is, the spatial distribution is unchanged when the origin of the index set is translated, and the process is stationary under rotations about the origin. However, in environmental problems it is not very realistic to make such assumptions, since local influences in the correlation structure of the spatial process may be clearly found in the data. This paper proposes a Bayesian model whose main aim is to address the anisotropy problem. Following Sampson and Guttorp (1992), we define the correlation function of the spatial process by reference to a latent space, denoted by D, where stationarity and isotropy hold. The space where the gauged monitoring sites lie is denoted by G. We adopt a Bayesian approach in which the mapping between G space and D space is represented by an unknown function d(·). A Gaussian process prior distribution is defined for d(·). ...
Portfolio Selection with Higher Moments
 Working Paper
, 2003
Abstract

Cited by 32 (4 self)
We propose a method for optimal portfolio selection using a Bayesian framework that addresses two major shortcomings of the Markowitz approach: the ability to handle higher moments and estimation error. We employ the skew normal distribution, which has many attractive features for modeling multivariate returns. Our results suggest that it is important to incorporate higher-order moments in portfolio selection. Further, our comparison with other methods, in which parameter uncertainty is either ignored or accommodated in an ad hoc way, shows that our approach leads to higher expected utility than the resampling methods that are common in the practice of finance.
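Why higher moments matter can be sketched with a simple moment-based objective; this is only an illustration, not the paper's Bayesian skew-normal method. Adding a third-moment (skewness) term to a mean-variance criterion penalizes an asset with occasional large losses, shifting the optimal weight. The two assets and all parameter values below are hypothetical.

```python
import random
from statistics import mean

random.seed(0)
n = 20000
# Hypothetical returns: asset A symmetric; asset B has a higher mean but is
# negatively skewed (occasional large losses).
ra = [random.gauss(0.05, 0.10) for _ in range(n)]
rb = [0.17 - random.expovariate(10.0) for _ in range(n)]  # mean 0.07, left-skewed

def utility(w, lam=4.0, gam=10.0):
    # Third-order expected-utility approximation:
    # mean - variance penalty + reward for positive third central moment.
    rp = [w * a + (1 - w) * b for a, b in zip(ra, rb)]
    m = mean(rp)
    v = mean((r - m) ** 2 for r in rp)
    s = mean((r - m) ** 3 for r in rp)
    return m - 0.5 * lam * v + (gam / 6) * s

# Grid search over the weight on asset A
best_w = max((i / 20 for i in range(21)), key=utility)
```

With the skewness term switched off (gam=0) the optimum moves toward the higher-mean but left-skewed asset, which is the qualitative point the abstract makes about ignoring higher moments.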
Statistical Methods for Eliciting Probability Distributions
 Journal of the American Statistical Association
, 2005
Abstract

Cited by 32 (1 self)
Elicitation is a key task for subjectivist Bayesians. While skeptics hold that it cannot (or perhaps should not) be done, in practice it brings statisticians closer to their clients and subject-matter-expert colleagues. This paper reviews the state of the art, reflecting the experience of statisticians informed by the fruits of a long line of psychological research into how people represent uncertain information cognitively, and how they respond to questions about that information. In a discussion of the elicitation process, the first issue to address is what it means for an elicitation to be successful, i.e. what criteria should be employed? Our answer is that a successful elicitation faithfully represents the opinion of the person being elicited. It is not necessarily “true” in some objectivistic sense, and cannot be judged that way. We see elicitation as simply part of the process of statistical modeling. Indeed, in a hierarchical model it is ambiguous at which point the likelihood ends and the prior begins. Thus the same kinds of judgment that inform statistical modeling in general also inform elicitation of prior distributions.
Deviance Information Criterion for Comparing Stochastic Volatility Models
 Journal of Business and Economic Statistics
, 2002
Abstract

Cited by 26 (7 self)
Bayesian methods have been efficient in estimating parameters of stochastic volatility models for analyzing financial time series. Recent advances have made it possible to fit stochastic volatility models of increasing complexity, including covariates, leverage effects, jump components and heavy-tailed distributions. However, a formal model comparison via Bayes factors remains difficult. The main objective of this paper is to demonstrate that model selection is more easily performed using the deviance information criterion (DIC). It combines a Bayesian measure of fit with a measure of model complexity. We illustrate the performance of the DIC in discriminating between various stochastic volatility models using simulated data and daily returns data on the S&P 100 index.
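The DIC mentioned here has a standard form: DIC = D̄ + pD, where D̄ is the posterior mean deviance and pD = D̄ − D(θ̄) is the effective number of parameters. A minimal sketch on a toy conjugate model (not a stochastic volatility model, which would require MCMC for the latent volatilities) shows the mechanics; with one free parameter, pD should come out close to 1.

```python
import math
import random
from statistics import mean

random.seed(1)
# Toy setting: y_i ~ N(theta, 1) with a flat prior on theta,
# so the posterior is N(ybar, 1/n).
y = [random.gauss(2.0, 1.0) for _ in range(50)]
n = len(y)
ybar = mean(y)

def deviance(theta):
    # D(theta) = -2 log p(y | theta) for the N(theta, 1) likelihood
    return sum((yi - theta) ** 2 + math.log(2 * math.pi) for yi in y)

# Posterior draws of theta
draws = [random.gauss(ybar, 1 / math.sqrt(n)) for _ in range(20000)]
dbar = mean(deviance(t) for t in draws)  # posterior mean deviance
pd = dbar - deviance(mean(draws))        # effective number of parameters
dic = dbar + pd                          # Bayesian fit + complexity penalty
```

Comparing models then amounts to computing DIC for each and preferring the smaller value, which is what the paper does across its stochastic volatility variants.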
A Discussion of Parameter and Model Uncertainty in Insurance
 Insurance: Mathematics and Economics
, 2000
Abstract

Cited by 22 (7 self)
In this paper we consider the process of modelling uncertainty. In particular we are concerned with making inferences about some quantity of interest which, at present, has been unobserved. Examples of such a quantity include the probability of ruin of a surplus process, the accumulation of an investment, the level of surplus or deficit in a pension fund and the future volume of new business in an insurance company. Uncertainty in this quantity of interest, y, arises from three sources:
- uncertainty due to the stochastic nature of a given model;
- uncertainty in the values of the parameters in a given model;
- uncertainty in the model underlying what we are able to observe and determining the quantity of interest.
It is common in actuarial science to find that the first source of uncertainty is the only one which receives rigorous attention. A limited amount of research in recent years has considered the effect of parameter uncertainty, while there is still considerable scope ...
Subjective Bayesian Analysis: Principles and Practice
 Bayesian Analysis
, 2006
Abstract

Cited by 20 (0 self)
We address the position of subjectivism within Bayesian statistics. We argue, first, that the subjectivist Bayes approach is the only feasible method for tackling many important practical problems. Second, we describe the essential role of the subjectivist approach in scientific analysis. Third, we consider possible modifications to the Bayesian approach from a subjectivist viewpoint. Finally, we address the issue of pragmatism in implementing the subjectivist approach.
Gibbs Sampling and Hill Climbing in Bayesian Factor Analysis
, 1998
Abstract

Cited by 13 (11 self)
Press and Shigemasu (1989) proposed a Bayesian factor analysis model. Factor scores, factor loadings, and disturbance variances and covariances were estimated in closed form using a large-sample approximation for one of the terms in the posterior distribution. This paper shows that by using Gibbs sampling or Lindley/Smith optimization approaches to estimation instead of the large-sample approximation, both of which are possible in this model, we can obtain improved point estimators in small samples.
Penalized Likelihood
 In Encyclopedia of Statistical Sciences, Update Volume 2
, 1996
Abstract

Cited by 13 (0 self)
this article. The scope for the application of penalized likelihood is greatest in nonparametric and semiparametric regression, interpreting the term very broadly, and such applications will be emphasised here. A brief discussion of application to density estimation will also be given. The emphasis in this article is on methodology, not theory; for careful and illuminating accounts of the asymptotic theory of penalized likelihood estimators, we refer the reader to Cox and O'Sullivan [3], and Gu and Qiu [10].
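The simplest concrete instance of penalized likelihood, with a Gaussian likelihood, is penalized least squares: one maximizes the log-likelihood minus a roughness or size penalty. The ridge-style sketch below, on hypothetical one-slope data, shows the shrinkage the penalty induces; the closed form b = Σxy / (Σx² + λ) follows directly from differentiating the penalized criterion.

```python
import random

random.seed(2)
# Hypothetical data: y = 2x + noise. The penalized criterion is
#   sum_i (y_i - b*x_i)^2 + lam * b^2,
# a Gaussian penalized likelihood whose minimizer shrinks b toward zero.
x = [random.gauss(0, 1) for _ in range(100)]
y = [2.0 * xi + random.gauss(0, 0.5) for xi in x]

def ridge_slope(lam):
    sxy = sum(xi * yi for xi, yi in zip(x, y))
    sxx = sum(xi * xi for xi in x)
    return sxy / (sxx + lam)  # closed-form penalized estimate

b_ols = ridge_slope(0.0)   # unpenalized maximum likelihood
b_pen = ridge_slope(50.0)  # penalized: shrunk toward zero
```

In nonparametric regression the same idea applies with b replaced by a function and the penalty by a roughness functional such as the integrated squared second derivative, but the trade-off between fit and penalty is identical.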
A Bayesian Statistical Analysis of the Enhanced Greenhouse Effect
Abstract

Cited by 12 (1 self)
This paper demonstrates that there is a robust statistical relationship between the records of the global mean surface air temperature and the atmospheric concentration of carbon dioxide over the period 1870–1991. As such, the enhanced greenhouse effect is a plausible explanation for the observed global warming. Long-term natural variability is another prime candidate for explaining the temperature rise of the last century. Analysis of natural variability from paleoreconstructions, however, shows that human activity is so much more likely an explanation that the earlier conclusion is not refuted. But even if one believes in large natural climatic variability, the odds are invariably in favour of the enhanced greenhouse effect. The above conclusions hold for a range of statistical models, including one that is capable of describing the stabilization of the global mean temperature from the 1940s to the 1970s. This model is also shown to be otherwise statistically adequate. The estimated climate sensitivity is about 3.8 °C with a standard deviation of 0.9 °C, but depends slightly on which model is preferred and how much natural variability is allowed. These estimates neglect, however, the fact that carbon dioxide is but one of a number of greenhouse gases and that sulphate aerosols may well have dampened warming. Acknowledging the fact that carbon dioxide is used as a proxy for all human-induced changes in radiative forcing brings a lot of ...
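The kind of relationship the abstract describes can be sketched with a toy regression: regressing temperature anomaly on log2 of the CO2 concentration makes the fitted slope the warming per doubling of CO2, i.e. the climate sensitivity. The data below are entirely synthetic (the true sensitivity is set to 3.8 for illustration); they are not the paper's records, and the paper's actual models are considerably richer.

```python
import math
import random
from statistics import mean

random.seed(3)
# Synthetic CO2 ramp (ppm) and a temperature anomaly series generated with
# a hypothetical true sensitivity of 3.8 degrees per doubling of CO2.
co2 = [290 + i for i in range(120)]
x = [math.log2(c / 290) for c in co2]          # doublings relative to baseline
temp = [3.8 * xi + random.gauss(0, 0.1) for xi in x]

# Ordinary least squares slope: estimated degrees of warming per CO2 doubling
xbar, tbar = mean(x), mean(temp)
slope = (sum((xi - xbar) * (ti - tbar) for xi, ti in zip(x, temp))
         / sum((xi - xbar) ** 2 for xi in x))
```

The recovered slope approximates the sensitivity built into the synthetic data; with real records, autocorrelated noise and other forcings make the inference far more delicate, which is the point of the paper's Bayesian treatment.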