Results 1-10 of 50
Correlation and Dependence in Risk Management: Properties and Pitfalls
 Risk Management: Value at Risk and Beyond
, 1999
Cited by 195 (30 self)
Modern risk management calls for an understanding of stochastic dependence going beyond simple linear correlation. This paper deals with the static (non-time-dependent) case and emphasizes the copula representation of dependence for a random vector. Linear correlation is a natural dependence measure for multivariate normally and, more generally, elliptically distributed risks, but other dependence concepts like comonotonicity and rank correlation should also be understood by the risk management practitioner. Using counterexamples, the falsity of some commonly held views on correlation is demonstrated; in general, these fallacies arise from the naive assumption that dependence properties of the elliptical world also hold in the non-elliptical world. In particular, the problem of finding multivariate models which are consistent with prespecified marginal distributions and correlations is addressed. Pitfalls are highlighted and simulation algorithms avoiding these problems are constructed. ...
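The pitfall the abstract describes can be made concrete with a short simulation: linear (Pearson) correlation is not invariant under monotone transformations of the marginals, while rank correlation is. This is a sketch using NumPy; the lognormal transform, correlation level, and sample size are illustrative choices, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
rho = 0.8

# Sample from a bivariate normal with linear correlation rho.
cov = [[1.0, rho], [rho, 1.0]]
z = rng.multivariate_normal([0.0, 0.0], cov, size=100_000)

# Apply a monotone but non-linear transform to each marginal (lognormal).
x = np.exp(z)

def pearson(a, b):
    return np.corrcoef(a, b)[0, 1]

def spearman(a, b):
    # Rank correlation: Pearson correlation of the ranks.
    ra = np.argsort(np.argsort(a))
    rb = np.argsort(np.argsort(b))
    return np.corrcoef(ra, rb)[0, 1]

print(pearson(z[:, 0], z[:, 1]))   # close to 0.8
print(pearson(x[:, 0], x[:, 1]))   # noticeably below 0.8
print(spearman(x[:, 0], x[:, 1]))  # unchanged by the monotone transform
```

The linear correlation shrinks under the transform even though the copula (and hence the rank correlation) is exactly the same, which is one of the "commonly held views" the paper refutes.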
Understanding relationships using copulas
 North American Actuarial Journal
, 1998
Cited by 108 (0 self)
This article introduces actuaries to the concept of "copulas," a tool for understanding relationships among multivariate outcomes. A copula is a function that links univariate marginals to their full multivariate distribution. Copulas were introduced in 1959 in the context of probabilistic metric spaces. Recently, there has been a rapidly developing literature on the statistical properties and applications of copulas. This article explores some of these practical applications, including estimation of joint life mortality and multi-decrement models. In addition, we describe basic properties of copulas, their relationships to measures of dependence, and several families of copulas that have appeared in the literature. An annotated bibliography provides a resource for researchers and practitioners who wish to continue their study of copulas. This article will also be useful to those who wish to use copulas for statistical inference. Statistical inference procedures are illustrated using insurance company data on losses and expenses. For these data, we (1) show how to fit copulas and (2) describe their usefulness by pricing a reinsurance contract and estimating expenses for prespecified losses.
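To make "linking univariate marginals to their full multivariate distribution" concrete, the sketch below samples from a Clayton copula (chosen here because its conditional inverse has a closed form) and attaches exponential marginals. The parameter value, marginals, and scales are illustrative assumptions, not the loss/expense data from the article.

```python
import numpy as np

rng = np.random.default_rng(1)

def sample_clayton(theta, n, rng):
    """Sample n pairs (u, v) from a Clayton copula via conditional inversion."""
    u = rng.uniform(size=n)
    w = rng.uniform(size=n)
    v = ((w ** (-theta / (theta + 1.0)) - 1.0) * u ** (-theta) + 1.0) ** (-1.0 / theta)
    return u, v

# Kendall's tau for the Clayton family is theta / (theta + 2),
# so a target tau of 0.5 corresponds to theta = 2.
u, v = sample_clayton(2.0, 50_000, rng)

# The copula supplies only the dependence; arbitrary marginals are then
# attached through their inverse CDFs (exponential here, for example).
losses = -np.log(1.0 - u) * 10_000.0    # exponential, mean 10,000 (made-up scale)
expenses = -np.log(1.0 - v) * 1_500.0   # exponential, mean 1,500 (made-up scale)
print(np.corrcoef(losses, expenses)[0, 1])  # positive dependence
```

Swapping the two `-np.log(...)` lines for any other inverse CDFs changes the marginals without touching the dependence structure, which is exactly the separation the article exploits.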
Computer Experiments
, 1996
Cited by 67 (5 self)
Deterministic computer simulations of physical phenomena are becoming widely used in science and engineering. Computers are used to describe the flow of air over an airplane wing, combustion of gases in a flame, behavior of a metal structure under stress, safety of a nuclear reactor, and so on. Some of the most widely used computer models, and the ones that lead us to work in this area, arise in the design of the semiconductors used in the computers themselves. A process simulator starts with a data structure representing an unprocessed piece of silicon and simulates steps such as oxidation, etching, and ion injection that produce a semiconductor device such as a transistor. A device simulator takes a description of such a device and simulates the flow of current through it under varying conditions to determine properties of the device such as its switching speed and the critical voltage at which it switches. A circuit simulator takes a list of devices and the ...
Uncertainty analysis of climate change and policy response
 Climatic Change
, 2003
Cited by 51 (12 self)
To aid climate policy decisions, accurate quantitative descriptions of the uncertainty in climate outcomes under various possible policies are needed. Here, we apply an earth systems model to describe the uncertainty in climate projections under two different policy scenarios. This study illustrates an internally consistent uncertainty analysis of one climate assessment modeling framework, propagating uncertainties in both economic and climate components, and constraining climate parameter uncertainties based on observation. We find that in the absence of greenhouse gas emissions restrictions, there is a one in forty chance that global mean surface temperature change will exceed 4.9 °C by the year 2100. A policy case with aggressive emissions reductions over time lowers the temperature change to a one in forty chance of exceeding 3.2 °C, thus reducing but not eliminating the chance of substantial warming.
Latin Hypercube Sampling and the Propagation of Uncertainty in Analyses of Complex Systems
, 2002
Quantitative analysis of variability and uncertainty in emission estimation: An illustration of methods using mixture distributions. Paper No. 11
 Proceedings of the Annual Meeting of the Air & Waste Management Association
, 2001
Monte-Carlo-type techniques for processing interval uncertainty, and their potential engineering applications
 Reliable Computing
, 2007
Cited by 12 (6 self)
In engineering applications, we need to make decisions under uncertainty. Traditionally, engineering uses statistical methods, which assume that we know the probability distributions of the uncertain parameters. Usually, we can safely linearize the dependence of the desired quantities y (e.g., stress at different structural points) on the uncertain parameters x_i, thus enabling sensitivity analysis. Often, the number n of uncertain parameters is huge, so sensitivity analysis requires a lot of computation time. To speed up the processing, we propose to use special Monte-Carlo-type simulations.
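One "Monte-Carlo-type simulation" used for interval uncertainty in this literature is the Cauchy-deviate technique: if the model is approximately linear, pushing Cauchy-distributed perturbations (scaled by each input's interval radius) through it yields a Cauchy output whose scale equals the half-width of the output range. A minimal sketch, assuming a made-up linear toy model and interval radii:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy model, assumed roughly linear near x0 (a premise of the method).
def f(x):
    return 3.0 * x[0] - 2.0 * x[1] + 0.5 * x[2]

x0 = np.array([1.0, 1.0, 1.0])
radii = np.array([0.1, 0.05, 0.2])   # interval half-widths for each input

# If dx_i ~ Cauchy(0, radii[i]) and f is linear with coefficients c_i,
# then f(x0 + dx) - f(x0) ~ Cauchy(0, sum_i |c_i| * radii[i]), which is
# exactly the half-width of the interval range of f.
n = 20_000
delta_y = np.array([f(x0 + radii * rng.standard_cauchy(3)) - f(x0)
                    for _ in range(n)])

# For a Cauchy(0, D) variable, the median of |dy| equals D.
estimate = np.median(np.abs(delta_y))
exact = 3.0 * 0.1 + 2.0 * 0.05 + 0.5 * 0.2   # sum |c_i| * radii[i] = 0.5
print(estimate, exact)
```

The appeal is that the cost does not grow with the number of inputs: each sample perturbs all n inputs at once, rather than one finite difference per parameter.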
Chessboard Distributions and Random Vectors with Specified Marginals and Covariance Matrix
 Operations Research
, 2000
Cited by 11 (2 self)
There is a growing need for the ability to specify and generate correlated random variables as primitive inputs to stochastic models. Motivated by this need, several authors have explored the generation of random vectors with specified marginals, together with a specified covariance matrix, through the use of a transformation of a multivariate normal random vector. A covariance matrix is said to be feasible for a given set of marginal distributions if a random vector exists with these characteristics. We develop a computational approach for establishing whether a given covariance matrix is feasible for a given set of marginals. The approach is used to rigorously establish that there are sets of marginals with feasible covariance matrix that the normal transformation technique referred to above cannot match. An important feature of our analysis is that we show that for almost any covariance matrix (in a certain precise sense), our computational procedure either explicitly provides a c...
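The "normal transformation" technique the abstract refers to (often called NORTA) can be sketched as follows: a bivariate normal is pushed through the standard normal CDF and the inverse CDFs of the desired marginals, and the normal correlation that induces a prespecified output correlation is found numerically. The exponential(1) marginals, target of 0.5, and bisection search below are illustrative assumptions:

```python
import numpy as np
from math import erf

rng = np.random.default_rng(3)
# Standard normal CDF, vectorized over arrays.
phi = np.vectorize(lambda t: 0.5 * (1.0 + erf(t / 1.4142135623730951)))

def induced_corr(rho_z, n=40_000):
    """Correlation induced between exponential(1) marginals when a
    bivariate normal with correlation rho_z is transformed via
    U = Phi(Z), X = -log(1 - U)  (the NORTA transformation)."""
    cov = [[1.0, rho_z], [rho_z, 1.0]]
    z = rng.multivariate_normal([0.0, 0.0], cov, size=n)
    u = np.clip(phi(z), 1e-12, 1.0 - 1e-12)   # guard the log endpoints
    x = -np.log(1.0 - u)
    return np.corrcoef(x[:, 0], x[:, 1])[0, 1]

# Bisection on the normal correlation: the transformation can only
# shrink correlation, so the root lies at or above the 0.5 target.
target, lo, hi = 0.5, 0.0, 0.99
for _ in range(14):
    mid = 0.5 * (lo + hi)
    if induced_corr(mid) < target:
        lo = mid
    else:
        hi = mid
est = 0.5 * (lo + hi)
print(round(est, 3))   # typically a bit above 0.5
```

The paper's point is that for some marginal/covariance combinations no such root exists even though the covariance matrix itself is feasible, which is what its chessboard-distribution construction detects.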
Sensitivity in risk analyses with uncertain numbers
, 2006
Cited by 7 (0 self)
Sensitivity analysis is a study of how changes in the inputs to a model influence the results of the model. Many techniques have recently been proposed for use when the model is probabilistic. This report considers the related problem of sensitivity analysis when the model includes uncertain numbers that can involve both aleatory and epistemic uncertainty and the method of calculation is Dempster-Shafer evidence theory or probability bounds analysis. Some traditional methods for sensitivity analysis generalize directly for use with uncertain numbers, but, in some respects, sensitivity analysis for these analyses differs from traditional deterministic or probabilistic sensitivity analyses. A case study of a dike reliability assessment illustrates several methods of sensitivity analysis, including traditional probabilistic assessment, local derivatives, and a "pinching" strategy that hypothetically reduces the epistemic uncertainty or aleatory uncertainty, or both, in an input variable to estimate the reduction of uncertainty in the outputs. The prospects for applying the methods to black box models are also considered.
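The "pinching" strategy can be sketched on a toy model whose inputs carry only epistemic (interval) uncertainty: each input is pinched in turn to a point, and the resulting reduction in the output range is the sensitivity measure. The model, intervals, and vertex-enumeration shortcut below are illustrative assumptions, not the dike assessment from the report:

```python
from itertools import product

def model(a, b, c):
    return a * b + c ** 2

# Epistemic uncertainty: each input is known only to lie in an interval.
intervals = {"a": (1.0, 2.0), "b": (0.5, 1.5), "c": (0.0, 1.0)}

def width(iv):
    """Output range width via vertex enumeration -- exact here because
    the toy model is monotone in each input over these boxes."""
    ys = [model(*corner) for corner in product(*iv.values())]
    return max(ys) - min(ys)

base = width(intervals)
for name, (lo, hi) in intervals.items():
    mid = 0.5 * (lo + hi)
    pinched = {**intervals, name: (mid, mid)}   # pinch this input to a point
    reduction = 100.0 * (1.0 - width(pinched) / base)
    print(f"pinching {name}: {reduction:.1f}% reduction in output width")
```

The input whose pinching shrinks the output range the most is the best candidate for further data collection, which is the decision-support role the report assigns to this analysis.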