Results 1-10 of 77
Understanding relationships using copulas
North American Actuarial Journal, 1998
Abstract

Cited by 108 (0 self)
This article introduces actuaries to the concept of "copulas," a tool for understanding relationships among multivariate outcomes. A copula is a function that links univariate marginals to their full multivariate distribution. Copulas were introduced in 1959 in the context of probabilistic metric spaces. Recently, there has been a rapidly developing literature on the statistical properties and applications of copulas. This article explores some of these practical applications, including estimation of joint life mortality and multidecrement models. In addition, we describe basic properties of copulas, their relationships to measures of dependence, and several families of copulas that have appeared in the literature. An annotated bibliography provides a resource for researchers and practitioners who wish to continue their study of copulas. This article will also be useful to those who wish to use copulas for statistical inference. Statistical inference procedures are illustrated using insurance company data on losses and expenses. For these data, we (1) show how to fit copulas and (2) describe their usefulness by pricing a reinsurance contract and estimating expenses for prespecified losses.
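The mechanics of linking marginals through a copula can be sketched in a few lines. The example below is illustrative only and not taken from the article: it uses a Gaussian copula with a made-up correlation, an exponential "loss" marginal, and a gamma "expense" marginal (all parameter values hypothetical).

```python
import numpy as np
from scipy import stats

# Sketch: a Gaussian copula links two arbitrary marginals through a
# correlated bivariate normal. All parameter values are made up.
rng = np.random.default_rng(0)
rho = 0.7
cov = np.array([[1.0, rho], [rho, 1.0]])

z = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=10_000)
u = stats.norm.cdf(z)  # uniform marginals; the dependence is preserved

losses = stats.expon.ppf(u[:, 0], scale=1000.0)          # exponential marginal
expenses = stats.gamma.ppf(u[:, 1], a=2.0, scale=200.0)  # gamma marginal

print(np.corrcoef(losses, expenses)[0, 1])  # dependence survives the transform
```

Each coordinate is pushed through the normal CDF to the copula (uniform) scale and then through the inverse CDF of the desired marginal; the copula alone carries the dependence.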
An exact likelihood analysis of the multinomial probit model
1994
Abstract

Cited by 89 (4 self)
We develop new methods for conducting a finite-sample, likelihood-based analysis of the multinomial probit model. Using a variant of the Gibbs sampler, an algorithm is developed to draw from the exact posterior of the multinomial probit model with correlated errors. This approach avoids direct evaluation of the likelihood and, thus, avoids the problems associated with calculating choice probabilities that affect both the standard likelihood and method-of-simulated-moments approaches. Both simulated and actual consumer panel data are used to fit six-dimensional choice models. We also develop methods for analyzing random-coefficient and multiperiod probit models.
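The data-augmentation idea behind such a sampler is easiest to see in the binary-probit special case, sketched below (an Albert-and-Chib-style simplification, not the paper's multinomial algorithm; data and priors are synthetic): alternate between drawing latent utilities given the coefficients and coefficients given the latent utilities, so the likelihood is never evaluated.

```python
import numpy as np
from scipy import stats

# Binary-probit Gibbs sampler via data augmentation (illustrative sketch).
# Flat prior on beta; error variance fixed at 1. Synthetic data.
rng = np.random.default_rng(1)
n = 500
x = np.column_stack([np.ones(n), rng.normal(size=n)])
beta_true = np.array([0.5, -1.0])
y = (x @ beta_true + rng.normal(size=n) > 0).astype(float)

beta = np.zeros(2)
xtx_inv = np.linalg.inv(x.T @ x)
draws = []
for it in range(2000):
    mu = x @ beta
    u = rng.uniform(size=n)
    lo = stats.norm.cdf(-mu)                 # P(latent error < -mu)
    # inverse-CDF draw from the truncated normal:
    # latent z > 0 when y = 1, z < 0 when y = 0
    p = np.where(y == 1, lo + u * (1 - lo), u * lo)
    z = mu + stats.norm.ppf(np.clip(p, 1e-10, 1 - 1e-10))
    # beta | z is normal under the flat prior
    mean = xtx_inv @ (x.T @ z)
    beta = rng.multivariate_normal(mean, xtx_inv)
    if it >= 500:                            # discard burn-in
        draws.append(beta)

print(np.mean(draws, axis=0))                # posterior mean, near beta_true
```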
Clarify: Software for Interpreting and Presenting Statistical Results
Journal of Statistical Software, 2001
Abstract

Cited by 63 (0 self)
... and distribute this program provided that no charge is made and the copy is identical to the original. To request an exception, please contact Michael Tomz. ...
Modeling and Generating Random Vectors with Arbitrary Marginal Distributions and Correlation Matrix
1997
Abstract

Cited by 47 (3 self)
We describe a model for representing random vectors whose component random variables have arbitrary marginal distributions and correlation matrix, and describe how to generate data based upon this model for use in a stochastic simulation. The central idea is to transform a multivariate normal random vector into the desired random vector, so we refer to these vectors as having a NORTA (NORmal To Anything) distribution. NORTA vectors are most useful when the marginal distributions of the component random variables are neither identical nor from the same family of distributions, and they are particularly valuable when the dimension of the random vector is greater than two. Several numerical examples are provided.

Keywords: simulation, random vector, input modeling, correlation matrix, copulas

1 Introduction
In many stochastic simulations, simple input models (independent and identically distributed sequences from standard probability distributions) are not faithful representations of th...
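A minimal NORTA sketch, with hypothetical parameters: a correlated trivariate normal is transformed into a vector with Poisson, lognormal, and uniform marginals. Note that the normal-scale correlations are the tuning knobs; matching a target output correlation generally requires a numerical search, which this sketch omits.

```python
import numpy as np
from scipy import stats

# NORTA sketch: normal -> uniform -> anything. Parameters are made up.
rng = np.random.default_rng(2)
corr = np.array([[1.0, 0.6, 0.3],
                 [0.6, 1.0, 0.5],
                 [0.3, 0.5, 1.0]])
z = rng.multivariate_normal(np.zeros(3), corr, size=20_000)
u = stats.norm.cdf(z)                               # copula (uniform) scale

x = np.column_stack([
    stats.poisson.ppf(u[:, 0], mu=4.0),             # discrete marginal
    stats.lognorm.ppf(u[:, 1], s=0.5),              # heavy-tailed marginal
    stats.uniform.ppf(u[:, 2], loc=0, scale=10),    # bounded marginal
])
print(np.round(np.corrcoef(x.T), 2))                # output correlations
```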
Estimation of copula-based semiparametric time series models
J. Econometrics, 2006
Abstract

Cited by 35 (9 self)
This paper studies the estimation of a class of copula-based semiparametric stationary Markov models. These models are characterized by nonparametric invariant (or marginal) distributions and parametric copula functions that capture the temporal dependence of the processes; the implied transition distributions are all semiparametric. Models in this class are easy to simulate, and can be expressed as semiparametric regression transformation models. One advantage of this copula approach is to separate out the temporal dependence (such as tail dependence) from the marginal behavior (such as fat-tailedness) of a time series. We present conditions under which processes generated by models in this class are β-mixing; naturally, these conditions depend only on the copula specification. Simple estimators of the marginal distribution and the copula parameter are provided, and their asymptotic properties are established under easily verifiable conditions. Estimators of important features of the transition distribution, such as the (nonlinear) conditional moments and conditional quantiles, are easily obtained from estimators of the marginal distribution and the copula parameter; their √n-consistency and asymptotic normality can be obtained using the delta method. In addition, the semiparametric ...
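The "easy to simulate" claim can be illustrated for the Gaussian-copula special case: on the normal scale, the temporal dependence is simply an AR(1), while the invariant (marginal) distribution is chosen freely, here a fat-tailed Student-t (all parameters hypothetical).

```python
import numpy as np
from scipy import stats

# Copula-based stationary Markov process, Gaussian-copula special case.
# Temporal dependence (rho) and the marginal (t with 3 df) are set separately.
rng = np.random.default_rng(3)
rho, T = 0.8, 5_000
z = np.empty(T)
z[0] = rng.normal()                       # stationary N(0, 1) start
for t in range(1, T):
    z[t] = rho * z[t - 1] + np.sqrt(1 - rho**2) * rng.normal()

u = stats.norm.cdf(z)                     # copula scale: Uniform(0, 1) marginals
x = stats.t.ppf(u, df=3)                  # fat-tailed invariant distribution

print(np.corrcoef(z[:-1], z[1:])[0, 1])   # close to rho on the normal scale
```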
Input Modeling Tools For Complex Problems
1998
Abstract

Cited by 15 (2 self)
A simulation model is composed of inputs and logic; the inputs represent the uncertainty or randomness in the system, while the logic determines how the system reacts to the uncertain elements. Simple input models, consisting of independent and identically distributed sequences of random variates from standard probability distributions, are included in every commercial simulation language. Software to fit these distributions to data is also available. In this tutorial we describe input models that are useful when the input modeling problem is more complex.
Bayesian Reinforcement Learning in Continuous POMDPs with Application to Robot Navigation
Abstract

Cited by 13 (3 self)
We consider the problem of optimal control in continuous and partially observable environments when the parameters of the model are not known exactly. Partially Observable Markov Decision Processes (POMDPs) provide a rich mathematical model to handle such environments but require a known model to be solved by most approaches. This is a limitation in practice, as the exact model parameters are often difficult to specify. We adopt a Bayesian approach where a posterior distribution over the model parameters is maintained and updated through experience with the environment. We propose a particle filter algorithm to maintain the posterior distribution and an online planning algorithm, based on trajectory sampling, to plan the best action to perform under the current posterior. The resulting approach selects control actions which optimally trade off between 1) exploring the environment to learn the model, 2) identifying the system’s state, and 3) exploiting its knowledge in order to maximize long-term rewards. Our preliminary results on a simulated robot navigation problem show that our approach is able to learn good models of the sensors and actuators, and performs as well as if it had the true model.
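A toy version of maintaining a posterior over model parameters with a particle filter (all numbers hypothetical, and far simpler than the robot-navigation setting): each particle carries a hypothesis about an unknown sensor-noise parameter and is reweighted by the observation likelihood.

```python
import numpy as np

# Particle filter over an unknown sensor-noise parameter sigma (toy sketch).
# Each particle is a sigma hypothesis; weights track the posterior.
rng = np.random.default_rng(4)
true_sigma = 0.5
n_particles = 2000
sigmas = rng.uniform(0.1, 2.0, size=n_particles)   # prior over sigma
weights = np.full(n_particles, 1.0 / n_particles)

for _ in range(200):
    obs_error = rng.normal(0.0, true_sigma)        # observed minus true position
    lik = np.exp(-0.5 * (obs_error / sigmas) ** 2) / sigmas  # N(0, sigma) density
    weights *= lik
    weights /= weights.sum()
    # resample when the effective sample size collapses; jitter keeps diversity
    if 1.0 / np.sum(weights**2) < n_particles / 2:
        idx = rng.choice(n_particles, size=n_particles, p=weights)
        sigmas = np.clip(sigmas[idx] + rng.normal(0, 0.01, n_particles), 0.05, None)
        weights = np.full(n_particles, 1.0 / n_particles)

print(np.sum(weights * sigmas))                    # posterior mean, near true_sigma
```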
Automatic Random Variate Generation For Simulation Input
2000
Abstract

Cited by 11 (0 self)
We develop and evaluate algorithms for generating random variates for simulation input. One group, called automatic or black-box algorithms, can be used to sample from distributions with known density. They are based on the rejection principle. The hat function is generated automatically in a setup step using the idea of transformed density rejection: the density is transformed into a concave function, and the minimum of several tangents is used to construct the hat function. The resulting algorithms are not too complicated and are quite fast. The principle is also applicable to random vectors. A second group of algorithms is presented that generate random variates directly from a given sample by implicitly estimating the unknown distribution. The best of these algorithms are based on the idea of naive resampling plus added noise. These algorithms can be interpreted as sampling from a kernel density estimate. This method can also be applied to random vectors, where it can be interpreted as a mixture of naive resampling and sampling from the multinormal distribution that has the same covariance matrix as the data. The algorithms described in this paper have been implemented in ANSI C in a library called UNURAN, which is available via anonymous ftp.
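The "naive resampling plus added noise" idea can be sketched generically (this does not use the UNURAN library itself): drawing a data point at random and perturbing it with Gaussian noise is exactly sampling from a Gaussian kernel density estimate of the data.

```python
import numpy as np

# Resampling-plus-noise generator: equivalent to sampling from a Gaussian
# kernel density estimate of the observed data. Data here are synthetic.
rng = np.random.default_rng(5)
data = rng.gamma(shape=2.0, scale=1.0, size=1_000)  # pretend observed sample

# Silverman-style rule-of-thumb bandwidth
h = 1.06 * data.std() * len(data) ** (-1 / 5)

def sample_kde(n):
    picks = rng.choice(data, size=n, replace=True)   # naive resampling ...
    return picks + rng.normal(0.0, h, size=n)        # ... plus added noise

draws = sample_kde(10_000)
print(draws.mean(), data.mean())                     # smoothing preserves the mean
```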
Chessboard Distributions and Random Vectors with Specified Marginals and Covariance Matrix
Operations Research, 2000
Abstract

Cited by 11 (2 self)
There is a growing need for the ability to specify and generate correlated random variables as primitive inputs to stochastic models. Motivated by this need, several authors have explored the generation of random vectors with specified marginals, together with a specified covariance matrix, through the use of a transformation of a multivariate normal random vector. A covariance matrix is said to be feasible for a given set of marginal distributions if a random vector exists with these characteristics. We develop a computational approach for establishing whether a given covariance matrix is feasible for a given set of marginals. The approach is used to rigorously establish that there are sets of marginals with feasible covariance matrix that the normal transformation technique referred to above cannot match. An important feature of our analysis is that we show that for almost any covariance matrix (in a certain precise sense), our computational procedure either explicitly provides a c...
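The feasibility question can be illustrated in the bivariate case (a much weaker check than the paper's chessboard construction, which handles full matrices): for fixed marginals, the attainable correlation lies between the values achieved by counter- and comonotone coupling, estimated here by driving both inverse CDFs with the same, or with opposite, uniforms. The marginals below are hypothetical.

```python
import numpy as np
from scipy import stats

# Empirical Frechet-style correlation bounds for two fixed marginals
# (exponential and lognormal; an illustrative sketch, not the paper's method).
rng = np.random.default_rng(6)
u = np.clip(rng.random(100_000), 1e-12, 1 - 1e-12)  # avoid ppf at 0 or 1

a = stats.expon.ppf(u)                     # comonotone pair: same uniform driver
b = stats.lognorm.ppf(u, s=1.0)
rho_max = np.corrcoef(a, b)[0, 1]          # largest attainable correlation
rho_min = np.corrcoef(a, stats.lognorm.ppf(1 - u, s=1.0))[0, 1]  # smallest

print(rho_min, rho_max)  # any feasible correlation must lie in between
```

For skewed marginals like these, the bounds are strictly inside [-1, 1], which is one reason an arbitrary covariance matrix can be infeasible.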