Results 1-10 of 122
Simultaneous Inference in General Parametric Models
2008
Cited by 211 (6 self)

Abstract: Simultaneous inference is a common problem in many areas of application. If multiple null hypotheses are tested simultaneously, the probability of erroneously rejecting at least one of them increases beyond the prespecified significance level. Simultaneous inference procedures that adjust for multiplicity, and thus control the overall type I error rate, must therefore be used. In this paper we describe simultaneous inference procedures in general parametric models, where the experimental questions are specified through a linear combination of elemental model parameters. The framework described here is quite general and extends the canonical theory of multiple comparison procedures in ANOVA models to linear regression problems, generalized linear models, linear mixed-effects models, the Cox model, robust linear models, etc. Several examples using a variety of different statistical models illustrate the breadth of the results. For the analyses we use the R add-on package multcomp, which provides a convenient interface to the general approach adopted here. Key words: multiple tests, multiple comparisons, simultaneous confidence intervals, adjusted p-values, multivariate normal distribution, robust statistics.
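The multiplicity problem this abstract describes can be illustrated with the two simplest adjustment procedures. This is a pure-Python sketch of Bonferroni and Holm adjustments, not the multivariate-normal-based adjustment that multcomp itself implements; the p-values are made up for illustration.

```python
# Toy illustration of multiplicity adjustment: Bonferroni and Holm
# adjusted p-values (simpler alternatives to the multivariate-normal
# quantile approach used by the R package multcomp).

def bonferroni(pvals):
    """Adjusted p-values: p_i * m, capped at 1."""
    m = len(pvals)
    return [min(1.0, p * m) for p in pvals]

def holm(pvals):
    """Step-down Holm adjustment: (m - rank) * p_(rank), made monotone.
    Uniformly no larger than Bonferroni, hence never less powerful."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    adj = [0.0] * m
    running_max = 0.0
    for rank, i in enumerate(order):
        running_max = max(running_max, (m - rank) * pvals[i])
        adj[i] = min(1.0, running_max)
    return adj

raw = [0.001, 0.01, 0.03, 0.2]  # hypothetical raw p-values
print(bonferroni(raw))  # [0.004, 0.04, 0.12, 0.8]
print(all(h <= b for h, b in zip(holm(raw), bonferroni(raw))))  # True
```

Raw p-values that cross the 0.05 line individually (here 0.03) can fail to do so after adjustment, which is exactly the "overall type I error rate" control the paper formalizes.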
Parallel Computation of Multivariate Normal Probabilities
"... We present methods for the computation of multivariate normal probabilities on parallel/ distributed systems. After a transformation of the initial integral, an approximation can be obtained using MonteCarlo or quasirandom methods. We propose a metaalgorithm for asynchronous sampling methods and d ..."
Abstract

Cited by 207 (9 self)
 Add to MetaCart
We present methods for the computation of multivariate normal probabilities on parallel/ distributed systems. After a transformation of the initial integral, an approximation can be obtained using MonteCarlo or quasirandom methods. We propose a metaalgorithm for asynchronous sampling methods and derive efficient parallel algorithms for the computation of MVN distribution functions, including a method based on randomized Korobov and Richtmyer sequences. Timing results of the implementations using the MPI parallel environment are given. 1 Introduction The computation of the multivariate normal distribution function F (a; b) = j\Sigmaj \Gamma 1 2 (2) \Gamma n 2 Z b a e \Gamma 1 2 x \Sigma \Gamma1 x dx: (1) often leads to computationalintensive integration problems. Here \Sigma is an n \Theta n symmetric positive definite covariance matrix; furthermore one of the limits in each integration variable may be infinite. Genz [5] performs a sequence of transformations resu...
Capacity bounds via duality with applications to multiple-antenna systems on flat-fading channels
IEEE Trans. Inform. Theory
2003
Cited by 147 (39 self)

Abstract: A general technique is proposed for the derivation of upper bounds on channel capacity. The technique is based on a dual expression for channel capacity in which the maximization (of mutual information) over distributions on the channel input alphabet is replaced with a minimization (of average relative entropy) over distributions on the channel output alphabet. Every choice of an output distribution, even if it is not the channel image of some input distribution, leads to an upper bound on mutual information. The proposed approach is used to study multi-antenna flat-fading channels with memory, where the realization of the fading process is unknown at the transmitter and unknown (or only partially known) at the receiver. It is demonstrated that, at high signal-to-noise ratio (SNR), the capacity of such channels typically grows only double-logarithmically in the SNR. This is in stark contrast to the case with perfect receiver side information, where capacity grows logarithmically in the SNR. To better understand this phenomenon ...
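The dual bound can be checked on a toy channel. The binary symmetric channel below is an illustrative choice, not an example from the paper: for any output law Q, capacity is upper-bounded by the worst-case relative entropy D(P(.|x) || Q), and picking Q uniform makes the bound tight for this channel.

```python
# Toy check of the dual capacity bound: C <= max_x D( P(.|x) || Q )
# for ANY output distribution Q.  For the binary symmetric channel with
# crossover p and Q uniform, the bound equals 1 - H2(p), the true capacity.
import math

def kl(p, q):
    """Relative entropy D(p || q) in bits."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def h2(p):
    """Binary entropy in bits."""
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

p = 0.1
channel = {0: [1 - p, p], 1: [p, 1 - p]}   # rows are P(y | x)
q_uniform = [0.5, 0.5]                     # candidate output law Q
bound = max(kl(row, q_uniform) for row in channel.values())
capacity = 1 - h2(p)
print(bound, capacity)  # both equal 1 - H2(0.1), about 0.531 bits
```

A poorly chosen Q still yields a valid (just looser) upper bound, which is the freedom the paper exploits for fading channels where the capacity-achieving output law is unknown.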
A note on pseudo-likelihood constructed from marginal densities
Biometrika
2004
"... marginal densities ..."
High-Resolution Random Mesh Algorithms for Creating a Probabilistic 3D Surface Atlas of the Human Brain
1996
Cited by 73 (19 self)

Abstract: This paper describes the design, implementation, and results of a technique for creating a three-dimensional (3D) probabilistic surface atlas of the human brain. We have developed, implemented, and tested a new 3D statistical method for assessing structural variations in a database of anatomic images. The algorithm enables the internal surface anatomy of new subjects to be analyzed at an extremely local level. The goal was to quantify subtle and distributed patterns of deviation from normal anatomy by automatically generating detailed probability maps of the anatomy of new subjects. Connected systems of parametric meshes were used to model the internal course of the following structures in both hemispheres: the parieto-occipital sulcus, the anterior and posterior rami of the calcarine sulcus, the cingulate and marginal sulci, and the supracallosal sulcus. These sulci penetrate sufficiently deeply into the brain to introduce an obvious topological decomposition of its volume architecture. A family of surface maps was constructed, encoding statistical properties of local anatomical variation within individual sulci. A probability space of random transformations, based on the theory of Gaussian random fields, was developed to reflect the observed variability in stereotaxic space of the connected system of anatomic surfaces. A complete system of probability density functions was computed, yielding confidence limits on surface variation. The ultimate goal of brain mapping is to provide a framework for integrating functional and anatomical data across many subjects and modalities. This task requires precise quantitative knowledge of the variations in geometry and location of intracerebral structures and critical functional interfaces. The surface mapping and probabilistic tec...
Modeling and Generating Random Vectors with Arbitrary Marginal Distributions and Correlation Matrix
1997
Cited by 70 (4 self)

Abstract: We describe a model for representing random vectors whose component random variables have arbitrary marginal distributions and correlation matrix, and we describe how to generate data based upon this model for use in a stochastic simulation. The central idea is to transform a multivariate normal random vector into the desired random vector, so we refer to these vectors as having a NORTA (NORmal To Anything) distribution. NORTA vectors are most useful when the marginal distributions of the component random variables are neither identical nor from the same family of distributions, and they are particularly valuable when the dimension of the random vector is greater than two. Several numerical examples are provided. Keywords: simulation, random vector, input modeling, correlation matrix, copulas.

1 Introduction
In many stochastic simulations, simple input models (independent and identically distributed sequences from standard probability distributions) are not faithful representations of th...
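The NORTA transformation itself fits in a few lines. The exponential(1) marginals and base correlation below are illustrative choices, and the correlation-matching (calibration) step that is the paper's real contribution is deliberately omitted: the achieved product-moment correlation of the output differs slightly from the base normal correlation.

```python
# Minimal NORTA sketch: correlated standard normals -> uniforms via the
# normal CDF -> target marginals via the inverse CDF (here exponential(1)).
import math
import random

def phi(z):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def norta_exponential(rho, n=50_000, seed=2):
    """Generate pairs with exponential(1) marginals induced by a base
    bivariate normal with correlation rho (uncalibrated sketch)."""
    rng = random.Random(seed)
    c21, c22 = rho, math.sqrt(1 - rho * rho)  # Cholesky factor
    out = []
    for _ in range(n):
        z1, z2 = rng.gauss(0, 1), rng.gauss(0, 1)
        u1, u2 = phi(z1), phi(c21 * z1 + c22 * z2)
        out.append((-math.log(1 - u1), -math.log(1 - u2)))  # inverse CDF
    return out

pairs = norta_exponential(0.7)
mean1 = sum(x for x, _ in pairs) / len(pairs)
print(round(mean1, 2))  # close to 1.0, the exponential(1) mean
```

Swapping the inverse CDF changes the marginal without touching the dependence mechanism, which is what makes NORTA convenient when marginals come from different families.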
Portfolio Value-at-Risk with Heavy-Tailed Risk Factors
Mathematical Finance 12
2002
Cited by 63 (2 self)

Abstract: This paper develops efficient methods for computing portfolio value-at-risk (VAR) when the underlying risk factors have a heavy-tailed distribution. In modeling heavy tails, we focus on multivariate t distributions and some extensions thereof. We develop two methods for VAR calculation that exploit a quadratic approximation to the portfolio loss, such as the delta-gamma approximation. In the first method, we derive the characteristic function of the quadratic approximation and then use numerical transform inversion to approximate the portfolio loss distribution. Because the quadratic approximation may not always yield accurate VAR estimates, we also develop a low-variance Monte Carlo method. This method uses the quadratic approximation to guide the selection of an effective importance sampling distribution that samples risk factors so that large losses occur more often. Variance is further reduced by combining the importance sampling with stratified sampling. Numerical results on a variety of test portfolios indicate that large variance reductions are typically obtained. Both methods developed in this paper overcome difficulties associated with VAR calculation with heavy-tailed risk factors. The Monte Carlo method also extends to the problem of estimating the conditional excess, sometimes known as the conditional VAR.
Numerical Computation of Rectangular Bivariate and Trivariate Normal and t Probabilities
Statistics and Computing
2004
Cited by 54 (1 self)

Abstract: Algorithms for the computation of bivariate and trivariate normal and t probabilities for rectangles are reviewed. The algorithms use numerical integration to approximate transformed probability distribution integrals. A generalization of Plackett's formula is derived for bivariate and trivariate t probabilities. New methods are described for the numerical computation of bivariate and trivariate t probabilities. Test results are provided, along with recommendations for the most efficient algorithms for single- and double-precision computations.
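Plackett's identity, which this paper generalizes, says the derivative of the bivariate normal CDF with respect to the correlation is the bivariate normal density at the corner point, so the rectangle probability reduces to a one-dimensional integral over the correlation. A minimal midpoint-rule sketch of that idea (not the paper's optimized algorithms):

```python
# Bivariate normal CDF via Plackett's identity:
# Phi2(h, k; rho) = Phi(h) * Phi(k) + integral_0^rho phi2(h, k; r) dr.
import math

def phi(z):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def bvn_density(h, k, r):
    """Standard bivariate normal density at (h, k) with correlation r."""
    s = 1.0 - r * r
    return math.exp(-(h * h - 2 * r * h * k + k * k) / (2 * s)) \
        / (2 * math.pi * math.sqrt(s))

def bvn_cdf(h, k, rho, steps=2000):
    """P(X < h, Y < k), midpoint rule on the 1-D correlation integral."""
    dr = rho / steps
    integral = sum(bvn_density(h, k, (i + 0.5) * dr) for i in range(steps)) * dr
    return phi(h) * phi(k) + integral

print(bvn_cdf(0.0, 0.0, 0.5))  # exact value is 1/4 + arcsin(0.5)/(2*pi) = 1/3
```

The integrand is smooth in r, so even this naive rule converges quickly; the paper's contribution is sharper quadrature choices and the analogous reduction for t probabilities.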
Frequency Assignment Problems
Handbook of Combinatorial Optimization
1999
Cited by 42 (3 self)

Abstract: The ever-growing number of wireless communications systems deployed around the globe has made the optimal assignment of a limited radio frequency spectrum a problem of primary importance. At issue are planning models for permanent spectrum allocation, licensing, regulation, and network design. Further at issue are online algorithms for dynamically assigning frequencies to users within an established network. Applications include aeronautical mobile, land mobile, maritime mobile, broadcast, land fixed (point-to-point), and satellite systems. This paper surveys research conducted by theoreticians, engineers, and computer scientists regarding the frequency assignment problem (FAP) in all of its guises. The paper begins by defining some of the more common types of FAPs. It continues with a discussion of measures of optimality relating to the use of spectrum, models of interference, and mathematical representations of the many FAPs, both in graph-theoretic terms and as mathematical pro...
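One of the common FAP formulations the survey covers treats interference as a graph and channels as colors. A greedy descending-degree heuristic sketches the idea; the station names and interference graph below are invented for illustration, and real FAPs add separation constraints beyond simple inequality.

```python
# Greedy graph-coloring sketch of a frequency assignment problem:
# stations are vertices, interference pairs are edges, channels are colors.

def greedy_assign(interference):
    """interference: dict station -> set of conflicting stations.
    Returns station -> smallest channel (0, 1, 2, ...) unused by any
    already-assigned neighbor, visiting high-degree stations first."""
    channel = {}
    for station in sorted(interference, key=lambda s: -len(interference[s])):
        used = {channel[nbr] for nbr in interference[station] if nbr in channel}
        c = 0
        while c in used:
            c += 1
        channel[station] = c
    return channel

graph = {
    "A": {"B", "C"}, "B": {"A", "C", "D"},
    "C": {"A", "B", "D"}, "D": {"B", "C"},
}
plan = greedy_assign(graph)
print(plan)  # no two interfering stations share a channel
```

Minimizing the number of channels used is NP-hard in general (it is graph coloring), which is why the survey spends so much space on bounds, heuristics, and special graph classes.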