Results 1-10 of 26
Diagnostic Measures for Model Criticism
 JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION
, 1996
Abstract
Cited by 13 (1 self)
In this article we present the general outlook and discuss general families of elaborations for use in practice; the exponential connection elaboration plays a key role. We then describe model elaborations for use in diagnosing departures from normality, goodness of fit in generalized linear models, variable selection in regression, and outlier detection. We illustrate our approach with two applications.
Statistical ecology and environmental statistics for cost-effective ecological synthesis and environmental analysis
 In Modern Trends in Ecology and
, 1998
Abstract
Cited by 10 (10 self)
Ecology is undergoing some major changes in response to changing societal concerns coupled with remote-sensing information and computer technology. Both theoretical and applied ecology are making greater use of statistical thought processes and procedures, with advancing software and hardware, to satisfy public policy and research, variously incorporating sample survey data, intensive site-specific data, and remote-sensing image data. Statistical ecology and environmental statistics face numerous challenges and opportunities in the twenty-first century. This paper shares some highlights in statistical ecology, environmental statistics, and ecological assessment in this connection.
Modelling Time Series Count Data: An Autoregressive Conditional Poisson Model
, 2000
Abstract
Cited by 10 (0 self)
This paper introduces new models for time series count data. The Autoregressive Conditional Poisson model (ACP) makes it possible to deal with issues of discreteness, overdispersion (variance greater than the mean) and serial correlation. A fully parametric approach is taken and a marginal distribution for the counts is specified, where, conditional on past observations, the mean is autoregressive. This makes it possible to attain improved inference on coefficients of exogenous regressors relative to static Poisson regression, which is the main concern of the existing literature, while modeling the serial correlation in a flexible way. A variety of models, based on the double Poisson distribution of Efron (1986), is introduced, which in a first step introduce an additional dispersion parameter and in a second step make this dispersion parameter time-varying. All models are estimated using maximum likelihood, which makes the usual tests available. In this framework autocorrelation can be tested with a straightforward likelihood ratio test, whose simplicity is in sharp contrast with test procedures in the latent variable time series count model of Zeger (1988). The models are applied to the time series of monthly polio cases in the U.S. between 1970 and 1983, as well as to the daily number of $0.75 price-change durations on the IBM stock. A $0.75 price-change duration is defined as the time it takes the stock price to move by at least $0.75. The variable of interest is the daily number of such durations, which is a measure of intradaily volatility: the more volatile the stock price is within a day, the larger the counts will be. The ACP models provide good density forecasts of this measure of volatility.
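As a rough illustration of the ACP recursion summarized above, the following Python sketch simulates a count series whose conditional mean is autoregressive. The function name and the parameter values (omega, alpha, beta) are invented for this example and are not taken from the paper:

```python
import numpy as np

def simulate_acp(n, omega=0.5, alpha=0.3, beta=0.5, seed=0):
    """Simulate an ACP-style count series (illustrative sketch).

    Conditional on the past, y_t ~ Poisson(lam_t) with
        lam_t = omega + alpha * y_{t-1} + beta * lam_{t-1},
    so the counts are discrete, serially correlated and, as the
    abstract notes, overdispersed (variance greater than the mean).
    """
    rng = np.random.default_rng(seed)
    lam = omega / (1.0 - alpha - beta)   # start at the stationary mean
    y = np.empty(n, dtype=np.int64)
    for t in range(n):
        y[t] = rng.poisson(lam)
        lam = omega + alpha * y[t] + beta * lam
    return y

counts = simulate_acp(5000)
print(counts.mean(), counts.var())  # sample variance exceeds the mean
```

With these parameters the stationary mean is omega / (1 - alpha - beta) = 2.5, while the recursion inflates the variance above the mean, which is exactly the overdispersion the ACP model is designed to capture.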
Overdispersed Generalized Linear Models
 J. STATIST. PLANNING AND INFERENCE
, 1997
Abstract
Cited by 7 (3 self)
Generalized linear models have become a standard class of models for data analysts. However, in some applications, heterogeneity in samples is too great to be explained by the simple variance function implicit in such models. Utilizing a two-parameter exponential family which is overdispersed relative to a specified one-parameter exponential family enables the creation of classes of overdispersed generalized linear models (OGLMs) which are analytically attractive. We propose fitting such models within a Bayesian framework employing noninformative priors in order to let the data drive the inference. Hence our analysis approximates likelihood-based inference but with possibly more reliable estimates of variability for small sample sizes. Bayesian calculations are carried out using a Metropolis-within-Gibbs sampling algorithm. An illustrative example using a data set involving damage incidents to cargo ships is presented. Details of the data analysis are provided, including comparison with...
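A standard example of such a two-parameter family overdispersed relative to a one-parameter one is Efron's (1986) double Poisson, which also appears in the ACP paper above. The sketch below (the helper name, parameter values and brute-force normalization over a finite support are ours) evaluates its pmf numerically:

```python
import numpy as np
from math import lgamma, log

def double_poisson_pmf(mu, theta, ymax=400):
    """Pmf of Efron's (1986) double Poisson on 0..ymax,
    normalized numerically. The unnormalized density is
        sqrt(theta) * exp(-theta*mu) * (exp(-y) * y**y / y!) * (e*mu/y)**(theta*y),
    with mean approximately mu and variance approximately mu/theta,
    so theta < 1 gives overdispersion relative to Poisson (theta = 1).
    """
    def log_kernel(y):
        base = 0.5 * log(theta) - theta * mu
        if y == 0:
            return base                     # y^y and (e*mu/y)^(theta*y) -> 1
        return (base - y + y * log(y) - lgamma(y + 1)
                + theta * y * (1.0 + log(mu) - log(y)))

    logs = np.array([log_kernel(y) for y in range(ymax + 1)])
    p = np.exp(logs - logs.max())           # stabilized exponentiation
    return p / p.sum()

p = double_poisson_pmf(mu=4.0, theta=0.5)
ks = np.arange(p.size)
mean = (ks * p).sum()
var = ((ks - mean) ** 2 * p).sum()
print(mean, var)  # mean near mu = 4, variance near mu/theta = 8
```

The extra parameter theta decouples the variance from the mean while keeping the family in exponential-family form, which is what makes OGLMs analytically attractive.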
Univariate and Bivariate Loglinear Models for Discrete Test Score Distributions
, 2000
Abstract
Cited by 5 (0 self)
The well-developed theory of exponential families of distributions is applied to the problem of fitting the univariate histograms and discrete bivariate frequency distributions that often arise in the analysis of test scores. These models are powerful tools for many forms of parametric data smoothing and are particularly well-suited to problems in which there is little or no theory to guide a choice of probability models, e.g., smoothing a distribution to eliminate roughness and zero frequencies in order to equate scores from different tests. Attention is given to efficient computation of the maximum likelihood estimates of the parameters using Newton's method and to computationally efficient methods for obtaining the asymptotic standard errors of the fitted frequencies and proportions. We discuss tools that can be used to diagnose the quality of the fitted frequencies for both the univariate and the bivariate cases. Five examples, using real data, are used to illustrate the methods of this paper.
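The univariate fitting step the abstract describes, maximum likelihood for a log-linear model via Newton's method, can be sketched as follows. The polynomial basis, the toy histogram and the warm start are invented for illustration and are not the paper's exact parameterization:

```python
import numpy as np

def loglinear_smooth(freq, degree=2, n_iter=20):
    """Smooth a discrete score histogram with a polynomial log-linear
    model fitted by Newton's method (illustrative sketch).

    Model: cell probabilities p_j with log p_j proportional to
    sum_k beta_k * x_j**k; fitted frequencies are N * p_j.
    """
    freq = np.asarray(freq, dtype=float)
    N = freq.sum()
    x = np.arange(freq.size, dtype=float)
    x = (x - x.mean()) / x.std()                           # scale for stability
    B = np.vander(x, degree + 1, increasing=True)[:, 1:]   # drop intercept
    logs = np.log(freq + 0.5)                              # warm start from data
    beta = np.linalg.lstsq(B, logs - logs.mean(), rcond=None)[0]
    for _ in range(n_iter):
        eta = B @ beta
        p = np.exp(eta - eta.max())
        p /= p.sum()                                       # normalization absorbs intercept
        m = N * p                                          # fitted frequencies
        grad = B.T @ (freq - m)                            # score vector
        W = np.diag(m) - np.outer(m, m) / N                # multinomial information
        beta += np.linalg.solve(B.T @ W @ B, grad)         # Newton step
    return N * p, beta

# Smooth a rough, symmetric toy histogram with a log-quadratic model
fitted, beta = loglinear_smooth([1, 3, 8, 14, 18, 14, 8, 3, 1])
print(fitted.round(2))
```

Because the log-likelihood of a multinomial log-linear model is concave in beta, Newton's method converges rapidly, and the fitted frequencies are strictly positive, eliminating the zero cells that complicate test-score equating.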
Semiparametric estimation of mean and variance functions for non-Gaussian data
 Computational Statistics
, 1996
Abstract
Cited by 4 (1 self)
There are many approaches to flexible semiparametric estimation of a mean function in regression modelling. However, less attention has been devoted to flexible methods for simultaneously estimating both a mean and a variance function. Flexible modelling of the response variance in regression is interesting for understanding the causes of variability in the responses, and is crucial for efficient estimation and correct inference for mean parameters. In this paper we describe methods for mean and variance estimation where the responses are modelled using the double exponential family of distributions and mean and dispersion parameters are described as additive functions of predictors. The additive terms in the model are represented by penalized splines. We carry out inference in a Bayesian way, integrating out nuisance parameters so that selection of smoothing parameters is not required. A simple and unified computational methodology is presented for carrying out the calculations required for Bayesian inference in this class of models, based on an adaptive Metropolis algorithm. Application of the adaptive Metropolis algorithm is fully automatic and does not require any kind of pre-tuning runs. The methodology presented provides flexible methods for modelling heterogeneous Gaussian data, as well as overdispersed and underdispersed count data. Performance is considered in a variety of examples involving real and simulated data sets.
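A minimal one-dimensional sketch of an adaptive random-walk Metropolis sampler, in the spirit of the self-tuning algorithm the abstract describes; the Robbins-Monro scale update used here is a common textbook device, not necessarily the authors' exact scheme:

```python
import numpy as np

def adaptive_metropolis(logpost, x0, n_iter=20000, target=0.44, seed=0):
    """Random-walk Metropolis whose proposal scale adapts on the fly
    toward a target acceptance rate, so no pre-tuning runs are needed.

    The diminishing step size (i+1)**-0.6 makes the adaptation fade
    out, which is what preserves the correct stationary distribution.
    """
    rng = np.random.default_rng(seed)
    x, lp = float(x0), logpost(x0)
    log_scale = 0.0
    draws = np.empty(n_iter)
    for i in range(n_iter):
        prop = x + np.exp(log_scale) * rng.normal()
        lp_prop = logpost(prop)
        accept = np.log(rng.uniform()) < lp_prop - lp
        if accept:
            x, lp = prop, lp_prop
        # Robbins-Monro: enlarge the scale when accepting too often
        log_scale += (float(accept) - target) / (i + 1) ** 0.6
        draws[i] = x
    return draws

# Standard normal target; after burn-in the draws match mean 0, sd 1
samples = adaptive_metropolis(lambda z: -0.5 * z * z, 3.0)
print(samples[5000:].mean(), samples[5000:].std())
```

In the paper's setting the same idea is applied to the spline and dispersion coefficients; the point of the sketch is only the fully automatic tuning of the proposal scale.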
Bayesian Approaches for Overdispersion in Generalized Linear Models
, 1998
Abstract
Cited by 2 (0 self)
Generalized linear models (GLMs) have been routinely used in statistical data analysis. The evolution of these models, as well as details regarding model fitting, model checking and inference, is thoroughly documented in McCullagh and Nelder (1989). However, in many applications, heterogeneity in the observed samples is too large to be explained by the simple variance function which is implicit in GLMs. To overcome this, several parametric and nonparametric approaches for creating overdispersed generalized linear models (OGLMs) were developed. In this article, we summarize recent approaches to OGLMs, with special emphasis given to the Bayesian framework. We also discuss computational aspects of Bayesian model fitting, model determination and inference through examples.

1 Introduction
Generalized linear models (GLMs) are a standard class of models in contemporary statistical data analysis (McCullagh and Nelder 1989). The widely available GLIM software, as well as S-Plus, facilitates compu...
SUMMARY
, 1997
Abstract
Cited by 2 (0 self)
This is the second part of the numerical investigation of a low-emissions staged turbine combustor (STC) using a modified version of the KIVA-II code. The main focus of this study is the numerical analysis of the reacting fluid flow and heat transfer inside the quick-quench/lean-combustion (QQ/LC) zones. In the QQ zone, cool dilution air was injected into the hot mixture through eight 45° inclined slots with a momentum flux ratio of 60. The slot aspect ratio was 6 and the jet-to-mainstream mass flow rate ratio was 3. The inlet conditions of the QQ zone were obtained from the results of the rich combustion zone analysis described in Part I. A tension spline interpolation scheme was then used to interpolate the necessary information needed at the inlet. Conditions at the slot opening (dilution jet) were chosen to be closely related to those encountered in advanced combustion systems. The grid system needed for the numerical solutions was generated by a transfinite interpolation scheme. KIVA-II was further modified for the current study. Preliminary results illustrate some of the major features of the flow and temperature fields inside the QQ/LC zones. Formation of the co- and counter-rotating bulk flow and the sandwiched ring-shaped temperature field, typical of the confined inclined jet-in-crossflow, can be seen clearly and is consistent with experimental observations. The calculations of the mass-weighted standard deviation and the pattern factor of temperature revealed that the mixing performance of the STC combustor is very promising. The temperature of the fluid leaving the LC zone is very ...
Overdispersion Diagnostics for Generalized Linear Models
, 1993
Abstract
Cited by 1 (0 self)
Generalized linear models (GLMs) are simple, convenient models for count data, but they assume that the variance is a specified function of the mean. Although overdispersed GLMs allow more flexible mean-variance relationships, they are often not as simple to interpret nor as easy to fit as standard GLMs. This paper introduces a convexity plot, or C-plot for short, that detects overdispersion, and relative variance curves and relative variance tests that help to understand the nature of the overdispersion. Convexity plots sometimes detect overdispersion better than score tests, and relative variance curves and tests sometimes distinguish the source of the overdispersion better than score tests.

Keywords: mixture, random coefficient, residuals, score tests, variance inflation

1 INTRODUCTION
Convenient generalized linear models (GLMs) for count data, such as logistic regression and log-linear Poisson regression, require the variance to be a known function of the mean. But count data ...
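The score tests that the C-plot is compared against can be illustrated with a standard score-type statistic for Poisson overdispersion (a textbook benchmark in the Dean-Lawless style, not the paper's graphical diagnostic; the data below are simulated):

```python
import numpy as np

def overdispersion_score(y):
    """Score-type test for overdispersion in i.i.d. Poisson data.

    Under H0: y ~ Poisson(mu), the statistic
        T = sum((y - ybar)**2 - y) / (ybar * sqrt(2 * n))
    is approximately standard normal; large positive values indicate
    variance greater than the mean.
    """
    y = np.asarray(y, dtype=float)
    n, ybar = y.size, y.mean()
    return float(((y - ybar) ** 2 - y).sum() / (ybar * np.sqrt(2.0 * n)))

rng = np.random.default_rng(1)
pois = rng.poisson(4.0, size=2000)                       # equidispersed
negbin = rng.negative_binomial(2, 1.0 / 3.0, size=2000)  # mean 4, variance 12
t_pois = overdispersion_score(pois)
t_negbin = overdispersion_score(negbin)
print(t_pois, t_negbin)  # near 0 for Poisson, large positive for neg. binomial
```

The statistic detects *that* overdispersion is present; the paper's point is that its C-plot and relative variance curves can additionally suggest *where* the overdispersion comes from (e.g. mixture versus random coefficient).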
Multivariate modeling of time series count data: An autoregressive conditional Poisson model
 CORE Discussion Paper 2003/25
, 2003
Abstract
Cited by 1 (1 self)
This paper introduces a new multivariate model for time series count data. The Multivariate Autoregressive Conditional Poisson model (MACP) makes it possible to deal with issues of discreteness, overdispersion (variance greater than the mean) and both auto- and cross-correlation. We model counts as Poisson or double Poisson and assume that, conditionally on past observations, the means follow a Vector Autoregression. We use a copula to introduce contemporaneous correlation between the series. An important advantage of this model is that it can accommodate both positive and negative correlation among variables. As a feasible alternative to multivariate duration models, the model is applied to the submission of market orders and quote revisions on IBM on the New York Stock Exchange. We show that a single factor cannot explain the dynamics of the market process, which confirms that time deformation, taken as meaning that all market events should accelerate or slow down proportionately, does not hold. We advocate the use of the Multivariate Autoregressive Conditional Poisson model for the study of multivariate point processes in finance, when the number of variables considered simultaneously exceeds two and looking at durations becomes too difficult.

∗ The authors would like to thank Luc Bauwens, Philippe Lambert and David Veredas for helpful discussions and suggestions. The usual disclaimers apply.
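The copula device for contemporaneous correlation can be sketched in isolation, with static means rather than the paper's VAR dynamics; the function names and parameter values are invented for this illustration:

```python
import numpy as np
from math import erf, sqrt

def norm_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def poisson_ppf(u, mu, kmax=1000):
    """Smallest k with Poisson(mu) CDF >= u, by direct summation."""
    term = np.exp(-mu)
    cdf, k = term, 0
    while cdf < u and k < kmax:
        k += 1
        term *= mu / k
        cdf += term
    return k

def copula_poisson_pairs(n, mu1, mu2, rho, seed=0):
    """Contemporaneously correlated Poisson pairs via a Gaussian copula:
    draw correlated normals, map through the normal CDF, then invert
    the Poisson CDFs to get counts with the desired marginals."""
    rng = np.random.default_rng(seed)
    z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=n)
    y1 = np.array([poisson_ppf(norm_cdf(a), mu1) for a in z[:, 0]])
    y2 = np.array([poisson_ppf(norm_cdf(b), mu2) for b in z[:, 1]])
    return y1, y2

y1, y2 = copula_poisson_pairs(4000, 3.0, 5.0, rho=-0.6)
print(np.corrcoef(y1, y2)[0, 1])  # clearly negative
```

A negative copula parameter produces clearly negative cross-correlation between the two count series, the flexibility the abstract highlights as an advantage over common multivariate count models.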