Results 1–10 of 10
A Heteroskedasticity-Consistent Covariance Matrix Estimator and a Direct Test for Heteroskedasticity
, 1980
Abstract

Cited by 3060 (5 self)
This paper presents a parameter covariance matrix estimator which is consistent even when the disturbances of a linear regression model are heteroskedastic. This estimator does not depend on a formal model of the structure of the heteroskedasticity. By comparing the elements of the new estimator to those of the usual covariance estimator, one obtains a direct test for heteroskedasticity, since in the absence of heteroskedasticity the two estimators will be approximately equal, but will generally diverge otherwise. The test has an appealing least squares interpretation.
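The two covariance estimators the abstract compares can be sketched in a few lines of numpy. This is a minimal illustration of the HC0 "sandwich" form, not the paper's full development; the function names are ours.

```python
import numpy as np

def ols_covariance(X, y):
    """Usual covariance estimate s^2 (X'X)^{-1}, valid under homoskedasticity."""
    n, k = X.shape
    XtX_inv = np.linalg.inv(X.T @ X)
    resid = y - X @ (XtX_inv @ X.T @ y)
    s2 = resid @ resid / (n - k)
    return s2 * XtX_inv

def hc0_covariance(X, y):
    """White's heteroskedasticity-consistent estimator:
    (X'X)^{-1} X' diag(e^2) X (X'X)^{-1}, with e the OLS residuals."""
    XtX_inv = np.linalg.inv(X.T @ X)
    resid = y - X @ (XtX_inv @ X.T @ y)
    meat = X.T @ (resid[:, None] ** 2 * X)  # X' diag(e^2) X
    return XtX_inv @ meat @ XtX_inv
```

The element-by-element comparison of these two matrices is the idea behind the direct test the abstract describes: under homoskedasticity they converge to the same limit, otherwise they diverge.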
A comparison of several procedures for the analysis of the nested regression model.
, 1971
Optimal variance estimation without estimating the mean function
, 2012
Abstract

Cited by 2 (1 self)
We study the least squares estimator in the residual variance estimation context. We show that the mean squared differences of paired observations are asymptotically normally distributed. We further establish that, by regressing the mean squared differences of these paired observations on the squared distances between paired covariates via a simple least squares procedure, the resulting variance estimator is not only asymptotically normal and root-n consistent, but also reaches the optimal bound in terms of estimation variance. We also demonstrate the advantage of the least squares estimator in comparison with existing methods in terms of the second-order asymptotic properties.
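The regression step the abstract describes can be sketched directly: for nearby pairs, E[(y_i - y_j)^2 / 2] ≈ σ² + c·(x_i - x_j)², so the intercept of a least squares fit of half squared response differences on squared covariate distances estimates σ². A minimal sketch follows; pairing each point with its k nearest neighbours is an illustrative choice, not necessarily the paper's exact pairing scheme.

```python
import numpy as np

def ls_variance_estimate(x, y, k=10):
    """Difference-based least squares estimate of the residual variance
    in y = m(x) + error, without estimating the mean function m."""
    order = np.argsort(x)
    x, y = x[order], y[order]
    n = len(x)
    d2, half_sq_diff = [], []
    for i in range(n):
        for j in range(i + 1, min(i + 1 + k, n)):  # pair with k nearest neighbours
            d2.append((x[i] - x[j]) ** 2)
            half_sq_diff.append((y[i] - y[j]) ** 2 / 2.0)
    A = np.column_stack([np.ones(len(d2)), d2])
    coef, *_ = np.linalg.lstsq(A, np.array(half_sq_diff), rcond=None)
    return coef[0]  # intercept estimates sigma^2
```

The slope term absorbs the smooth-mean contribution (m(x_i) - m(x_j))² for close pairs, which is why no estimate of m itself is needed.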
Best linear unbiased estimation in mixed models of the analysis of variance
 In Probability and Statistics: Essays in Honor of Franklin A. Graybill
, 1988
Abstract

Cited by 2 (1 self)
A broad definition is given of balanced data in mixed models. For all such models, it is shown that the BLUE (best linear unbiased estimator) of an estimable function of the fixed effects is the same as the ordinary least squares estimator (OLSE).
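The BLUE = OLSE claim can be checked numerically on one concrete balanced case. The example below (ours, not from the paper) uses a balanced one-way random effects model, where the mixed-model covariance is compound-symmetric within groups and the only fixed effect is the grand mean; GLS with the true covariance then reproduces the OLS estimate exactly.

```python
import numpy as np

def blue_equals_ols_balanced():
    """Check BLUE = OLSE for a balanced one-way random effects model:
    y = mu + a_g + e, with a groups of equal size n."""
    a, n = 4, 5
    sigma_a2, sigma_e2 = 2.0, 1.0
    N = a * n
    Z = np.kron(np.eye(a), np.ones((n, 1)))        # group indicator matrix
    V = sigma_e2 * np.eye(N) + sigma_a2 * Z @ Z.T  # mixed-model covariance
    X = np.ones((N, 1))                            # fixed-effect design: grand mean
    rng = np.random.default_rng(2)
    y = 3.0 + rng.normal(size=N)
    Vi = np.linalg.inv(V)
    gls = float(np.linalg.solve(X.T @ Vi @ X, X.T @ Vi @ y))  # BLUE
    ols = float(y.mean())                                     # OLSE
    return gls, ols
```

With unequal group sizes the two estimators generally differ, which is why balance is the operative condition.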
Hypothesis testing for an exchangeable normal distribution
, 2000
Session VIIB: Sample Design and Estimation Discussant Comments by
Abstract
In discussing the two papers in this session, “Sample size considerations for multilevel surveys” by Michael P. Cohen and “Two-sided coverage intervals for small proportions based on survey data” by Philip S. Kott, Per Gosta Andersson, and Olle Nerman, I will follow a format. I will first make some general comments. Then for each of these papers (which I will consider in ...
Journal of Statistical Planning and Inference 163 (2015) 1–20
"... Difference-based variance estimation in nonparametric ..."
A Reformulation of Weighted Least Squares Estimators (doi:10.1198/tast.2009.0011)
Abstract
This article studies weighted, generalized, least squares estimators in simple linear regression with serially correlated errors. Closed-form expressions of weighted least squares estimators and variances are presented under some common stationary autocorrelation settings, a first-order autoregression and a first-order moving-average. These explicit expressions also have appealing applications, including an efficient weighted least squares computation method and a new necessary and sufficient condition on the equality of weighted least squares estimators and ordinary least squares estimators.
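A generic (not closed-form) version of the weighted least squares estimator under AR(1) errors can be sketched as follows. This is a minimal illustration assuming the autocorrelation ρ is known: it builds the correlation matrix with entries ρ^|i−j| and solves the GLS normal equations, rather than reproducing the article's explicit expressions.

```python
import numpy as np

def gls_ar1(x, y, rho):
    """GLS fit of y = b0 + b1*x with AR(1) errors of known correlation rho:
    beta = (X' R^{-1} X)^{-1} X' R^{-1} y, where R[i, j] = rho^|i-j|."""
    n = len(y)
    X = np.column_stack([np.ones(n), x])
    R = rho ** np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
    Ri = np.linalg.inv(R)
    beta = np.linalg.solve(X.T @ Ri @ X, X.T @ Ri @ y)
    return beta  # (intercept, slope)
```

When rho = 0 the weight matrix is the identity and the fit coincides with OLS, a trivial instance of the OLS/WLS equality condition the abstract mentions.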