Results 1 to 10 of 75
Limited information estimators and exogeneity tests for simultaneous probit models
, 1988
Abstract

Cited by 310 (0 self)
A two-step maximum likelihood procedure is proposed for estimating simultaneous probit models and is compared to alternative limited information estimators. Conditions under which each estimator attains the Cramér-Rao lower bound are obtained. Simple tests for exogeneity based on the new two-step estimator are proposed and are shown to be asymptotically equivalent to one another and to have the same local asymptotic power as classical tests based on the limited information maximum likelihood estimator. Finite sample comparisons between the new and alternative estimators are presented based on some Monte Carlo evidence. The performance of the proposed tests for exogeneity is also assessed.
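The two-step idea can be illustrated with a control-function variant, a common implementation of this family of estimators (the simulated data and variable names below are illustrative, not taken from the paper): regress the endogenous regressor on the instrument, then run a probit that includes the first-stage residual, where a zero coefficient on the residual corresponds to exogeneity.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(0)
n = 2000
z = rng.normal(size=n)                                     # instrument
u, v = rng.multivariate_normal([0, 0], [[1, 0.5], [0.5, 1]], size=n).T
x = 0.8 * z + v                                            # endogenous regressor
y = (x + u > 0).astype(float)                              # binary outcome

# Step 1: linear first stage, keep the residuals
Z = np.column_stack([np.ones(n), z])
vhat = x - Z @ np.linalg.lstsq(Z, x, rcond=None)[0]

# Step 2: probit of y on (1, x, vhat); vhat is the control function
X = np.column_stack([np.ones(n), x, vhat])

def negll(b):
    p = norm.cdf(X @ b).clip(1e-10, 1 - 1e-10)
    return -np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))

bhat = minimize(negll, np.zeros(3), method="BFGS").x
# A nonzero coefficient on vhat signals endogeneity of x;
# a t-test on bhat[2] is the simple exogeneity test in this setup.
```

With the error correlation of 0.5 used above, the coefficient on vhat comes out clearly positive, so the exogeneity test rejects in this simulated design.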
Stock Prices and Volume
, 1990
Abstract

Cited by 144 (10 self)
We undertake a comprehensive investigation of price and volume comovement using daily New York Stock Exchange data from 1928 to 1987. We adjust the data to take into account well-known calendar effects and long-run trends. To describe the process, we use a semi-nonparametric estimate of the joint density of current price change and volume conditional on past price changes and volume. Four empirical regularities are found: 1) positive correlation between conditional volatility and volume, 2) large price movements are followed by high volume, 3) conditioning on lagged volume substantially attenuates the "leverage" effect, and 4) after conditioning on lagged volume, there is a positive risk/return relation.
Information theoretic approaches to inference in moment condition models
 Econometrica
, 1998
Nonparametric Density Estimation and Tests of Continuous Time Interest Rate Models
 Review of Financial Studies
, 1998
Abstract

Cited by 86 (2 self)
A number of recent papers have used nonparametric density estimation or nonparametric regression to study the instantaneous spot interest rate, and to test term structure models. However, little is known about the performance of these methods when applied to persistent time series, such as U.S. interest rates. This paper uses the Vasicek [1977] model to study the performance of kernel density estimates of the ergodic distribution of the instantaneous spot rate. The model's tractability allows us to analyze the MISE of the kernel estimate as a function of persistence, variance of the ergodic distribution, span of the data, sampling frequency, and kernel bandwidth. Our principal result is that persistence has an important impact on optimal bandwidth selection and on finite sample performance. We also find that sampling the data more frequently has little effect on estimator quality. We also examine one of Aït-Sahalia's [1996a] new nonparametric tests of parametric continuous-time Markov models of the instantaneous spot interest rate. The test is based on the distance between parametric and nonparametric (kernel) estimates of the ergodic distribution of the interest rate process. Our principal result is that the test rejects too often when using asymptotic critical values and 22 years of data. The reason for the high rejection rate is probably that the asymptotic distribution of the test does not depend on persistence, but the finite sample performance of the estimator does. After critical values are adjusted for size, the test has low power in distinguishing between the Vasicek and Cox-Ingersoll-Ross models when compared with a conditional moment based specification test.
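A minimal sketch of the object being studied (parameter values are illustrative, though the exact AR(1) discretization of the Vasicek process is standard): simulate a persistent daily series over a 22-year span and form a Gaussian kernel density estimate of the ergodic distribution, which for Vasicek is N(theta, sigma^2/(2*kappa)).

```python
import numpy as np

rng = np.random.default_rng(1)
kappa, theta, sigma = 0.5, 0.06, 0.02        # illustrative Vasicek parameters
dt, n = 1 / 252, 252 * 22                    # daily sampling, 22-year span

# exact AR(1) discretization of dr = kappa*(theta - r)*dt + sigma*dW
phi = np.exp(-kappa * dt)
sd = sigma * np.sqrt((1 - phi**2) / (2 * kappa))
r = np.empty(n)
r[0] = theta
for t in range(1, n):
    r[t] = theta + phi * (r[t - 1] - theta) + sd * rng.normal()

# Gaussian kernel density estimate with Silverman's rule-of-thumb bandwidth
h = 1.06 * r.std() * n ** (-0.2)
grid = np.linspace(r.min(), r.max(), 200)
kde = np.exp(-0.5 * ((grid[:, None] - r[None, :]) / h) ** 2).mean(axis=1) \
      / (h * np.sqrt(2 * np.pi))

# true ergodic density for comparison: N(theta, sigma^2 / (2*kappa))
s = sigma / np.sqrt(2 * kappa)
true_pdf = np.exp(-0.5 * ((grid - theta) / s) ** 2) / (s * np.sqrt(2 * np.pi))
```

Because the series is persistent, the effective sample size is far below n, which is the intuition behind the paper's finding that i.i.d.-style bandwidth rules and asymptotic critical values can mislead here.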
Consistent Specification Testing With Nuisance Parameters Present Only Under The Alternative
, 1995
Abstract

Cited by 66 (11 self)
The nonparametric and the nuisance parameter approaches to consistently testing statistical models are both attempts to estimate topological measures of distance between a parametric and a nonparametric fit, and neither dominates in experiments. This topological unification allows us to greatly extend the nuisance parameter approach. How and why the nuisance parameter approach works and how it can be extended bears closely on recent developments in artificial neural networks. Statistical content is provided by viewing specification tests with nuisance parameters as tests of hypotheses about Banach-valued random elements and applying the Banach Central Limit Theorem and Law of Iterated Logarithm, leading to simple procedures that can be used as a guide to when computationally more elaborate procedures may be warranted.
Working During School and Academic Performance
, 2000
Abstract

Cited by 33 (5 self)
In this paper, we utilize unique new data in an attempt to examine the extent to which the endogeneity of hours may bias estimates of the effect of employment on academic performance. The data are obtained directly from the administrative records of Berea College. Located in central Kentucky, this liberal arts institution operates under a mission of providing an education to those who "have great promise, but limited economic resources." As part of this mission, all students who attend Berea receive full tuition scholarships. Part of the cost of schooling is defrayed through a mandatory work-study program. Although all students must work at least a minimum of ten hours a week, variation in hours worked arises because students can often choose to earn extra income by working additional hours. We wish to note in advance that, given the unique nature of Berea College, it is our belief that our results should be viewed cautiously. Nonetheless, as will be described throughout the paper, the institutional details of the Berea College labor program and the detailed nature of our administrative data ... See Bound et al. (1995) for a discussion of the potential problems that can arise in instrumental variables estimation when the correlation between instruments and the endogenous explanatory variable is weak.
Large Scale Conditional Covariance Matrix Modeling, Estimation and Testing, University of California at San Diego working paper
, 1994
Abstract

Cited by 26 (2 self)
A new representation of the diagonal Vech model is given using the Hadamard product. Sufficient conditions on parameter matrices are provided to ensure the positive definiteness of covariance matrices from the new representation. Based on this, some new and simple models are discussed. A set of diagnostic tests for multivariate ARCH models is proposed. The tests are able to detect various model misspecifications by examining the orthogonality of the squared normalized residuals. A small Monte Carlo study is carried out to check the small sample performance of the tests. An empirical example is also given as guidance for model estimation and selection in the multivariate framework. For the specific data set considered, it is found that the simple one- and two-parameter models and the constant conditional correlation model perform fairly well.
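The positive-definiteness argument can be checked numerically. By the Schur product theorem, the Hadamard product of positive semidefinite matrices is positive semidefinite, so choosing C positive definite and A, B positive semidefinite keeps every conditional covariance matrix positive definite. The recursion below is an illustrative sketch of this mechanism, not the paper's exact specification; the parameter values are made up:

```python
import numpy as np

rng = np.random.default_rng(2)
k = 3

def random_psd(k):
    m = rng.normal(size=(k, k))
    return m @ m.T                              # positive semidefinite by construction

C = random_psd(k) * 0.01 + 0.005 * np.eye(k)    # intercept matrix, positive definite
A, B = random_psd(k), random_psd(k)
# rescale so the diagonals of A and B sum below 0.9 (keeps the recursion stable);
# scaling a PSD matrix by a positive scalar preserves positive semidefiniteness
scale = 0.9 / (np.linalg.eigvalsh(A)[-1] + np.linalg.eigvalsh(B)[-1])
A, B = A * scale, B * scale

# diagonal Vech recursion via Hadamard (elementwise, '*') products:
# H_t = C + A o (eps_{t-1} eps_{t-1}') + B o H_{t-1}
H = np.eye(k) * 0.01
eps = rng.normal(size=k) * 0.1
for t in range(50):
    H = C + A * np.outer(eps, eps) + B * H
    eps = np.linalg.cholesky(H) @ rng.normal(size=k)  # fails unless H stays PD
```

The Cholesky factorization inside the loop would raise an error if any H_t lost positive definiteness; with PSD parameter matrices it never does.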
Inverse Probability Weighted Estimation for General Missing Data Problems
Abstract

Cited by 21 (1 self)
I study inverse probability weighted M-estimation under a general missing data scheme. Examples include M-estimation with missing data due to a censored survival time, propensity score estimation of the average treatment effect in the linear exponential family, and variable probability sampling with observed retention frequencies. I extend an important result known to hold in special cases: estimating the selection probabilities is generally more efficient than if the known selection probabilities could be used in estimation. For the treatment effect case, the setup allows a general characterization of a “double robustness” result due to Scharfstein, Rotnitzky, and Robins (1999).
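A minimal sketch of the weighting idea in the simplest case, estimating a population mean when the outcome is missing at random given a covariate (all names and numbers below are illustrative): fit a logit for the selection probabilities, then weight the observed outcomes by their inverse.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(3)
n = 5000
x = rng.normal(size=n)
y = 2.0 + x + rng.normal(size=n)                 # population mean of y is 2.0
p_true = 1 / (1 + np.exp(-(0.5 + x)))            # selection probability rises with x
s = (rng.uniform(size=n) < p_true).astype(float) # s = 1 means y is observed

# estimate the selection probabilities by a logit of s on x
X = np.column_stack([np.ones(n), x])
def negll(b):
    xb = X @ b
    return np.sum(np.logaddexp(0.0, xb) - s * xb)
bhat = minimize(negll, np.zeros(2), method="BFGS").x
phat = 1 / (1 + np.exp(-(X @ bhat)))

naive = y[s == 1].mean()                         # biased up: high-x units overrepresented
ipw = np.sum(s * y / phat) / np.sum(s / phat)    # inverse probability weighted mean
```

The efficiency result in the abstract says that weighting by the estimated phat, as above, is generally no worse, and typically better, than weighting by the true p_true even when the latter is known.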