Results 1-10 of 295
Stock Market Prices Do Not Follow Random Walks: Evidence from a Simple Specification Test
 Review of Financial Studies
, 1988
Abstract

Cited by 226 (13 self)
In this article we test the random walk hypothesis for weekly stock market returns by comparing variance estimators derived from data sampled at different frequencies. The random walk model is strongly rejected for the entire sample period (1962-1985) and for all subperiods for a variety of aggregate returns indexes and size-sorted portfolios. Although the rejections are due largely to the behavior of small stocks, they cannot be attributed completely to the effects of infrequent trading or time-varying volatilities. Moreover, the rejection of the random walk for weekly returns does not support a mean-reverting model of asset prices.
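The comparison of variance estimators across sampling frequencies can be illustrated with a minimal variance-ratio sketch (the function and simulated data below are illustrative only; the paper's estimators add bias and heteroskedasticity corrections):

```python
import numpy as np

def variance_ratio(returns, q):
    """Ratio of the variance of (overlapping) q-period returns to q
    times the variance of one-period returns.  Under the random walk
    hypothesis the ratio should be close to 1 for every q."""
    r = np.asarray(returns, dtype=float)
    T = len(r)
    mu = r.mean()
    # one-period return variance
    var_1 = np.sum((r - mu) ** 2) / T
    # overlapping q-period return sums
    q_sums = np.array([r[t:t + q].sum() for t in range(T - q + 1)])
    var_q = np.sum((q_sums - q * mu) ** 2) / (q * (T - q + 1))
    return var_q / var_1

# i.i.d. returns satisfy the random walk null, so VR(q) should be near 1
rng = np.random.default_rng(0)
vr = variance_ratio(rng.normal(size=5000), 4)
```

Positive serial correlation pushes the ratio above 1 and mean reversion pushes it below, which is what makes the statistic a specification test.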
Testing for Common Trends
 Journal of the American Statistical Association
, 1988
Abstract

Cited by 208 (5 self)
Cointegrated multiple time series share at least one common trend. Two tests are developed for the number of common stochastic trends (i.e., for the order of cointegration) in a multiple time series with and without drift. Both tests involve the roots of the ordinary least squares coefficient matrix obtained by regressing the series onto its first lag. Critical values for the tests are tabulated, and their power is examined in a Monte Carlo study. Economic time series are often modeled as having a unit root in their autoregressive representation, or (equivalently) as containing a stochastic trend. But both casual observation and economic theory suggest that many series might contain the same stochastic trend, so that they are cointegrated. If each of n series is integrated of order 1 but can be jointly characterized by k < n stochastic trends, then the vector representation of these series has k unit roots and n - k distinct stationary linear combinations. Our proposed tests can be viewed alternatively as tests of the number of common trends, linearly independent cointegrating vectors, or autoregressive unit roots of the vector process. Both of the proposed tests are asymptotically similar. The first test (qf) is developed under the assumption that certain components of the process have a finite-order vector autoregressive (VAR) representation, and the nuisance parameters are handled by estimating this VAR. The second test (qc) entails computing the eigenvalues of a corrected sample first-order autocorrelation matrix, where the correction is essentially a sum of the autocovariance matrices. Previous researchers have found that U.S. postwar interest rates, taken individually, appear to be integrated of order 1. In addition, the theory of the term structure implies that yields on similar assets of different maturities will be cointegrated. Applying these tests to postwar U.S. data on the federal funds rate and the three- and twelve-month Treasury bill rates provides support for this prediction: the three interest rates appear to be cointegrated.
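The core computation, regressing the series onto its own first lag and inspecting the roots of the coefficient matrix, can be sketched as follows (a bare-bones illustration; the paper's statistics correct for nuisance parameters before tabulating critical values):

```python
import numpy as np

def lag_one_roots(Y):
    """Regress a multiple time series onto its own first lag by OLS and
    return the moduli of the eigenvalues of the coefficient matrix,
    sorted in decreasing order.  Roots near 1 indicate autoregressive
    unit roots, i.e., common stochastic trends."""
    Y = np.asarray(Y, dtype=float)
    A, *_ = np.linalg.lstsq(Y[:-1], Y[1:], rcond=None)  # Y_{t+1} ~ Y_t A
    return np.sort(np.abs(np.linalg.eigvals(A)))[::-1]

# two series driven by one common random-walk trend:
# expect one root near 1 and one well below 1
rng = np.random.default_rng(1)
trend = np.cumsum(rng.normal(size=2000))
Y = np.column_stack([trend + rng.normal(size=2000),
                     0.5 * trend + rng.normal(size=2000)])
roots = lag_one_roots(Y)
```

With n = 2 series and k = 1 common trend, the fitted matrix has one root near unity and n - k = 1 root well inside the unit circle, matching the counting argument in the abstract.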
Inference in Linear Time Series Models with Some Unit Roots
 Econometrica
, 1990
Abstract

Cited by 158 (6 self)
This paper considers estimation and hypothesis testing in linear time series models when some or all of the variables have unit roots. Our motivating example is a vector autoregression with some unit roots in the companion matrix, which might include polynomials in time as regressors. In the general formulation, the variables might be integrated or cointegrated of arbitrary orders, and might have drifts as well. We show that parameters that can be written as coefficients on mean-zero, nonintegrated regressors have jointly normal asymptotic distributions, converging at the rate T^{1/2}. In general, the other coefficients (including the coefficients on polynomials in time) will have nonnormal asymptotic distributions. The results provide a formal characterization of which t or F tests, such as Granger causality tests, will be asymptotically valid, and which will have nonstandard limiting distributions.
Lag length selection and the construction of unit root tests with good size and power
 Econometrica
, 2001
Abstract

Cited by 150 (13 self)
It is widely known that when there are errors with a moving-average root close to −1, a high-order augmented autoregression is necessary for unit root tests to have good size, but that information criteria such as the AIC and the BIC tend to select a truncation lag (k) that is very small. We consider a class of Modified Information Criteria (MIC) with a penalty factor that is sample dependent. It takes into account the fact that the bias in the sum of the autoregressive coefficients is highly dependent on k and adapts to the type of deterministic components present. We use a local asymptotic framework in which the moving-average root is local to −1 to document how the MIC performs better in selecting appropriate values of k. In Monte Carlo experiments, the MIC is found to yield huge size improvements to the DF-GLS and the feasible point optimal PT test developed in Elliott, Rothenberg and Stock (1996). We also extend the M tests developed in Perron and Ng (1996) to allow for GLS detrending of the data. The MIC along with GLS-detrended data yield a set of tests with desirable size and power properties.
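A sample-dependent penalty of this kind can be sketched as follows, assuming an MAIC-style criterion ln s2(k) + 2(tau(k) + k)/T in a plain augmented Dickey-Fuller regression; the sketch omits the demeaning/GLS detrending that the paper also covers, so it is an illustration of the penalty structure rather than the paper's exact procedure:

```python
import numpy as np

def maic_lag(y, k_max):
    """Pick the ADF truncation lag k by a Modified-AIC-style rule:
    the usual AIC penalty is augmented by tau(k), a term driven by the
    squared coefficient on the lagged level, which blows up exactly
    when a too-small k leaves the unit-root coefficient badly biased.
    All candidate regressions share one effective sample so the
    criteria are comparable."""
    y = np.asarray(y, dtype=float)
    dy = np.diff(y)
    start = k_max                       # common sample start
    T_eff = len(dy) - start
    best_k, best_crit = 0, np.inf
    for k in range(k_max + 1):
        # regressors: lagged level, then k lagged differences
        cols = [y[start:-1]]
        cols += [dy[start - j: len(dy) - j] for j in range(1, k + 1)]
        X = np.column_stack(cols)
        d = dy[start:]
        beta, *_ = np.linalg.lstsq(X, d, rcond=None)
        resid = d - X @ beta
        sigma2 = resid @ resid / T_eff
        # tau(k): penalty term tied to the lagged-level coefficient
        tau = beta[0] ** 2 * (y[start:-1] @ y[start:-1]) / sigma2
        crit = np.log(sigma2) + 2.0 * (tau + k) / T_eff
        if crit < best_crit:
            best_k, best_crit = k, crit
    return best_k
```

Because tau(k) is data dependent, the penalty adapts: with a near −1 moving-average root, small k values are punished far more heavily than under the fixed AIC/BIC penalties.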
Univariate detrending methods with stochastic trends, H.I.E.R. discussion paper no
, 1985
Abstract

Cited by 77 (1 self)
This paper discusses detrending economic time series when the trend is modelled as a stochastic process. It considers unobserved components models in which the observed series is decomposed into a trend (a random walk with drift) and a residual stationary component. Optimal detrending methods are discussed, as well as problems associated with using these detrended data in regression models. The methods are applied to three time series: GNP, disposable income, and consumption expenditures. The detrended data are used to test a version of the Life Cycle consumption model.
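Signal extraction in such an unobserved-components model can be sketched with the simplest special case, a random-walk trend plus white noise with known variances; this Kalman-filter illustration omits the drift and any richer stationary dynamics, so it shows the mechanics rather than the paper's method:

```python
import numpy as np

def rw_plus_noise_filter(y, q):
    """Kalman filter for the local-level model
        y_t  = mu_t + eps_t,        eps_t ~ N(0, 1)
        mu_t = mu_{t-1} + eta_t,    eta_t ~ N(0, q)
    i.e., a random-walk trend observed with stationary (here
    white-noise) error.  Returns the filtered trend estimates."""
    y = np.asarray(y, dtype=float)
    mu, P = y[0], 1e6                 # diffuse-style initialization
    trend = np.empty(len(y))
    for t, obs in enumerate(y):
        P = P + q                     # predict: trend uncertainty grows
        K = P / (P + 1.0)             # Kalman gain
        mu = mu + K * (obs - mu)      # update with the observation
        P = (1.0 - K) * P
        trend[t] = mu
    return trend
```

The signal-to-noise ratio q controls how aggressively the filter smooths: small q treats most movement as the stationary component, large q attributes it to the stochastic trend.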
On the importance of measuring payout yield: Implications for empirical asset pricing
 Journal of Finance
, 2006
Abstract

Cited by 52 (6 self)
We investigate the empirical implications of using various measures of payout yield rather than dividend yield for asset pricing models. We find statistically and economically significant predictability in the time series when payout (dividends plus repurchases) and net payout (dividends plus repurchases minus issuances) yields are used instead of the dividend yield. Similarly, we find that payout (net payout) yields contain information about the cross section of expected stock returns exceeding that of dividend yields, and that the high minus low payout yield portfolio is a priced factor. While the irrelevance theorem of Miller and Modigliani (1961) implies that there is no reason to suspect that dividends play a role in determining equity price levels or equity returns, the theorem is silent on the usefulness of dividends in explaining these variables. It is then, perhaps, not surprising that there is a considerable literature exploiting the properties of dividends and dividend yields to better understand the fundamentals of asset pricing both in the time series and in the cross section. Motivation for the former comes from variations of the Gordon growth model in which dividend yields can be written as the return minus the dividend's growth rate (see, e.g., Fama and French (1988)), from consumption-based asset pricing models in which the firm's dividends covary with aggregate consumption (e.g., Lucas (1978) and Shiller (1981)), and so forth. Additional motivation comes from cross-sectional heterogeneity in tax, agency, and asymmetric information considerations (e.g.,
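The Gordon-growth motivation mentioned above can be written out in one line: with a constant expected return r and dividend growth rate g (r > g),

```latex
P_t = \sum_{j=1}^{\infty} \frac{D_t (1+g)^j}{(1+r)^j}
    = \frac{D_t (1+g)}{r - g}
\quad\Longrightarrow\quad
\frac{D_t}{P_t} = \frac{r - g}{1+g} \approx r - g ,
```

so the dividend yield is (approximately) the expected return minus the growth rate, which is why yield measures are natural return predictors.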
Variable trends in economic time series
 J. Econom. Perspectives
, 1988
Abstract

Cited by 51 (1 self)
The two most striking historical features of aggregate output are its sustained long-run growth and its recurrent fluctuations around this growth path. Real per capita GNP, consumption and investment in the United States during the postwar era are plotted in Figure 1. Both growth and deviations from the growth trend, often referred to as "business cycles," are apparent in each series. Over horizons of a few years, these shorter cyclical swings can be pronounced; for example, the 1953, 1957 and 1974 recessions are evident as substantial temporary declines in aggregate activity. These cyclical fluctuations are, however, dwarfed in magnitude by the secular expansion of output. But just as there are cyclical swings in output, so too are there variations in the growth trend: growth in GNP in the 1960s was much stronger than it was in the 1950s. Thus, changes in long-run patterns of growth are an important feature of postwar aggregate economic activity. In this article we discuss the implications of changing trends in macroeconomic data from two perspectives. The first perspective is that of a macroeconomist reassessing the conventional dichotomy between growth and stabilization policies. As an
A PANIC Attack on Unit Roots and Cointegration
, 2003
Abstract

Cited by 47 (2 self)
This paper develops a new methodology that makes use of the factor structure of large dimensional panels to understand the nature of nonstationarity in the data. We refer to it as PANIC – a ‘Panel Analysis of Nonstationarity in Idiosyncratic and Common components’. PANIC consists of univariate and panel tests with a number of novel features. It can detect whether the nonstationarity is pervasive, or variable-specific, or both. It tests the components of the data instead of the observed series. Inference is therefore more accurate when the components have different orders of integration. PANIC also permits the construction of valid panel tests even when cross-section correlation invalidates pooling of statistics constructed using the observed data. The key to PANIC is consistent estimation of the components even when the regressions are individually spurious. We provide a rigorous theory for estimation and inference. In Monte Carlo simulations, the tests have very good size and power. PANIC is applied to a panel of inflation series.
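The idea of testing components rather than observed series can be sketched as: extract factors by principal components from the first-differenced panel (valid even when the series are individually I(1)), cumulate back to levels, and then apply unit-root tests to each estimated component separately. The decomposition below is a stripped-down illustration that ignores deterministic terms and the paper's formal conditions:

```python
import numpy as np

def panic_components(X, n_factors=1):
    """Decompose a T x N panel into estimated common factors and
    idiosyncratic components.  Principal components are computed on the
    differenced data via the SVD, then both pieces are cumulated back
    to levels; unit-root tests would be run on F and on each column of
    E separately, not on the observed series."""
    X = np.asarray(X, dtype=float)
    dX = np.diff(X, axis=0)                    # difference the panel
    U, S, Vt = np.linalg.svd(dX, full_matrices=False)
    f = U[:, :n_factors] * S[:n_factors]       # factor innovations
    lam = Vt[:n_factors].T                     # factor loadings
    F = np.cumsum(f, axis=0)                   # common factors in levels
    E = np.cumsum(dX - f @ lam.T, axis=0)      # idiosyncratic, in levels
    return F, lam, E
```

If the cumulated factor F is nonstationary while the columns of E are stationary, the nonstationarity is pervasive; the reverse pattern marks it as variable-specific.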
Predictive Regressions: A Reduced-Bias Estimation Method, working paper
, 2003
Abstract

Cited by 46 (0 self)
We propose a direct and convenient reduced-bias estimator of predictive regression coefficients, assuming that the regressors are Gaussian first-order autoregressive with errors that are correlated with the error series of the dependent variable. For the single-regressor model, Stambaugh (1999) shows that the ordinary least squares estimator of the predictive regression coefficient is biased in small samples. Our estimation method employs an augmented regression which uses a proxy for the errors in the autoregressive model. We also develop a heuristic estimator of the standard error of the estimated predictive coefficient which performs well in simulations. We analyze the case of multiple predictors that are first-order autoregressive and derive bias expressions for both the ordinary least squares and our reduced-bias estimated coefficients. The effectiveness of our estimation method is demonstrated by simulations.
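The augmented-regression idea can be sketched for the single-regressor case: estimate the predictor's AR(1), bias-correct its slope, and add the implied proxy innovations as an extra regressor. The Kendall-type bias correction used below is an assumption of this sketch, not necessarily the paper's exact formula:

```python
import numpy as np

def reduced_bias_beta(y, x):
    """Reduced-bias slope of the predictive regression
    y_t = a + b * x_{t-1} + u_t, where x is AR(1) with innovations
    correlated with u.  Steps: (1) OLS AR(1) for x; (2) bias-correct
    rho (Kendall-type correction -- an assumption of this sketch);
    (3) regress y_t on x_{t-1} and the proxy innovations v_hat_t."""
    y, x = np.asarray(y, float), np.asarray(x, float)
    n = len(x)
    # step 1: x_t = c + rho * x_{t-1} + v_t
    X1 = np.column_stack([np.ones(n - 1), x[:-1]])
    c, rho = np.linalg.lstsq(X1, x[1:], rcond=None)[0]
    # step 2: small-sample bias correction for the AR(1) slope
    rho_c = rho + (1 + 3 * rho) / n + 3 * (1 + 3 * rho) / n ** 2
    v_hat = x[1:] - c - rho_c * x[:-1]        # proxy for the AR(1) errors
    # step 3: augmented regression y_t = a + b x_{t-1} + phi v_hat_t
    X2 = np.column_stack([np.ones(n - 1), x[:-1], v_hat])
    coef = np.linalg.lstsq(X2, y[1:], rcond=None)[0]
    return coef[1]                            # reduced-bias slope b
```

Because the Stambaugh bias in the OLS slope is proportional to the bias in the estimated rho, correcting rho and conditioning on the proxy innovations removes most of it.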
Discounting the Distant Future: How Much Do Uncertain Rates Increase Valuations?
 Journal of Environmental Economics and Management
, 2000
Abstract

Cited by 42 (3 self)
Costs and benefits in the distant future, such as those associated with global warming, long-lived infrastructure, hazardous and radioactive waste, and biodiversity, often have little value today when measured with conventional discount rates. We demonstrate that when the future path of this conventional rate is uncertain and persistent (i.e., highly correlated over time), the distant future should be discounted at lower rates than suggested by the current rate. We then use two centuries of data on U.S. interest rates to quantify this effect. Using both random walk and mean-reverting models (which are indistinguishable based on historical data), we compute the certainty-equivalent rate, that is, the single discount rate that summarizes the effect of uncertainty and measures the appropriate forward rate of discount in the future. Using the random walk model, which we consider more compelling, we find that the certainty-equivalent rate falls from 3% now to 2% after 100 years, to 1% af...
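The certainty-equivalent calculation can be sketched by Monte Carlo under a random-walk rate; the parameters below are hypothetical round numbers, not the paper's historical estimates:

```python
import numpy as np

def certainty_equivalent_rates(r0, sigma, horizon, n_paths, seed=0):
    """Certainty-equivalent discount rates under a random-walk short
    rate.  The CE discount factor at horizon t is the *average* of
    exp(-cumulative rate) across simulated paths; because averaging
    happens in discount-factor space, persistently low-rate paths
    dominate at long horizons and the implied rate declines.  Note
    this stylized walk allows rates to go negative."""
    rng = np.random.default_rng(seed)
    shocks = rng.normal(0.0, sigma, size=(n_paths, horizon))
    rates = r0 + np.cumsum(shocks, axis=1)      # random-walk rate paths
    cum = np.cumsum(rates, axis=1)              # integrated short rate
    df = np.exp(-cum).mean(axis=0)              # CE discount factors
    t = np.arange(1, horizon + 1)
    return -np.log(df) / t                      # CE spot rates

ce = certainty_equivalent_rates(r0=0.04, sigma=0.002, horizon=100,
                                n_paths=20000)
```

The resulting term structure starts at roughly the current rate and declines with the horizon, which is the paper's central mechanism.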