Results 1–10 of 420
Lag length selection and the construction of unit root tests with good size and power
 Econometrica
, 2001
Abstract

Cited by 558 (14 self)
It is widely known that when there are errors with a moving-average root close to −1, a high-order augmented autoregression is necessary for unit root tests to have good size, but that information criteria such as the AIC and the BIC tend to select a truncation lag (k) that is very small. We consider a class of Modified Information Criteria (MIC) with a penalty factor that is sample dependent. It takes into account the fact that the bias in the sum of the autoregressive coefficients is highly dependent on k and adapts to the type of deterministic components present. We use a local asymptotic framework in which the moving-average root is local to −1 to document how the MIC performs better in selecting appropriate values of k. In Monte Carlo experiments, the MIC is found to yield huge size improvements to the DF-GLS and the feasible point optimal PT test developed in Elliott, Rothenberg and Stock (1996). We also extend the M tests developed in Perron and Ng (1996) to allow for GLS detrending of the data. The MIC along with GLS detrended data yields a set of tests with desirable size and power properties.
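The sample-dependent penalty the abstract describes can be sketched as follows. This is an illustrative reading of a Ng–Perron-style modified AIC for choosing the augmented Dickey–Fuller lag, written without deterministic detrending for simplicity; the exact criterion in the paper also handles constants and trends.

```python
import numpy as np

def maic_lag(y, k_max=8):
    """Pick the ADF truncation lag k by a modified AIC whose penalty
    tau(k) depends on the estimated sum-of-coefficients bias, in the
    spirit of Ng and Perron (2001).  Sketch: no detrending applied.
    """
    y = np.asarray(y, dtype=float)
    dy = np.diff(y)
    T = len(dy) - k_max                      # common effective sample
    best_k, best_crit = 0, np.inf
    for k in range(k_max + 1):
        # ADF regression: dy_t = b0 * y_{t-1} + sum_j b_j * dy_{t-j} + e_t
        X = np.column_stack(
            [y[k_max:len(dy)]]
            + [dy[k_max - j:len(dy) - j] for j in range(1, k + 1)]
        )
        Y = dy[k_max:]
        beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
        e = Y - X @ beta
        s2 = e @ e / T
        # sample-dependent penalty term tau(k)
        tau = (beta[0] ** 2) * np.sum(y[k_max:len(dy)] ** 2) / s2
        crit = np.log(s2) + 2.0 * (tau + k) / T
        if crit < best_crit:
            best_k, best_crit = k, crit
    return best_k
```

The key difference from the plain AIC is the `tau` term, which grows when the estimated coefficient on the lagged level is far from zero and so discourages under-selecting k.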
Testing for Common Trends
 Journal of the American Statistical Association
, 1988
Abstract

Cited by 464 (7 self)
Cointegrated multiple time series share at least one common trend. Two tests are developed for the number of common stochastic trends (i.e., for the order of cointegration) in a multiple time series with and without drift. Both tests involve the roots of the ordinary least squares coefficient matrix obtained by regressing the series onto its first lag. Critical values for the tests are tabulated, and their power is examined in a Monte Carlo study. Economic time series are often modeled as having a unit root in their autoregressive representation, or (equivalently) as containing a stochastic trend. But both casual observation and economic theory suggest that many series might contain the same stochastic trend, so that they are cointegrated. If each of n series is integrated of order 1 but can be jointly characterized by k < n stochastic trends, then the vector representation of these series has k unit roots and n − k distinct stationary linear combinations. Our proposed tests can be viewed alternatively as tests of the number of common trends, linearly independent cointegrating vectors, or autoregressive unit roots of the vector process. Both of the proposed tests are asymptotically similar. The first test (qf) is developed under the assumption that certain components of the process have a finite-order vector autoregressive (VAR) representation, and the nuisance parameters are handled by estimating this VAR. The second test (qc) entails computing the eigenvalues of a corrected sample first-order autocorrelation matrix, where the correction is essentially a sum of the autocovariance matrices. Previous researchers have found that U.S. postwar interest rates, taken individually, appear to be integrated of order 1. In addition, the theory of the term structure implies that yields on similar assets of different maturities will be cointegrated. Applying these tests to postwar U.S. data on the federal funds rate and the three- and twelve-month treasury bill rates provides support for this prediction: the three interest rates appear to be cointegrated.
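The core computation behind the abstract's tests can be sketched in a few lines: regress the vector series on its own first lag and inspect the roots of the coefficient matrix. The cutoff below is purely illustrative; the actual tests transform the eigenvalues and compare them to tabulated Monte Carlo critical values, with a serial-correlation correction this sketch omits.

```python
import numpy as np

def count_unit_roots(Y, tol=0.15):
    """Count eigenvalues of the OLS first-lag coefficient matrix that
    lie near unity -- a rough proxy for the number of common stochastic
    trends.  `tol` is an illustrative cutoff, not a critical value.
    """
    Y = np.asarray(Y, dtype=float)               # shape (T, n)
    X, Z = Y[:-1], Y[1:]
    A, *_ = np.linalg.lstsq(X, Z, rcond=None)    # Z_t ~ A' Y_{t-1}
    roots = np.linalg.eigvals(A.T)
    return int(np.sum(np.abs(roots) > 1 - tol))
```

For two series sharing one random-walk trend, one root sits near unity (the common trend) and the other near zero (the stationary cointegrating combination).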
Long memory processes and fractional integration in Econometrics
 Journal of Econometrics 73 (1996) 5–59
, 1996
Abstract

Cited by 377 (0 self)
This paper provides a survey and review of the major econometric work on long memory processes, fractional integration, and their applications in economics and finance. Some of the definitions of long memory are reviewed, together with previous work in other disciplines. Section 3 describes the population characteristics of various long memory processes in the mean, including ARFIMA. Section 4 is concerned with estimation and examines semiparametric procedures in both the frequency and time domain, and also the properties of various regression-based and maximum likelihood techniques. Long memory volatility processes are discussed in Section 5, while Section 6 discusses applications in economics and finance. The paper also has a concluding section.
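The fractional difference operator (1 − L)^d at the heart of ARFIMA models expands as a binomial series with weights obeying a simple recursion, which can be sketched as a truncated filter (in practice the expansion is cut at the sample length, as here):

```python
import numpy as np

def frac_diff(x, d):
    """Apply the fractional difference filter (1 - L)^d using the
    binomial weights pi_0 = 1, pi_j = pi_{j-1} * (j - 1 - d) / j,
    truncated at the sample length.
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    w = np.empty(n)
    w[0] = 1.0
    for j in range(1, n):
        w[j] = w[j - 1] * (j - 1 - d) / j
    # y_t = sum_{j <= t} pi_j * x_{t-j}
    return np.array([w[:t + 1] @ x[t::-1] for t in range(n)])
```

Setting d = 1 recovers ordinary first differences (weights 1, −1, 0, …), and d = 0 returns the series unchanged; intermediate d gives the slowly decaying weights characteristic of long memory.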
Empirics for Economic Growth and Convergence
 European Economic Review, Vol
, 1996
Abstract

Cited by 202 (4 self)
important in reshaping this article. X. Sala-i-Martin has generously donated insight and time to try and make me understand. He need not, however, agree with all my statements below. All calculations were performed using the econometrics shell tsrf. Non-technical Summary: The convergence hypothesis, that poor economies might "catch up", has generated a huge empirical literature: this paper critically reviews some of the earlier key findings, clarifies their implications, and relates them to more recent results. Particular attention is devoted to interpreting convergence empirics. The paper argues that relating them to growth theories, as usually done, gives but one interpretation to convergence dynamics; it does not exhaust their importance. Instead, if we relate convergence to the dynamics of income distributions, it broadens the issues on which such empirics can shed light; it connects with policy concerns on persistent or growing inequality, regional core-periphery stagnation, and tendencies for ongoing capital flows across developed and developing countries. The main findings are: (1) The much-heralded uniform 2% rate of convergence could arise for reasons unrelated to the dynamics of economic growth. (2) Usual empirical analyses (cross-section (conditional) convergence regressions, time series modelling, panel data analysis) can be misleading for understanding convergence; a model of polarization in economic growth clarifies those difficulties. (3) The data, more revealingly modelled, show persistence and immobility across countries: some evidence supports Baumol's idea of "convergence clubs"; some evidence shows the poor getting poorer, and the rich richer, with the middle class vanishing. (4) Convergence, unambiguous up to sampling error, is observed across US states.
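The cross-section convergence regressions the abstract criticizes take a standard form that is easy to sketch: regress average growth on initial log income and back out the implied convergence speed from the slope. This is the conventional calculation the paper argues can mislead, shown here only to make the 2% figure concrete.

```python
import numpy as np

def beta_convergence_rate(log_y0, log_yT, T):
    """Unconditional beta-convergence regression sketch: regress
    average growth (log_yT - log_y0)/T on initial log income, then
    invert the slope b = -(1 - exp(-lambda*T))/T to get the implied
    convergence speed lambda.
    """
    log_y0 = np.asarray(log_y0, dtype=float)
    g = (np.asarray(log_yT, dtype=float) - log_y0) / T
    X = np.column_stack([np.ones_like(g), log_y0])
    (a, b), *_ = np.linalg.lstsq(X, g, rcond=None)
    lam = -np.log(1.0 + b * T) / T
    return b, lam
```

A lambda around 0.02 is the "uniform 2% rate" the abstract refers to; the paper's point is that this number can arise without any genuine catching-up dynamics.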
Dynamic panel estimation and homogeneity testing under cross section dependence. Cowles Foundation Discussion Paper #1362,
, 2002
Abstract

Cited by 166 (8 self)
Summary This paper deals with cross section dependence, homogeneity restrictions and small sample bias issues in dynamic panel regressions. To address the bias problem we develop a panel approach to median unbiased estimation that takes account of cross section dependence. The estimators given here considerably reduce the effects of bias and gain precision from estimating cross section error correlation. This paper also develops an asymptotic theory for tests of coefficient homogeneity under cross section dependence, and proposes a modified Hausman test to test for the presence of homogeneous unit roots. An orthogonalization procedure, based on iterated method of moments estimation, is developed to remove cross section dependence and permit the use of conventional and meta unit root tests with panel data. Some simulations investigating the finite sample performance of the estimation and test procedures are reported.
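A heavily simplified version of the idea of removing cross-section dependence before applying conventional unit root tests is to subtract the cross-section average at each date. The paper's actual orthogonalization uses iterated method of moments and targets a richer error-correlation structure; this sketch handles only a common additive time effect.

```python
import numpy as np

def demean_cross_section(Y):
    """Remove a common time effect from a panel by subtracting the
    cross-section average at each date.  Simplified stand-in for the
    paper's iterated method-of-moments orthogonalization.
    """
    Y = np.asarray(Y, dtype=float)           # shape (T, N): T dates, N units
    return Y - Y.mean(axis=1, keepdims=True)
```

After this step each unit's series can, under the common-factor assumption, be handed to a conventional univariate unit root test.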
Discounting the Distant Future: How Much Do Uncertain Rates Increase Valuations?
 Journal of Environmental Economics and Management
, 2000
Abstract

Cited by 97 (5 self)
Costs and benefits in the distant future, such as those associated with global warming, long-lived infrastructure, hazardous and radioactive waste, and biodiversity, often have little value today when measured with conventional discount rates. We demonstrate that when the future path of this conventional rate is uncertain and persistent (i.e., highly correlated over time), the distant future should be discounted at lower rates than suggested by the current rate. We then use two centuries of data on U.S. interest rates to quantify this effect. Using both random walk and mean-reverting models (which are indistinguishable based on historical data), we compute the certainty-equivalent rate, that is, the single discount rate that summarizes the effect of uncertainty and measures the appropriate forward rate of discount in the future. Using the random walk model, which we consider more compelling, we find that the certainty-equivalent rate falls from 3% now to 2% after 100 years, to 1% af...
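The mechanism is Jensen's inequality: averaging discount factors across uncertain rate paths, then inverting, yields an effective rate that declines with the horizon. A Monte Carlo sketch under an illustrative random-walk-with-reflection-at-zero assumption (so rates stay non-negative; the paper's calibration uses two centuries of U.S. rate data instead):

```python
import numpy as np

def certainty_equivalent_rate(r0, sigma, horizon, n_paths=20000, seed=0):
    """Simulate random-walk short-rate paths, average the discount
    factors exp(-sum r), and invert to get the certainty-equivalent
    rate R(t) = -log(E[exp(-sum r)]) / t at each horizon t.
    """
    rng = np.random.default_rng(seed)
    r = np.full(n_paths, float(r0))
    cum = np.zeros(n_paths)                  # cumulative discounting per path
    rates = np.empty(horizon)
    for t in range(horizon):
        cum += r
        rates[t] = -np.log(np.mean(np.exp(-cum))) / (t + 1)
        # reflect at zero to keep rates non-negative (illustrative choice)
        r = np.abs(r + sigma * rng.standard_normal(n_paths))
    return rates
```

Because low-rate paths contribute disproportionately to the average discount factor, the certainty-equivalent rate falls below the starting rate at long horizons, which is the paper's qualitative result.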
Long-run purchasing power parity during the recent float, Working paper 215
, 1990
Abstract

Cited by 72 (4 self)
This paper examines the relevance of long-run purchasing power parity (PPP), which allows for measurement errors, during the recent floating exchange rate period. Previous empirical studies generally fail to find support for long-run PPP over this period. In this paper the cointegration property of exchange rates and prices is examined using a maximum likelihood procedure, and we find significant evidence favorable to long-run PPP. Further tests for symmetry and proportionality indicate that these two conditions are not generally consistent with the data. The results support the hypothesis of long-run PPP with measurement errors in prices.
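The paper itself uses a maximum likelihood (Johansen-type) procedure; as a simpler illustration of what "cointegration of exchange rates and prices" means, a residual-based two-step sketch: regress the log exchange rate on the log price ratio, then check whether the residual reverts (an AR(1) coefficient well below one). Under strict PPP with proportionality the slope is one.

```python
import numpy as np

def ppp_cointegration_sketch(log_s, log_p_ratio):
    """Two-step residual-based cointegration check for long-run PPP.
    Returns the cointegrating slope b and the AR(1) coefficient rho
    of the residual; rho well below 1 suggests cointegration.
    """
    s = np.asarray(log_s, dtype=float)
    p = np.asarray(log_p_ratio, dtype=float)
    X = np.column_stack([np.ones_like(p), p])
    (a, b), *_ = np.linalg.lstsq(X, s, rcond=None)
    u = s - (a + b * p)                       # equilibrium error
    rho = (u[:-1] @ u[1:]) / (u[:-1] @ u[:-1])
    return b, rho
```

Symmetry and proportionality restrictions, which the abstract reports are rejected, amount to testing whether the cointegrating vector ties the exchange rate to the price ratio with unit coefficients.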