Results 1–10 of 569
Bayesian Analysis of Stochastic Volatility Models
, 1994
Abstract

Cited by 371 (21 self)
The aim of this article is to develop new methods for inference and prediction in a simple class of stochastic volatility models in which the logarithm of conditional volatility follows an autoregressive (AR) time series model. Unlike the autoregressive conditional heteroscedasticity (ARCH) and generalized ARCH (GARCH) models [see Bollerslev, Chou, and Kroner (1992) for a survey of ARCH modeling], both the mean and log-volatility equations have separate error terms. The ease of evaluating the ARCH likelihood function and the ability of the ARCH specification to accommodate the time-varying volatility found in many economic time series have fostered an explosion in the use of ARCH models. On the other hand, the likelihood function for stochastic volatility models is difficult to evaluate, and hence these models have had limited empirical application.
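The class of models described above can be sketched by simulating a basic stochastic volatility specification in which log-volatility follows an AR(1) process with its own error term, separate from the return equation's error. The parameter values below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Minimal sketch of a simple stochastic volatility model:
#   y_t = eps_t * exp(h_t / 2),   h_t = mu + phi*(h_{t-1} - mu) + eta_t
# where h_t is log-volatility following an AR(1) process and eps_t, eta_t
# are independent error terms. All parameter values are illustrative.
rng = np.random.default_rng(0)
T, mu, phi, sigma_eta = 1000, -1.0, 0.95, 0.2

h = np.empty(T)
h[0] = mu
for t in range(1, T):
    h[t] = mu + phi * (h[t - 1] - mu) + sigma_eta * rng.standard_normal()

y = rng.standard_normal(T) * np.exp(h / 2)  # returns with time-varying volatility
```

Note the two separate shocks per period (one in the return equation, one in the log-volatility equation); this is what distinguishes the specification from ARCH/GARCH, where volatility is a deterministic function of past data, and it is why the likelihood cannot be evaluated by simple recursion.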
Emerging Equity Market Volatility
, 1997
Abstract

Cited by 157 (28 self)
Understanding volatility in emerging capital markets is important for determining the cost of capital and for evaluating direct investment and asset allocation decisions. We provide an approach that allows the relative importance of world and local information to change through time in both the expected returns and conditional variance processes. Our time-series and cross-sectional models analyze the reasons that volatility is different across emerging markets, particularly with respect to the timing of capital market reforms. We find that capital market liberalizations often increase the correlation between local market returns and the world market but do not drive up local market volatility.
Linear Regression Limit Theory for Nonstationary Panel Data
 Econometrica
, 1999
Abstract

Cited by 137 (13 self)
This paper develops a regression limit theory for nonstationary panel data with large numbers of cross-section (n) and time-series (T) observations. The limit theory allows for both sequential limits, wherein T → ∞ followed by n → ∞, and joint limits where T, n → ∞ simultaneously; and the relationship between these multidimensional limits is explored. The panel structures considered allow for no time series cointegration, heterogeneous cointegration, homogeneous cointegration, and near-homogeneous cointegration. The paper explores the existence of long-run average relations between integrated panel vectors when there is no individual time series cointegration and when there is heterogeneous cointegration. These relations are parameterized in terms of the matrix regression coefficient of the long-run average covariance matrix. In the case of homogeneous and near-homogeneous cointegrating panels, a panel fully modified regression estimator is developed and studied. The limit theory enables us to test hypotheses about the long-run average parameters both within and between subgroups of the full population.
On the Detection and Estimation of Long Memory in Stochastic Volatility
, 1995
Abstract

Cited by 125 (6 self)
Recent studies have suggested that stock markets' volatility has a type of long-range dependence that is not appropriately described by the usual Generalized Autoregressive Conditional Heteroskedastic (GARCH) and Exponential GARCH (EGARCH) models. In this paper, different models for describing this long-range dependence are examined and the properties of a Long-Memory Stochastic Volatility (LMSV) model, constructed by incorporating an Autoregressive Fractionally Integrated Moving Average (ARFIMA) process in a stochastic volatility scheme, are discussed. Strongly consistent estimators for the parameters of this LMSV model are obtained by maximizing the spectral likelihood. The distribution of the estimators is analyzed by means of a Monte Carlo study. The LMSV is applied to daily stock market returns providing an improved description of the volatility behavior. In order to assess the empirical relevance of this approach, tests for long-memory volatility are described and applied to an e...
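The long-memory ingredient of the LMSV model described above is fractional integration. A minimal sketch of that building block, assuming an illustrative memory parameter d and generating fractionally integrated noise by truncating the MA(∞) expansion of (1 − L)^(−d):

```python
import numpy as np

# Fractionally integrated noise: (1 - L)^d x_t = eps_t, with 0 < d < 0.5.
# The MA(inf) coefficients psi_j = Gamma(j + d) / (Gamma(j + 1) * Gamma(d))
# satisfy the recursion psi_j = psi_{j-1} * (j - 1 + d) / j, psi_0 = 1.
# d and the sample size are illustrative; in an LMSV scheme a process like
# x would drive the log-volatility equation.
d, n = 0.3, 500
j = np.arange(1, n)
psi = np.concatenate([[1.0], np.cumprod((j - 1 + d) / j)])  # MA coefficients

eps = np.random.default_rng(2).standard_normal(n)
x = np.convolve(eps, psi)[:n]  # long-memory noise with slowly decaying ACF
```

The coefficients psi_j decay hyperbolically rather than geometrically, which is what produces the slowly decaying autocorrelations that GARCH and EGARCH cannot capture.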
A Simple Panel Unit Root Test in the Presence of Cross Section Dependence
 Journal of Applied Econometrics
, 2006
Abstract

Cited by 111 (13 self)
A number of panel unit root tests that allow for cross section dependence have been proposed in the literature that use orthogonalization type procedures to asymptotically eliminate the cross dependence of the series before standard panel unit root tests are applied to the transformed series. In this paper we propose a simple alternative where the standard ADF regressions are augmented with the cross section averages of lagged levels and first differences of the individual series. New asymptotic results are obtained both for the individual cross-sectionally augmented ADF (CADF) statistics, and their simple averages. It is shown that the individual CADF statistics are asymptotically similar and do not depend on the factor loadings. The limit distribution of the average CADF statistic is shown to exist and its critical values are tabulated. Small sample properties of the proposed test are investigated by Monte Carlo experiments. The proposed test is applied to a panel of 17 OECD real exchange rate series as well as to log real earnings of households in the PSID data.
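The augmentation step described above can be sketched directly: for each series in the panel, run an ADF-type regression that also includes the cross-section average of lagged levels and of first differences. This is a bare-bones illustration with no lag augmentation; the function name and data are assumptions for the sketch.

```python
import numpy as np

def cadf_tstat(Y, i):
    """t-statistic on y_{i,t-1} in a cross-sectionally augmented ADF
    regression: dy_it on [1, y_{i,t-1}, ybar_{t-1}, dybar_t].
    Y is a T x N panel; no lagged-difference augmentation (sketch only)."""
    dY = np.diff(Y, axis=0)          # first differences, (T-1) x N
    ybar = Y.mean(axis=1)            # cross-section average of levels
    dybar = dY.mean(axis=1)          # cross-section average of differences
    y_lag = Y[:-1, i]                # own lagged level
    X = np.column_stack([np.ones_like(y_lag), y_lag, ybar[:-1], dybar])
    b, *_ = np.linalg.lstsq(X, dY[:, i], rcond=None)
    resid = dY[:, i] - X @ b
    s2 = resid @ resid / (len(resid) - X.shape[1])
    se = np.sqrt(s2 * np.linalg.inv(X.T @ X)[1, 1])
    return b[1] / se                 # CADF t-statistic for series i

# Illustrative panel of 17 independent random walks (all have unit roots).
rng = np.random.default_rng(1)
Y = np.cumsum(rng.standard_normal((200, 17)), axis=0)
t_i = cadf_tstat(Y, 0)
```

The panel test then averages the individual CADF statistics; under cross-section dependence driven by a common factor, the cross-section averages proxy for that factor, which is why the individual statistics end up free of the factor loadings.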
Predictable Risk and Returns in Emerging Markets
, 1995
Abstract

Cited by 103 (11 self)
This article has a number of goals. First, the average or unconditional risk of these equity returns is studied. While previous authors have documented low correlations of the emerging market returns with developed country returns, I test whether adding emerging market assets to the portfolio problem significantly shifts the investment opportunity set. I find that the addition of emerging market assets significantly enhances portfolio opportunities.
Regime Switches in Interest Rates
 Journal of Business and Economic Statistics
, 2002
Cited by 88 (8 self)
Inattentive consumers
 Journal of Monetary Economics
, 2006
Abstract

Cited by 85 (6 self)
This paper studies the consumption decisions of agents who face costs of acquiring, absorbing and processing information. These consumers rationally choose to only sporadically update their information and recompute their optimal consumption plans. In between updating dates, they remain inattentive. This behavior implies that news disperses slowly throughout the population, so events have a gradual and delayed effect on aggregate consumption. The model predicts that aggregate consumption adjusts slowly to shocks, and is able to explain the excess sensitivity and excess smoothness puzzles. In addition, individual consumption is sensitive to ordinary and unexpected past news, but it is not sensitive to extraordinary or predictable events. The model further predicts that some people rationally choose not to plan, live hand-to-mouth, and save less, while other people sporadically update their plans. The longer their plans, the more they save. Evidence using U.S. aggregate and microeconomic data generally supports these predictions.
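The slow-diffusion mechanism described above can be illustrated with a toy calculation: if only a fraction of consumers update their information each period, the share who have incorporated a one-time shock grows gradually, so the aggregate response is delayed. The updating rate and horizon below are illustrative assumptions, not the paper's calibration.

```python
import numpy as np

# Toy sketch of sluggish aggregate adjustment under sporadic updating:
# each period a fraction lam of consumers updates its information, so the
# share aware of a date-0 shock after t periods is 1 - (1 - lam)^t.
# lam and the horizon are illustrative.
lam, periods = 0.25, 12
t = np.arange(1, periods + 1)
aware = 1 - (1 - lam) ** t   # fraction who have incorporated the news
```

The aware share rises monotonically toward one but never jumps, which is the source of the gradual, delayed aggregate consumption response.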
Why is it so Difficult to Beat the Random Walk Forecast of Exchange Rates
 Journal of International Economics
, 2003
The bootstrap
 In Handbook of Econometrics
, 2001
Abstract

Cited by 75 (1 self)
The bootstrap is a method for estimating the distribution of an estimator or test statistic by resampling one’s data. It amounts to treating the data as if they were the population for the purpose of evaluating the distribution of interest. Under mild regularity conditions, the bootstrap yields an approximation to the distribution of an estimator or test statistic that is at least as accurate as the
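The resampling idea described above, treating the observed sample as the population and re-estimating the statistic on draws with replacement, can be sketched in a few lines. The statistic (the sample mean), the data, and the function name are illustrative assumptions.

```python
import numpy as np

def bootstrap_dist(data, stat, n_boot=2000, seed=0):
    """Approximate the sampling distribution of stat(data) by resampling
    the data with replacement, treating the sample as the population."""
    rng = np.random.default_rng(seed)
    n = len(data)
    return np.array([stat(data[rng.integers(0, n, n)]) for _ in range(n_boot)])

# Illustrative data and statistic: the mean of a skewed sample.
data = np.random.default_rng(42).exponential(scale=2.0, size=100)
boot_means = bootstrap_dist(data, np.mean)
ci = np.percentile(boot_means, [2.5, 97.5])  # percentile confidence interval
```

The empirical distribution of `boot_means` stands in for the unknown sampling distribution of the estimator, which is the sense in which the data are "treated as the population."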