Results 1–10 of 53
Testing for Indeterminacy: An Application to U.S. Monetary Policy
, 2003
"... This paper considers a prototypical monetary business cycle model for the U.S. economy, in which the equilibrium is undetermined if monetary policy is `passive'. In previous multivariate studies it has been common practice to restrict parameter estimates to values for which the equilibrium i ..."
Abstract

Cited by 136 (5 self)
This paper considers a prototypical monetary business cycle model for the U.S. economy, in which the equilibrium is undetermined if monetary policy is `passive'. In previous multivariate studies it has been common practice to restrict parameter estimates to values for which the equilibrium is unique. We show how the likelihood-based estimation of dynamic stochastic general equilibrium models can be extended to allow for indeterminacies and sunspot fluctuations. We construct …
Indirect inference and calibration of dynamic stochastic general equilibrium models
 Journal of Econometrics
, 2007
"... We advocate in this paper the use of a Sequential Partial Indirect Inference (SPII) approach, in order to account for calibration practice where dynamic stochastic general equilibrium models (DGSE) are studied only through their ability to reproduce some wellchosen moments. We stress that, despite ..."
Abstract

Cited by 29 (1 self)
We advocate in this paper the use of a Sequential Partial Indirect Inference (SPII) approach, in order to account for the calibration practice in which dynamic stochastic general equilibrium (DSGE) models are studied only through their ability to reproduce some well-chosen moments. We stress that, despite its lack of statistical formalization, the controversial calibration methodology addresses a genuine issue: the consequences of misspecification in highly nonlinear and dynamic structural macro-models. Such likely misspecification is even more detrimental than for direct inference, since the misspecified model is used for building simulated paths. The only way to obtain robust estimators, and also to assess the model despite misspecification, is to examine the structural model through a convenient and parsimonious instrumental model which does not capture what goes wrong in the simulated paths. We argue that a well-driven SPII strategy may be seen as a rigorous calibrationist approach that captures both the advantages of calibration (accounting for structural "astatistical" ideas) and of the inferential approach (precise appraisal of loss functions and conditions of validity). This methodology should be useful for the empirical assessment of structural models such as those stemming from Real Business Cycle theory or the asset pricing literature.
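The indirect-inference idea that SPII builds on can be illustrated with a toy sketch: simulate the structural model at trial parameter values and pick the value whose simulated paths best reproduce a statistic of an instrumental model fitted to the data. The AR(1) structural model and the lag-1 autocorrelation used as the instrumental statistic below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def simulate_ar1(rho, n, rng):
    """Toy structural model: an AR(1) path y_t = rho*y_{t-1} + eps_t."""
    eps = rng.standard_normal(n)
    y = np.zeros(n)
    for t in range(1, n):
        y[t] = rho * y[t - 1] + eps[t]
    return y

def auxiliary_stat(y):
    """Instrumental model: the lag-1 autocorrelation of the series."""
    return np.corrcoef(y[:-1], y[1:])[0, 1]

def indirect_inference(y_obs, rho_grid, n_sim=20, seed=0):
    """Grid search: choose the structural parameter whose simulated paths
    best reproduce the instrumental statistic computed on the data."""
    target = auxiliary_stat(y_obs)
    best, best_dist = None, np.inf
    for rho in rho_grid:
        rng = np.random.default_rng(seed)  # common random numbers across rho
        sims = [auxiliary_stat(simulate_ar1(rho, len(y_obs), rng))
                for _ in range(n_sim)]
        dist = (np.mean(sims) - target) ** 2
        if dist < best_dist:
            best, best_dist = rho, dist
    return best

rng = np.random.default_rng(42)
y = simulate_ar1(0.8, 2000, rng)          # "observed" data, true rho = 0.8
rho_hat = indirect_inference(y, np.linspace(0.0, 0.95, 20))
```

Because the instrumental statistic is deliberately parsimonious, the match only disciplines the features it captures, which is exactly the partial-inference point the abstract stresses.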
DSGE Models in a Data-Rich Environment
 NBER Working Paper 12772, National Bureau of Economic Research, Inc
, 2005
"... Standard practice for the estimation of dynamic stochastic general equilibrium (DSGE) models maintains the assumption that economic variables are properly measured by a single indicator, and that all relevant information for the estimation is summarized by a small number of data series. However, rec ..."
Abstract

Cited by 27 (0 self)
Standard practice for the estimation of dynamic stochastic general equilibrium (DSGE) models maintains the assumption that economic variables are properly measured by a single indicator, and that all relevant information for the estimation is summarized by a small number of data series. However, recent empirical research on factor models has shown that information contained in large data sets is relevant for the evolution of important macroeconomic series. This suggests that conventional model estimates and inference based on estimated DSGE models are likely to be distorted. In this paper, we propose an empirical framework for the estimation of DSGE models that exploits the relevant information from a data-rich environment. This framework provides an interpretation of all information contained in a large data set, and in particular of the latent factors, through the lens of a DSGE model. The estimation involves Bayesian Markov chain Monte Carlo (MCMC) methods extended so that the estimates can, in some cases, inherit the properties of classical maximum likelihood estimation. We apply this estimation approach to a state-of-the-art DSGE monetary model. Treating theoretical concepts of the model, such as output, inflation and employment, as partially observed, we show that the information from a large set of macroeconomic indicators is important for accurate estimation of the model. It also allows us to improve the forecasts of important economic variables.
How Much Inflation is Necessary to Grease the Wheels?
, 2008
"... This paper studies Tobin's proposition that inflation "\greases" the wheels of the labor market. The analysis is carried out using a simple dynamic stochastic general equilibrium model with asymmetric wage adjustment costs. Optimal inflation is determined by a benevolent government th ..."
Abstract

Cited by 17 (0 self)
This paper studies Tobin's proposition that inflation "greases" the wheels of the labor market. The analysis is carried out using a simple dynamic stochastic general equilibrium model with asymmetric wage adjustment costs. Optimal inflation is determined by a benevolent government that maximizes the households' welfare. The Simulated Method of Moments is used to estimate the nonlinear model based on its second-order approximation. Econometric results indicate that nominal wages are downwardly rigid and that the optimal level of grease inflation for the U.S. economy is about 1.2 percent per year, with a 95% confidence interval ranging from 0.2 to 1.6 percent.
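The Simulated Method of Moments mechanics can be sketched on a much simpler censoring model than the paper's: desired wage changes are latent normals and downward rigidity censors cuts at zero, and we match the observed mean change and the share of frozen wages. The model, moments, and parameter values below are expository assumptions, not the paper's specification.

```python
import numpy as np

def simulate_wage_changes(mu, n, rng):
    """Desired changes are N(mu, 1); downward rigidity censors cuts at zero."""
    desired = mu + rng.standard_normal(n)
    return np.maximum(desired, 0.0)

def moments(dw):
    """Two moments: mean observed change and share of frozen (zero) wages."""
    return np.array([dw.mean(), (dw == 0.0).mean()])

def smm(dw_obs, mu_grid, n_sim=10, seed=1):
    """SMM on a grid: minimize the squared distance between observed
    moments and the average of the simulated moments."""
    m_obs = moments(dw_obs)
    best, best_loss = None, np.inf
    for mu in mu_grid:
        rng = np.random.default_rng(seed)  # common random numbers across mu
        m_sim = np.mean([moments(simulate_wage_changes(mu, len(dw_obs), rng))
                         for _ in range(n_sim)], axis=0)
        loss = np.sum((m_sim - m_obs) ** 2)
        if loss < best_loss:
            best, best_loss = mu, loss
    return best

rng = np.random.default_rng(7)
dw = simulate_wage_changes(0.5, 5000, rng)   # "data", true mu = 0.5
mu_hat = smm(dw, np.linspace(-1.0, 2.0, 31))
```

The share-at-zero moment is what carries the information about rigidity here, loosely analogous to how asymmetric adjustment costs are pinned down in the paper.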
Estimation of DSGE Models When the Data Are Persistent
 NBER Working Papers 15187, National Bureau of Economic Research, Inc
, 2009
"... Abstract An active area of research in macroeconomics is to take DSGE models to the data. These models are often solved and estimated under specific assumptions about how the exogenous variables grow over time. In this paper, we first show that if the trends assumed for the model are incompatible w ..."
Abstract

Cited by 16 (0 self)
An active area of research in macroeconomics is taking DSGE models to the data. These models are often solved and estimated under specific assumptions about how the exogenous variables grow over time. In this paper, we first show that if the trends assumed for the model are incompatible with the observed data, or if the detrended data used in estimation are inconsistent with the stationarity concepts of the model, the estimates, including those of parameters governing transmission mechanisms, can be severely biased even in large samples. We then consider four estimators that are robust to whether shocks in the model are assumed to be permanent or transitory and do not require the researcher to take a stand on the dynamic properties of the data. Simulations show that when the shocks are not persistent, the proposed estimators are as precise as estimators that correctly impose the stationarity assumption. But when the shocks are highly persistent yet stationary, the proposed estimators are much more precise. These properties hold even when there are multiple persistent shocks.
Devaluations, output and the balance sheet effect: a structural econometric analysis
, 2006
"... ..."
Testing for Weak Identification in Possibly Nonlinear Models
, 2010
"... In this paper we propose a chisquare test for identification. Our proposed test statistic is based on the distance between two shrinkage extremum estimators. The two estimators converge in probability to the same limit when identification is strong, and their asymptotic distributions are different ..."
Abstract

Cited by 6 (0 self)
In this paper we propose a chi-square test for identification. Our proposed test statistic is based on the distance between two shrinkage extremum estimators. The two estimators converge in probability to the same limit when identification is strong, and their asymptotic distributions differ when identification is weak. The proposed test is consistent not only against the alternative hypothesis of no identification but also against the alternative of weak identification, which is confirmed by our Monte Carlo results. We apply the proposed technique to test whether the structural parameters of a representative Taylor-rule monetary policy reaction function are identified.
Dynamic Identification of DSGE Models
 Econometrica
, 2011
"... Abstract This paper studies structural identification of parameters of a DSGE model from the first and second moments of the data. Classical results for dynamic simultaneous equations do not apply because the state space solution of the model does not constitute a standard reduced form. The rank of ..."
Abstract

Cited by 6 (0 self)
This paper studies structural identification of the parameters of a DSGE model from the first and second moments of the data. Classical results for dynamic simultaneous equations do not apply because the state-space solution of the model does not constitute a standard reduced form. The rank of the Jacobian matrix of derivatives with respect to the parameters is necessary but not sufficient for identification. We use restrictions implied by observational equivalence to obtain two sets of rank and order conditions: one for stochastically singular models and another for nonsingular models. Measurement errors as well as mean, long-run, and a priori restrictions can be accommodated. An example is considered to illustrate the results.
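The necessary rank condition mentioned in this abstract can be checked numerically for any moment map. The toy map below, in which two parameters enter only through their product and so cannot be separately identified, is a hypothetical illustration rather than anything from the paper.

```python
import numpy as np

def model_moments(theta):
    """Hypothetical moment map: parameters a and b enter only through
    their product a*b, so they are not separately identified."""
    a, b = theta
    return np.array([a * b, (a * b) ** 2, np.exp(a * b)])

def jacobian_rank(f, theta, eps=1e-5):
    """Central-difference Jacobian of the moment map and its numerical rank.
    Full column rank is necessary (not sufficient) for local identification."""
    theta = np.asarray(theta, dtype=float)
    m = len(f(theta))
    J = np.empty((m, len(theta)))
    for j in range(len(theta)):
        step = np.zeros_like(theta)
        step[j] = eps
        J[:, j] = (f(theta + step) - f(theta - step)) / (2 * eps)
    return np.linalg.matrix_rank(J, tol=1e-6)

rank = jacobian_rank(model_moments, [1.5, 2.0])
# rank 1 < 2 parameters: (a, b) fail the rank condition at this point
```

Note that passing this check does not establish identification, which is exactly why the paper develops conditions beyond the Jacobian rank.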
Indirect Likelihood Inference
, 2011
"... ABSTRACT. Given a sample from a fully specified parametric model, let Zn be a given finitedimensional statistic for example, an initial estimator or a set of sample moments. We propose to (re)estimate the parameters of the model by maximizing the likelihood of Zn. We call this the maximum indirec ..."
Abstract

Cited by 5 (2 self)
Given a sample from a fully specified parametric model, let Zn be a given finite-dimensional statistic, for example an initial estimator or a set of sample moments. We propose to (re)estimate the parameters of the model by maximizing the likelihood of Zn. We call this the maximum indirect likelihood (MIL) estimator. We also propose a computationally tractable Bayesian version of the estimator, which we refer to as a Bayesian indirect likelihood (BIL) estimator. In most cases, the density of the statistic will be of unknown form, and we develop simulated versions of the MIL and BIL estimators. We show that the indirect likelihood estimators are consistent and asymptotically normally distributed, with the same asymptotic variance as that of the corresponding efficient two-step GMM estimator based on the same statistic. However, our likelihood-based estimators, by taking into account the full finite-sample distribution of the statistic, are higher-order efficient relative to GMM-type estimators. Furthermore, in many cases they enjoy a bias-reduction property similar to that of the indirect inference estimator. Monte Carlo results for a number of applications, including dynamic and nonlinear panel data models, a structural auction model and two DSGE models, show that the proposed estimators indeed have attractive finite-sample properties.
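A minimal sketch of the simulated MIL idea: simulate draws of the statistic Zn at each trial parameter, estimate its density at the observed value with a kernel, and maximize that estimated likelihood. The stand-in model (a Gaussian location parameter) and the choice of the sample mean as Zn are illustrative assumptions, not the paper's applications.

```python
import numpy as np

def statistic(x):
    """The finite-dimensional statistic Z_n: here the sample mean."""
    return x.mean()

def simulated_mil(z_obs, n, theta_grid, n_sim=500, bandwidth=0.05, seed=0):
    """Simulated maximum indirect likelihood on a grid: at each trial theta,
    simulate draws of Z_n, kernel-estimate its density at z_obs, and keep
    the theta with the highest estimated likelihood."""
    best, best_dens = None, -np.inf
    for theta in theta_grid:
        rng = np.random.default_rng(seed)  # common random numbers across theta
        # n_sim simulated samples of size n from N(theta, 1), reduced to Z_n
        zs = (theta + rng.standard_normal((n_sim, n))).mean(axis=1)
        # unnormalized Gaussian-kernel density estimate at the observed Z_n
        dens = np.mean(np.exp(-0.5 * ((zs - z_obs) / bandwidth) ** 2))
        if dens > best_dens:
            best, best_dens = theta, dens
    return best

rng = np.random.default_rng(3)
x = 1.0 + rng.standard_normal(400)        # data from the model, theta = 1
theta_hat = simulated_mil(statistic(x), 400, np.linspace(0.0, 2.0, 41))
```

In this toy case the density of Zn is known in closed form, so the simulation step is redundant; the point of the sketch is only the mechanics used when that density is of unknown form.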
Estimation of Dynamic Latent Variable Models Using Simulated Nonparametric Moments
, 2009
"... ABSTRACT. Given a model that can be simulated, conditional moments at a trial parameter value can be calculated with high accuracy by applying kernel smoothing methods to a long simulation. With such conditional moments in hand, standard method of moments techniques can be used to estimate the param ..."
Abstract

Cited by 4 (4 self)
Given a model that can be simulated, conditional moments at a trial parameter value can be calculated with high accuracy by applying kernel smoothing methods to a long simulation. With such conditional moments in hand, standard method of moments techniques can be used to estimate the parameter. Because conditional moments are calculated using kernel smoothing rather than simple averaging, it is not necessary that the model be simulable subject to the conditioning information that is used to define the moment conditions. For this reason, the proposed estimator is applicable to general dynamic latent variable models. The estimator is consistent and has the same asymptotic distribution as that of the infeasible GMM estimator based on the same moment conditions. Monte Carlo results show how the estimator may be applied to a range of dynamic latent variable (DLV) models, and that it performs well in comparison to several other estimators that have been proposed for DLV models. An application to weekly spot exchange rate data further illustrates use of the estimator.
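The core mechanic, kernel-smoothing a long simulation to recover conditional moments and then matching them to the data, can be sketched on a model whose conditional mean we pretend is unknown. The AR(1) stand-in, the Nadaraya-Watson smoother, and the evaluation grid below are all illustrative assumptions, not the paper's DLV applications.

```python
import numpy as np

def simulate(rho, n, rng):
    """A simulable dynamic model; treat its conditional mean as unknown."""
    y = np.zeros(n)
    for t in range(1, n):
        y[t] = rho * y[t - 1] + rng.standard_normal()
    return y

def kernel_cond_mean(x, y, grid, h=0.3):
    """Nadaraya-Watson estimate of E[y | x = v] at each grid point v."""
    out = np.empty(len(grid))
    for i, v in enumerate(grid):
        w = np.exp(-0.5 * ((x - v) / h) ** 2)
        out[i] = np.sum(w * y) / np.sum(w)
    return out

def snm_estimate(y_obs, rho_grid, grid=np.linspace(-1.5, 1.5, 7),
                 n_long=50_000, seed=0):
    """Simulated nonparametric moments: match the kernel-smoothed
    conditional mean from a long simulation to the one in the data."""
    m_data = kernel_cond_mean(y_obs[:-1], y_obs[1:], grid)
    best, best_loss = None, np.inf
    for rho in rho_grid:
        rng = np.random.default_rng(seed)  # common random numbers across rho
        ys = simulate(rho, n_long, rng)
        m_sim = kernel_cond_mean(ys[:-1], ys[1:], grid)
        loss = np.sum((m_sim - m_data) ** 2)
        if loss < best_loss:
            best, best_loss = rho, loss
    return best

rng = np.random.default_rng(11)
y = simulate(0.6, 3000, rng)              # "data", true rho = 0.6
rho_hat = snm_estimate(y, np.linspace(0.1, 0.9, 17))
```

In a genuine latent-variable application the conditioning variable would be observable while the state is not, which is where smoothing the simulation, rather than simulating conditional on the data, becomes essential.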