Results 11-20 of 2,324
Time-Varying NAIRU and its Implications for Economic Policy
 NBER Working Paper
, 1996
Abstract

Cited by 177 (2 self)
This paper estimates the NAIRU (standing for the Non-Accelerating Inflation Rate of Unemployment) as a parameter that varies over time. The NAIRU is the unemployment rate that is consistent with a constant rate of inflation. Its value is determined in an econometric model in which the inflation rate depends on its own past values (“inertia”), demand shocks proxied by the difference between the actual unemployment rate and the estimated NAIRU, and a set of supply shock variables. The estimated NAIRU for the U.S. economy differs somewhat for alternative measures of the inflation rate. The NAIRU estimated for the GDP deflator varies over the past forty years within the narrow range of 5.7 to 6.4 percent; its estimated value for the most recent quarter (1996:Q1) is 5.7 percent. In that quarter a lower NAIRU of 5.3 percent is obtained for the chain-weighted PCE deflator. Recent research claiming that there is a three-percentage-point range of uncertainty about the NAIRU is rejected as inconsistent with the behavior of the American economy in the late 1980s and early 1990s.
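The Phillips-curve relationship the abstract describes can be sketched in miniature. The toy below (synthetic data, a constant rather than time-varying NAIRU, every number hypothetical) recovers the NAIRU as the unemployment rate at which inflation neither accelerates nor decelerates:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic quarterly data (hypothetical, for illustration only):
# a "true" NAIRU of 6.0%; inflation accelerates when unemployment is below it.
T = 200
u = 6.0 + rng.normal(0.0, 1.0, T)                 # unemployment rate (%)
dpi = -0.3 * (u - 6.0) + rng.normal(0, 0.2, T)    # change in inflation

# OLS of the inflation change on a constant and unemployment:
# dpi_t = a + b * u_t + e_t, so the implied (constant) NAIRU is -a/b.
X = np.column_stack([np.ones(T), u])
a, b = np.linalg.lstsq(X, dpi, rcond=None)[0]
nairu_hat = -a / b
print(round(nairu_hat, 2))   # close to 6.0 on this synthetic sample
```

The paper's actual estimator additionally lets the NAIRU drift over time (a state-space problem) and includes inflation inertia and supply-shock regressors; this sketch shows only the identifying idea.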
Post-'87 Crash Fears in the S&P 500 Futures Option Market
, 1998
"... Post-crash distributions inferred from S ..."
Hidden Markov processes
 IEEE Trans. Inform. Theory
, 2002
Abstract

Cited by 174 (3 self)
Abstract—An overview of statistical and information-theoretic aspects of hidden Markov processes (HMPs) is presented. An HMP is a discrete-time finite-state homogeneous Markov chain observed through a discrete-time memoryless invariant channel. In recent years, the work of Baum and Petrie on finite-state finite-alphabet HMPs was expanded to HMPs with finite as well as continuous state spaces and a general alphabet. In particular, statistical properties and ergodic theorems for relative entropy densities of HMPs were developed. Consistency and asymptotic normality of the maximum-likelihood (ML) parameter estimator were proved under some mild conditions. Similar results were established for switching autoregressive processes. These processes generalize HMPs. New algorithms were developed for estimating the state, parameter, and order of an HMP, for universal coding and classification of HMPs, and for universal decoding of hidden Markov channels. These and other related topics are reviewed in this paper. Index Terms—Baum–Petrie algorithm, entropy ergodic theorems, finite-state channels, hidden Markov models, identifiability, Kalman filter, maximum-likelihood (ML) estimation, order estimation, recursive parameter estimation, switching autoregressive processes, Ziv inequality.
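The basic object of this survey, a finite-state chain observed through a memoryless channel, has a likelihood computable by the classical forward recursion. A minimal sketch with a made-up 2-state, 2-symbol model:

```python
import numpy as np

# Hypothetical 2-state, 2-symbol HMP: transition matrix A, emission matrix B,
# initial distribution pi (all parameters invented for illustration).
A = np.array([[0.9, 0.1],
              [0.2, 0.8]])
B = np.array([[0.7, 0.3],
              [0.1, 0.9]])
pi = np.array([0.5, 0.5])

def forward_likelihood(obs):
    """Probability of the observed symbol sequence under the HMP,
    computed by the forward recursion (implicit sum over hidden paths)."""
    alpha = pi * B[:, obs[0]]
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
    return alpha.sum()

p = forward_likelihood([0, 0, 1, 1])
```

For long sequences one would rescale `alpha` at each step to avoid underflow; the scaling factors then sum (in logs) to the log-likelihood used in ML estimation.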
Modelling gene expression data using dynamic Bayesian networks
, 1999
Abstract

Cited by 158 (1 self)
Recently, there has been much interest in reverse engineering genetic networks from time series data. In this paper, we show that most of the proposed discrete time models — including the boolean network model [Kau93, SS96], the linear model of D’haeseleer et al. [DWFS99], and the nonlinear model of Weaver et al. [WWS99] — are all special cases of a general class of models called Dynamic Bayesian Networks (DBNs). The advantages of DBNs include the ability to model stochasticity, to incorporate prior knowledge, and to handle hidden variables and missing data in a principled way. This paper provides a review of techniques for learning DBNs. Keywords: Genetic networks, boolean networks, Bayesian networks, neural networks, reverse engineering, machine learning.
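The claimed reduction, a boolean network as a degenerate DBN, can be illustrated directly: each deterministic update rule is a conditional distribution putting all its mass on one outcome, and adding flip noise turns it into a proper stochastic DBN. A toy 2-gene sketch with invented rules:

```python
import numpy as np

# A 2-gene boolean network (rules made up): g1' = NOT g2, g2' = g1 AND g2.
def boolean_step(state):
    g1, g2 = state
    return (1 - g2, g1 & g2)

def noisy_step(state, eps=0.1, rng=None):
    """Stochastic DBN transition P(X_{t+1} | X_t): follow the boolean rule
    for each gene, then flip each output bit independently with prob eps.
    With eps = 0 this collapses back to the deterministic boolean network."""
    rng = rng or np.random.default_rng()
    nxt = np.array(boolean_step(state))
    flips = rng.random(2) < eps
    return tuple(np.abs(nxt - flips).astype(int))
```

The linear and nonlinear models cited in the abstract fit the same template with continuous-valued nodes and Gaussian (or link-function) conditionals instead of bit-flip noise.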
Variational learning for switching statespace models
 Neural Computation
, 1998
Abstract

Cited by 145 (6 self)
We introduce a new statistical model for time series which iteratively segments data into regimes with approximately linear dynamics and learns the parameters of each of these linear regimes. This model combines and generalizes two of the most widely used stochastic time series models, hidden Markov models and linear dynamical systems, and is closely related to models that are widely used in the control and econometrics literatures. It can also be derived by extending the mixture of experts neural network (Jacobs et al., 1991) to its fully dynamical version, in which both expert and gating networks are recurrent. Inferring the posterior probabilities of the hidden states of this model is computationally intractable, and therefore the exact Expectation Maximization (EM) algorithm cannot be applied. However, we present a variational approximation that maximizes a lower bound on the log likelihood and makes use of both the forward-backward recursions for hidden Markov models and the Kalman filter recursions for linear dynamical systems. We tested the algorithm both on artificial data sets and on a natural data set of respiration force from a patient with sleep apnea. The results suggest that variational approximations are a viable method for inference and learning in switching state-space models.
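The generative model being combined here, a Markov switch selecting among linear-Gaussian regimes, can be written down in a few lines. A toy scalar sampler (all parameters illustrative; inference, the hard part the paper's variational approximation addresses, is not shown):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy switching state-space model: a discrete switch s_t follows a Markov
# chain; given s_t, the continuous state x_t follows one of two
# linear-Gaussian regimes, and y_t is a noisy observation of x_t.
P = np.array([[0.95, 0.05],    # switch transition probabilities
              [0.05, 0.95]])
A = [0.99, 0.5]                 # per-regime state dynamics
Q = [0.01, 0.5]                 # per-regime process noise variance
R = 0.1                         # observation noise variance

def sample(T):
    s, x = 0, 0.0
    switches, obs = [], []
    for _ in range(T):
        s = rng.choice(2, p=P[s])
        x = A[s] * x + rng.normal(0, np.sqrt(Q[s]))
        obs.append(x + rng.normal(0, np.sqrt(R)))
        switches.append(s)
    return np.array(switches), np.array(obs)

switches, obs = sample(500)
```

Exact posterior inference must track all 2^T switch paths, which is why the paper resorts to a variational lower bound combining forward-backward and Kalman recursions.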
Range-based estimation of stochastic volatility models
, 2002
Abstract

Cited by 125 (11 self)
We propose using the price range in the estimation of stochastic volatility models. We show theoretically, numerically, and empirically that range-based volatility proxies are not only highly efficient, but also approximately Gaussian and robust to microstructure noise. Hence range-based Gaussian quasi-maximum likelihood estimation produces highly efficient estimates of stochastic volatility models and extractions of latent volatility. We use our method to examine the dynamics of daily exchange rate volatility and find the evidence points strongly toward two-factor models with one highly persistent factor and one quickly mean-reverting factor. VOLATILITY IS A CENTRAL CONCEPT in finance, whether in asset pricing, portfolio choice, or risk management. Not long ago, theoretical models routinely assumed constant volatility (e.g., Merton (1969), Black and Scholes (1973)). Today, however, we widely acknowledge that volatility is both time varying and predictable (e.g., Andersen and Bollerslev (1997)), and stochastic volatility models are commonplace. Discrete- and continuous-time stochastic volatility models are extensively used in theoretical finance, empirical finance, and financial econometrics, both in academe and industry (e.g., Hull and ...
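One classical range-based proxy of the kind discussed here is Parkinson's (1980) estimator, which converts the daily high-low log range into a variance estimate. It is a simpler cousin of the paper's proxies, shown only to illustrate why the range is so informative:

```python
import numpy as np

def parkinson_variance(high, low):
    """Classical range-based daily variance proxy (Parkinson, 1980):
    for driftless Brownian motion, E[(ln H - ln L)^2] = 4 ln(2) * sigma^2,
    so (ln(H/L))^2 / (4 ln 2) is an unbiased estimate of sigma^2."""
    log_range = np.log(np.asarray(high) / np.asarray(low))
    return log_range ** 2 / (4.0 * np.log(2.0))

# Sanity check on simulated driftless geometric Brownian motion:
rng = np.random.default_rng(0)
sigma, n_days, n_steps = 0.01, 2000, 390
paths = np.cumsum(
    rng.normal(0, sigma / np.sqrt(n_steps), (n_days, n_steps)), axis=1)
hi = np.exp(np.maximum(paths.max(axis=1), 0.0))   # include the open price
lo = np.exp(np.minimum(paths.min(axis=1), 0.0))
est = parkinson_variance(hi, lo)
print(est.mean(), sigma ** 2)   # close on average (small discretization bias)
```

The intuition matches the abstract: the intraday range summarizes the whole path, not just two closing prices, which is where the efficiency gain over squared returns comes from.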
A Simple Panel Unit Root Test in the Presence of Cross Section Dependence
 JOURNAL OF APPLIED ECONOMETRICS
, 2006
Abstract

Cited by 121 (13 self)
A number of panel unit root tests that allow for cross section dependence have been proposed in the literature that use orthogonalization type procedures to asymptotically eliminate the cross dependence of the series before standard panel unit root tests are applied to the transformed series. In this paper we propose a simple alternative where the standard ADF regressions are augmented with the cross section averages of lagged levels and first-differences of the individual series. New asymptotic results are obtained both for the individual cross-sectionally augmented ADF (CADF) statistics, and their simple averages. It is shown that the individual CADF statistics are asymptotically similar and do not depend on the factor loadings. The limit distribution of the average CADF statistic is shown to exist and its critical values are tabulated. Small sample properties of the proposed test are investigated by Monte Carlo experiments. The proposed test is applied to a panel of 17 OECD real exchange rate series as well as to log real earnings of households in the PSID data.
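The cross-sectionally augmented ADF regression is easy to write down: for each unit, the standard ADF regression gains the lagged cross-section average level and the cross-section average first difference as extra regressors. A sketch on a made-up one-factor panel (critical values are not computed here; those come from the paper's tables):

```python
import numpy as np

rng = np.random.default_rng(0)

def cadf_tstat(y, ybar):
    """CADF regression for one panel unit: regress dy_t on
    [1, y_{t-1}, ybar_{t-1}, dybar_t] and return the t-statistic on
    y_{t-1}. The cross-section averages proxy the unobserved common factor."""
    dy, dybar = np.diff(y), np.diff(ybar)
    X = np.column_stack([np.ones(len(dy)), y[:-1], ybar[:-1], dybar])
    beta = np.linalg.lstsq(X, dy, rcond=None)[0]
    resid = dy - X @ beta
    s2 = resid @ resid / (len(dy) - X.shape[1])
    cov = s2 * np.linalg.inv(X.T @ X)
    return beta[1] / np.sqrt(cov[1, 1])

# Toy panel with one common stochastic trend (all numbers illustrative):
N, T = 10, 100
f = np.cumsum(rng.normal(size=T))
Y = np.array([0.5 * f + np.cumsum(rng.normal(size=T)) for _ in range(N)])
ybar = Y.mean(axis=0)
avg_cadf = np.mean([cadf_tstat(Y[i], ybar) for i in range(N)])
```

The panel statistic is the simple average of the unit-level t-statistics, compared against the tabulated (non-standard) critical values.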
Temporal Texture Modeling
 In IEEE International Conference on Image Processing
, 1996
Abstract

Cited by 118 (1 self)
Temporal textures are textures with motion. Examples include wavy water, rising steam and fire. We model image sequences of temporal textures using the spatiotemporal autoregressive model (STAR). This model expresses each pixel as a linear combination of surrounding pixels lagged both in space and in time. The model provides a base for both recognition and synthesis. We show how the least squares method can accurately estimate model parameters for large, causal neighborhoods with more than 1000 parameters. Synthesis results show that the model can adequately capture the spatial and temporal characteristics of many temporal textures. A 95% recognition rate is achieved for a 135-element database with 15 texture classes.
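A tiny STAR model can be fit exactly as the abstract describes, by least squares on spatio-temporally lagged neighbors. A 1-D toy version with three coefficients (the paper's causal neighborhoods are far larger):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D "temporal texture": each pixel is a linear combination of its
# spatio-temporal neighbors at the previous frame plus noise.
true = np.array([0.5, 0.2, 0.2])   # weights: same pixel, left, right
T, W = 400, 50
x = np.zeros((T, W))
x[0] = rng.normal(size=W)
for t in range(1, T):
    prev = x[t - 1]
    x[t] = (true[0] * prev
            + true[1] * np.roll(prev, 1)       # circular boundary
            + true[2] * np.roll(prev, -1)
            + 0.1 * rng.normal(size=W))

# Least-squares estimation of the STAR coefficients from the sequence:
prev = x[:-1]
X = np.stack([prev, np.roll(prev, 1, axis=1), np.roll(prev, -1, axis=1)],
             axis=-1).reshape(-1, 3)
y = x[1:].reshape(-1)
est = np.linalg.lstsq(X, y, rcond=None)[0]
print(np.round(est, 2))   # should be close to [0.5, 0.2, 0.2]
```

Because every pixel in every frame contributes one regression row, even short sequences give thousands of equations, which is why large neighborhoods remain estimable.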
From Tweets to Polls: Linking Text Sentiment to Public Opinion Time Series
, 2010
Abstract

Cited by 110 (6 self)
We connect measures of public opinion measured from polls with sentiment measured from text. We analyze several surveys on consumer confidence and political opinion over the 2008 to 2009 period, and find they correlate to sentiment word frequencies in contemporaneous Twitter messages. While our results vary across datasets, in several cases the correlations are as high as 80%, and capture important large-scale trends. The results highlight the potential of text streams as a substitute and supplement for traditional polling.
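The core measurement can be sketched simply: score each day by the ratio of positive to negative sentiment-word counts in that day's messages, then correlate the score series with the poll series. A toy version with invented word lists and data:

```python
import numpy as np

# Made-up sentiment lexicons (real work uses lists like OpinionFinder's).
POS, NEG = {"good", "great", "happy"}, {"bad", "sad", "awful"}

def day_score(messages):
    """Smoothed positive/negative word-count ratio for one day's messages."""
    words = [w for m in messages for w in m.lower().split()]
    pos = sum(w in POS for w in words)
    neg = sum(w in NEG for w in words)
    return (pos + 1) / (neg + 1)

# Hypothetical three-day corpus and poll index, for illustration only:
days = [["good great day", "happy news"],
        ["bad results", "awful sad day"],
        ["good enough", "not bad at all"]]
poll = np.array([60.0, 40.0, 52.0])
text = np.array([day_score(d) for d in days])
r = np.corrcoef(text, poll)[0, 1]   # strongly positive on this toy data
```

In practice the text series is smoothed over a moving window before correlating, since daily ratios are noisy; negation ("not bad") is a known failure mode of pure word counting, as the third day here shows.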
Hedge Funds and the Technology Bubble
 THE JOURNAL OF FINANCE • VOL. LIX, NO. 5 • OCTOBER 2004
, 2004
Abstract

Cited by 106 (6 self)
This paper documents that hedge funds did not exert a correcting force on stock prices during the technology bubble. Instead, they were heavily invested in technology stocks. This does not seem to be the result of unawareness of the bubble: Hedge funds captured the upturn, but, by reducing their positions in stocks that were about to decline, avoided much of the downturn. Our findings question the efficient markets notion that rational speculators always stabilize prices. They are consistent with models in which rational investors may prefer to ride bubbles because of predictable investor sentiment and limits to arbitrage.