Results 11 - 20 of 212
Exchange Rate Returns Standardized by Realized Volatility are (Nearly) Gaussian
, 1999
"... Introduction and Basic Ideas The prescriptions of modern financial risk management hinge critically on the associated characterization of the distribution of future returns (cf., Diebold, Gunther and Tay, 1998, and Diebold, Hahn and Tay, 1999). Because volatility persistence renders high-frequency ..."
Abstract - Cited by 71 (11 self)
Introduction and Basic Ideas The prescriptions of modern financial risk management hinge critically on the associated characterization of the distribution of future returns (cf. Diebold, Gunther and Tay, 1998, and Diebold, Hahn and Tay, 1999). Because volatility persistence renders high-frequency returns temporally dependent (e.g., Bollerslev, Chou and Kroner, 1992), it is the conditional return distribution, and not the unconditional distribution, that is of relevance for risk management. This is especially true in high-frequency situations, such as monitoring and managing the risk associated with the day-to-day operations of a trading desk, where volatility clustering is omnipresent. Exchange rate returns are well known to be unconditionally symmetric but highly leptokurtic. Standardized daily or weekly returns from ARCH and related stochastic volatility models also appear symmetric but leptokurtic; that is, the distributions are not only unconditionally, but also conditionally, leptokurtic.
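The computation behind the title's claim is simple to sketch. Below is a minimal illustration, on simulated data (not the paper's FX series), of standardizing daily returns by realized volatility computed from squared intraday returns; the interval count and volatility dynamics are assumptions made for the simulation.

```python
# Sketch of the standardization idea: daily returns divided by realized
# volatility (square root of summed squared intraday returns) come out
# close to Gaussian even when raw returns are leptokurtic.
import numpy as np

rng = np.random.default_rng(0)
n_days, m = 1000, 48                      # 48 intraday intervals/day (assumed)

# Simulate a persistent daily volatility process and intraday returns.
log_sigma = np.zeros(n_days)
for t in range(1, n_days):
    log_sigma[t] = 0.98 * log_sigma[t - 1] + 0.1 * rng.standard_normal()
sigma = np.exp(log_sigma)
intraday = sigma[:, None] / np.sqrt(m) * rng.standard_normal((n_days, m))

daily_ret = intraday.sum(axis=1)
realized_vol = np.sqrt((intraday ** 2).sum(axis=1))
std_ret = daily_ret / realized_vol

def kurtosis(x):
    z = (x - x.mean()) / x.std()
    return (z ** 4).mean()

print("kurtosis, raw:         ", round(kurtosis(daily_ret), 2))  # well above 3
print("kurtosis, standardized:", round(kurtosis(std_ret), 2))    # near 3
```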
The Stochastic Conditional Duration Model: A Latent Factor Model For The Analysis Of Financial Durations
- Journal of Econometrics
, 1999
"... A new model for the analysis of durations, the stochastic conditional duration (SCD) model, is introduced. This model is based of the assumption that the durations are generated by a latent stochastic factor that follows a first order autoregressive process. The latent factor is pertubed multiplicat ..."
Abstract - Cited by 70 (9 self)
A new model for the analysis of durations, the stochastic conditional duration (SCD) model, is introduced. This model is based on the assumption that the durations are generated by a latent stochastic factor that follows a first-order autoregressive process. The latent factor is perturbed multiplicatively by an innovation distributed as a Weibull or gamma variable. The model can capture a wide range of shapes of hazard functions. The estimation of the parameters is performed by quasi-maximum likelihood, after transforming the original nonlinear model into a state-space representation and using the Kalman filter. The model is applied to stock market price durations, looking at the relation between price durations, volume, spread and trading intensity.
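A minimal simulation sketch of the SCD structure just described: durations are a latent AR(1) log-scale factor perturbed multiplicatively by a Weibull innovation. Parameter values here are illustrative, not estimates from the paper, and the Kalman-filter QML step is only indicated.

```python
# SCD model sketch: x_i = psi_i * eps_i, log psi_i a latent AR(1),
# eps_i a multiplicative Weibull innovation.
import numpy as np

rng = np.random.default_rng(1)
n, omega, beta, sigma_u, k = 5000, 0.0, 0.9, 0.2, 1.2   # k: Weibull shape

log_psi = np.zeros(n)
for i in range(1, n):
    log_psi[i] = omega + beta * log_psi[i - 1] + sigma_u * rng.standard_normal()

eps = rng.weibull(k, size=n)          # multiplicative Weibull innovation
x = np.exp(log_psi) * eps             # observed durations

# QML route described in the abstract: log the model to get the linear
# state space log x_i = log psi_i + log eps_i, treat log eps_i as if
# Gaussian, and run the Kalman filter (filter itself omitted here).
y = np.log(x)
print("mean and variance of log-durations:", y.mean().round(3), y.var().round(3))
```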
News Arrival, Jump Dynamics, and Volatility Components for Individual Stock Returns
, 2003
"... This paper models components of the return distribution, which are assumed to be directed by a latent news process. The conditional variance of returns is a combination of jumps and smoothly changing components. A heterogeneous Poisson process with a time-varying conditional intensity parameter gove ..."
Abstract - Cited by 68 (3 self)
This paper models components of the return distribution, which are assumed to be directed by a latent news process. The conditional variance of returns is a combination of jumps and smoothly changing components. A heterogeneous Poisson process with a time-varying conditional intensity parameter governs the likelihood of jumps. Unlike typical jump models with stochastic volatility, previous realizations of both jump and normal innovations can feed back asymmetrically into expected volatility. This model improves forecasts of volatility, particularly after large changes in stock returns. We provide empirical evidence of the impact and feedback effects of jump versus normal return innovations, leverage effects, and the time-series dynamics of jump clustering. There is a widespread perception in the financial press that volatility of asset returns has been changing. The new economy is introducing more uncertainty. Indeed, it can be argued that volatility is being transferred from the economy at large into the financial markets, which bear the necessary adjustment shocks.
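The following sketch illustrates the general mechanism described, not the paper's exact specification: a GARCH-type smooth variance component combined with a Poisson jump component whose intensity is autoregressive and responds to past jumps, so jumps cluster and feed back into expected volatility. All parameter values are assumptions.

```python
# Illustrative jump-plus-smooth-variance return model with a time-varying,
# self-reinforcing jump intensity (jump clustering).
import numpy as np

rng = np.random.default_rng(2)
n = 2000
h, lam = 1e-4, 0.05                       # initial variance, jump intensity
omega, alpha, beta = 1e-6, 0.05, 0.90     # GARCH(1,1) parameters (assumed)
lam0, rho, gamma = 0.02, 0.6, 0.3         # intensity dynamics (assumed)
mu_j, sig_j = 0.0, 0.02                   # jump size distribution (assumed)

r = np.zeros(n)
for t in range(n):
    n_jumps = rng.poisson(lam)
    jump = rng.normal(mu_j, sig_j, size=n_jumps).sum()
    r[t] = np.sqrt(h) * rng.standard_normal() + jump
    # feedback: today's total innovation moves tomorrow's smooth variance,
    # and realized jumps raise tomorrow's jump intensity (clustering)
    h = omega + alpha * r[t] ** 2 + beta * h
    lam = lam0 + rho * lam + gamma * n_jumps

print("sample kurtosis:", ((r - r.mean()) ** 4).mean() / r.var() ** 2)
```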
No-arbitrage semi-martingale restrictions for continuous-time volatility models subject to leverage effects and jumps: Theory and . . .
, 2005
"... Modeling financial market volatility has been a thriving research area over the last couple of decades. The topic speaks to fundamental risk and asset pricing issues with important applications in areas relating to portfolio allocation, risk management and measurement of systematic macroeconomic ris ..."
Abstract - Cited by 51 (8 self)
Modeling financial market volatility has been a thriving research area over the last couple of decades. The topic speaks to fundamental risk and asset pricing issues with important applications in areas relating to portfolio allocation, risk management and measurement of systematic macroeconomic risk exposures. At the same time, it also provides a unique set of ...
Which Moments to Match
- Econometric Theory
, 1996
"... JSTOR is a not-for-profit service that helps scholars, researchers, and students discover, use, and build upon a wide range of content in a trusted digital archive. We use information technology and tools to increase productivity and facilitate new forms of scholarship. For more information about JS ..."
Cited by 39 (1 self)
The Long Range Dependence Paradigm for Macroeconomics and Finance
, 2002
"... The long range dependence paradigm appears to be a suitable description of the data generating process for many observed economic time series. This is mainly due to the fact that it naturally characterizes time series displaying a high degree of persistence, in the form of a long lasting effect of ..."
Abstract - Cited by 35 (1 self)
The long range dependence paradigm appears to be a suitable description of the data generating process for many observed economic time series. This is mainly because it naturally characterizes time series displaying a high degree of persistence, in the form of a long-lasting effect of unanticipated shocks, yet exhibiting mean reversion. Whereas linear long range dependent time series models have been extensively used in macroeconomics, empirical evidence from financial time series prompted the development of nonlinear long range dependent time series models, in particular models of changing volatility. We discuss empirical evidence of long range dependence as well as the theoretical issues, both for economics and econometrics, that such evidence has stimulated.
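A standard linear long-memory specification of the kind the abstract refers to is the ARFIMA(0, d, 0) process. The sketch below simulates one via a truncated MA(∞) representation and prints its slowly (hyperbolically) decaying autocorrelations; d = 0.3 and the truncation length are illustrative choices, not values from the paper.

```python
# ARFIMA(0, d, 0) sketch: x_t = (1 - L)^(-d) eps_t via truncated MA weights.
import numpy as np

rng = np.random.default_rng(3)
d, n, trunc = 0.3, 5000, 1000

# MA coefficients of (1 - L)^(-d): psi_0 = 1, psi_j = psi_{j-1}*(j - 1 + d)/j
psi = np.ones(trunc)
for j in range(1, trunc):
    psi[j] = psi[j - 1] * (j - 1 + d) / j

eps = rng.standard_normal(n + trunc)
x = np.array([psi @ eps[t:t + trunc][::-1] for t in range(n)])

def acf(x, lag):
    x = x - x.mean()
    return (x[:-lag] * x[lag:]).mean() / x.var()

# Autocorrelations decay like lag^(2d - 1), not geometrically.
for lag in (1, 10, 50, 100):
    print(f"acf({lag}) = {acf(x, lag):.3f}")
```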
The Analysis Of Foreign Exchange Data Using Waveform Dictionaries
- Journal of Empirical Finance
, 1995
"... . This paper uses waveform dictionaries to decompose the signals contained within three foreign exchange rates using tick-by-tick observations obtained world wide. The three exchange rates examined are the Japanese Yen and the German Deutsche Mark against the U.S. dollar and the Deutsche Mark agains ..."
Abstract - Cited by 32 (3 self)
This paper uses waveform dictionaries to decompose the signals contained within three foreign exchange rates, using tick-by-tick observations obtained worldwide. The three exchange rates examined are the Japanese Yen and the German Deutsche Mark against the U.S. dollar, and the Deutsche Mark against the Yen. The data were provided by Olsen Associates. A waveform dictionary is a class of transforms that generalizes both windowed Fourier transforms and wavelets. Each waveform is parameterized by location, frequency, and scale. Such transforms can analyze signals that have highly localized structures in either time or frequency space as well as broad-band structures; that is, waveforms can, in principle, detect everything from shocks represented by Dirac delta functions, to "chirps" (short bursts of energy within a narrow band of frequencies), to frequencies that occur sporadically, and finally to frequencies that hold over the entire observed period.
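A toy sketch of a waveform (Gabor) dictionary of the kind described: atoms parameterized by location u, frequency f, and scale s, generalizing both windowed Fourier atoms and wavelets. The greedy selection step below is one iteration of matching pursuit on a synthetic signal; the grid, signal, and noise level are all assumptions for illustration.

```python
# One matching-pursuit step over a small Gabor dictionary.
import numpy as np

n = 512
t = np.arange(n)

def gabor(u, f, s):
    """Gaussian-windowed sinusoid located at u, frequency f, scale s."""
    g = np.exp(-0.5 * ((t - u) / s) ** 2) * np.cos(2 * np.pi * f * t)
    return g / np.linalg.norm(g)

# Synthetic signal: a short localized burst plus noise.
rng = np.random.default_rng(4)
signal = 3.0 * gabor(200, 0.10, 20) + 0.3 * rng.standard_normal(n)

# Coarse dictionary over (location, frequency, scale).
atoms = [(u, f, s) for u in range(0, n, 32)
                   for f in (0.05, 0.10, 0.20)
                   for s in (10, 20, 40)]

# Greedy step: pick the atom with the largest inner product with the signal.
best = max(atoms, key=lambda p: abs(signal @ gabor(*p)))
print("selected (location, frequency, scale):", best)   # near (200, 0.10, 20)
```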
Extremes of stochastic volatility models
, 1998
"... The simple stochastic volatility process (Xt)t∈Z is given by the equation Xt = σt Zt, t ∈ Z, (1) where (Zt) is iid, (σt)t∈Z is the log-linear Gaussian process given by 2 logσt = j=0 ψjηt−j, with j=0 ψ 2 j <∞, and the sequence (ηt) is iid N(0, τ2) and independent of (Zt). If var(Zt) < ∞, then i ..."
Abstract - Cited by 30 (14 self)
The simple stochastic volatility process (X_t)_{t ∈ Z} is given by the equation X_t = σ_t Z_t, t ∈ Z, (1) where (Z_t) is iid and (σ_t)_{t ∈ Z} is the log-linear Gaussian process given by log σ_t² = Σ_{j=0}^∞ ψ_j η_{t−j}, with Σ_{j=0}^∞ ψ_j² < ∞, where the sequence (η_t) is iid N(0, τ²) and independent of (Z_t). If var(Z_t) < ∞, then it is customary to assume that (Z_t) is iid with mean 0 and variance 1. In this article, we describe the limiting behavior of the sample maxima, M_n = max(X_1, ..., X_n), of the strictly stationary stochastic volatility sequence (X_t) in the cases where the noise (Z_t) has either a light- or heavy-tailed distribution. In Section 1, we describe the tail behavior of the marginal distribution of X_1. Point process convergence based on the normalized process is described in Section 2. This provides the key result from which the limiting behavior of the extremes of (X_t) can be determined. Interestingly, and unlike the situation for GARCH processes (see Davis and Mikosch [5]), there is no extremal clustering for stochastic volatility processes in either the light- or heavy-tailed case. That is, large values of the process do not come in clusters. More precisely, the large-sample behavior of M_n is the same as that of the maxima of the associated iid sequence (X̂_t).
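A simulation sketch of the process in equation (1): here log σ_t is taken to be an AR(1), an assumed special case of the log-linear Gaussian process above (ψ_j = φ^j), and Z_t is iid N(0, 1). The code just tracks the growth of the sample maximum M_n, which the text says behaves like that of the associated iid sequence.

```python
# Simulate X_t = sigma_t * Z_t with log sigma_t a Gaussian AR(1),
# and record the running sample maximum M_n.
import numpy as np

rng = np.random.default_rng(5)
n, phi, tau = 100_000, 0.95, 0.2

eta = rng.normal(0.0, tau, size=n)
log_sigma = np.zeros(n)
for t in range(1, n):
    log_sigma[t] = phi * log_sigma[t - 1] + eta[t]   # AR(1): psi_j = phi**j

x = np.exp(log_sigma) * rng.standard_normal(n)

m_n = np.maximum.accumulate(x)
for k in (10**3, 10**4, 10**5):
    print(f"M_n at n = {k}: {m_n[k - 1]:.2f}")
```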
Aggregation and Model Construction for Volatility Models, unpublished manuscript, Nuffield College
, 1998
"... ..."
Improved Quasi-Maximum Likelihood Estimation for Stochastic Volatility Models
- Modelling and Prediction: Honoring Seymour Geisser
, 1995
"... Jacquier, Polson and Rossi (1994, Journal of Business and Economic Statistics) have proposed a Bayesian hierarchical model and Markov Chain Monte Carlo methodology for parameter estimation and smoothing in a stochastic volatility model, where the logarithm of the conditional variance follows an auto ..."
Abstract - Cited by 23 (1 self)
Jacquier, Polson and Rossi (1994, Journal of Business and Economic Statistics) have proposed a Bayesian hierarchical model and Markov chain Monte Carlo methodology for parameter estimation and smoothing in a stochastic volatility model, where the logarithm of the conditional variance follows an autoregressive process. In sampling experiments, their estimators perform particularly well relative to a quasi-maximum likelihood approach, in which the nonlinear stochastic volatility model is linearized via a logarithmic transformation and the resulting linear state-space model is treated as Gaussian. In this paper, we explore a simple modification to the treatment of inlier observations which reduces the excess kurtosis in the distribution of the observation disturbances and improves the performance of the quasi-maximum likelihood procedure. The method we propose can be carried out with commercial software. Keywords: inliers, excess kurtosis, transformations.
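A sketch of the baseline quasi-maximum likelihood procedure the abstract refers to (the paper's specific inlier modification is not reproduced here): take log r_t² = h_t + log z_t², treat log z_t² as if it were Gaussian with mean −1.27 and variance π²/2 (its true moments when z_t ~ N(0, 1)), and run a Kalman filter on the resulting linear state space. Parameter values are assumed for the simulation.

```python
# QML linearization for a stochastic volatility model plus Kalman filtering.
import numpy as np

rng = np.random.default_rng(6)
n, phi, sigma_eta = 2000, 0.95, 0.2

# Simulate SV data: r_t = exp(h_t / 2) * z_t, with h_t an AR(1).
h = np.zeros(n)
for t in range(1, n):
    h[t] = phi * h[t - 1] + sigma_eta * rng.standard_normal()
r = np.exp(h / 2) * rng.standard_normal(n)

# Linearized observation: y_t = h_t + xi_t, xi_t ~ (-1.2704, pi^2 / 2).
y = np.log(r ** 2) + 1.2704
R = np.pi ** 2 / 2                     # observation noise variance
Q = sigma_eta ** 2                     # state noise variance

# Kalman filter for y_t = h_t + noise, h_t = phi * h_{t-1} + noise.
h_hat, P = 0.0, 1.0
filtered = np.zeros(n)
for t in range(n):
    h_pred, P_pred = phi * h_hat, phi ** 2 * P + Q
    K = P_pred / (P_pred + R)          # Kalman gain
    h_hat = h_pred + K * (y[t] - h_pred)
    P = (1 - K) * P_pred
    filtered[t] = h_hat

print("corr(filtered, true h):", np.corrcoef(filtered, h)[0, 1].round(2))
```

Small r_t produce extreme negative values of log r_t² (the "inliers" the paper targets), which is exactly what inflates the excess kurtosis of the observation disturbance in this linearization.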