Results 1–10 of 37
Using Wavelets to Obtain a Consistent Ordinary Least Squares Estimator of the Long-memory Parameter
 Journal of Forecasting
, 1999
Abstract

Cited by 29 (6 self)
We develop an ordinary least squares estimator of the long memory parameter of a fractionally integrated process that is an alternative to the Geweke Porter-Hudak estimator. Applying the wavelet transform to a fractionally integrated process, we establish a log-linear relationship between the wavelet coefficients' variance and the scale, with slope determined by the long memory parameter. This log-linear relationship yields a consistent ordinary least squares estimator of the long memory parameter when the wavelet coefficients' population variance is replaced by their sample variance. We derive the small-sample bias and variance of the ordinary least squares estimator and test it against the Geweke Porter-Hudak estimator and the McCoy-Walden maximum likelihood wavelet estimator by conducting a number of Monte Carlo experiments. Based upon the criterion of choosing the estimator which minimizes the mean squared error, the wavelet OLS approach was superior to the Geweke Porter-Hudak estimator, but inferior to the McCoy-Walden wavelet estimator, for the processes simulated. However, given the simplicity of programming and running the wavelet OLS estimator and the statistical inference it provides for the long memory parameter, we feel the general practitioner will be attracted to the wavelet OLS estimator.
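The scaling idea behind this kind of estimator can be sketched in a few lines. The snippet below is a minimal illustration, not the paper's implementation: it assumes a Haar DWT and the convention that the level-j detail-coefficient variance of an I(d) process grows roughly like 2^(2jd), so the OLS slope of log2 sample variance on level estimates 2d.

```python
import numpy as np

def haar_dwt_details(x, max_level):
    """Orthonormal Haar DWT: detail coefficients per level (1 = finest)."""
    details, approx = [], np.asarray(x, dtype=float)
    for _ in range(max_level):
        n = len(approx) // 2 * 2
        a, b = approx[:n:2], approx[1:n:2]
        details.append((a - b) / np.sqrt(2.0))
        approx = (a + b) / np.sqrt(2.0)
    return details

def wavelet_ols_d(x, levels=range(2, 8)):
    """OLS regression of log2 sample variance of details on level j.

    Assumes Var(w_j) ~ c * 2^(2jd), so d is the slope divided by 2.
    (This scaling convention is an assumption of the sketch.)
    """
    details = haar_dwt_details(x, max(levels))
    js = np.array(list(levels), dtype=float)
    logvar = np.array([np.log2(np.var(details[j - 1])) for j in levels])
    slope = np.polyfit(js, logvar, 1)[0]
    return slope / 2.0

rng = np.random.default_rng(0)
noise = rng.standard_normal(2**14)
print(wavelet_ols_d(noise))             # should be near 0 (white noise, d = 0)
print(wavelet_ols_d(np.cumsum(noise)))  # should be near 1 (random walk, d = 1)
```

The boundary levels are dropped because the coarsest scales have few coefficients and the finest carry the largest short-memory contamination.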
An Empirical Analysis of Data Requirements for Financial Forecasting with Neural Networks
, 2001
Abstract

Cited by 25 (0 self)
Neural networks have been shown to be a promising tool for forecasting financial time series. Several design factors significantly impact the accuracy of neural network forecasts. These factors include the selection of input variables, the architecture of the network, and the quantity of training data. The questions of input variable selection and system architecture design have been widely researched, but the corresponding question of how much information to use in producing high-quality neural network models has not been adequately addressed. In this paper, the effects of different training sample sizes on forecasting currency exchange rates are examined. It is shown that neural networks given an appropriate amount of historical knowledge can forecast future currency exchange rates with 60 percent accuracy, while neural networks trained on a larger training set have worse forecasting performance. In addition to higher-quality forecasts, the reduced training set sizes reduce development cost and time.
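The walk-forward evaluation the paper relies on can be illustrated without a neural network. The sketch below is a stand-in, not the paper's experiment: it compares directional accuracy of a rolling AR(1) forecaster across training-window sizes on a synthetic mean-reverting "exchange rate" series (both the model and the data-generating process are assumptions of this sketch).

```python
import numpy as np

def directional_accuracy(series, window):
    """Walk-forward one-step forecasts from an AR(1) refit on a rolling
    window; returns the fraction of correctly predicted up/down moves."""
    hits = total = 0
    for t in range(window, len(series) - 1):
        y = series[t - window + 1:t + 1]
        x = series[t - window:t]
        b, a = np.polyfit(x, y, 1)     # OLS fit of y_t = a + b * y_{t-1}
        forecast = a + b * series[t]
        if (forecast - series[t]) * (series[t + 1] - series[t]) > 0:
            hits += 1
        total += 1
    return hits / total

rng = np.random.default_rng(1)
rate = np.empty(600)
rate[0] = 1.0
for t in range(1, 600):                # synthetic AR(1) around a level of 1.0
    rate[t] = 1.0 + 0.9 * (rate[t - 1] - 1.0) + 0.01 * rng.standard_normal()

for window in (50, 100, 400):
    print(window, round(directional_accuracy(rate, window), 3))
```

For this stationary toy series a longer window is not harmful, which is exactly the contrast with the paper's finding on real exchange rates, where older observations can degrade the forecast.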
Residual Log-periodogram Inference for Long-run Relationships
 Journal of Econometrics
, 2010
Abstract

Cited by 13 (2 self)
We assume that some consistent estimator β̂ of an equilibrium relation between nonstationary series integrated of order d ∈ (0.5, 1.5) is used to compute residuals û_t = y_t − β̂x_t (or differences thereof). We propose to apply the semiparametric log-periodogram regression to the (differenced) residuals in order to estimate or test the degree of persistence δ of the equilibrium deviation u_t. Provided β̂ converges fast enough, we describe simple semiparametric conditions around zero frequency that guarantee consistent estimation of δ. At the same time, limiting normality is derived, which allows one to construct approximate confidence intervals to test hypotheses on δ. This requires that d − δ > 0.5 for a superconsistent β̂, so that the residuals can be good proxies of the true cointegrating errors. Our assumptions allow for stationary deviations with long memory, 0 ≤ δ < 0.5, as well as for nonstationary but transitory equilibrium errors, 0.5 < δ < 1. In particular, if x_t contains several series we consider the joint estimation of d and δ. Wald statistics to test for parameter restrictions of the system have a limiting χ² distribution. We also analyze the benefits of a pooled version of the estimate. The empirical applicability of our general cointegration test is investigated by means of Monte Carlo experiments and illustrated with a study of exchange rate dynamics. JEL Classification: C14, C22.
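The log-periodogram regression at the core of the method can be sketched compactly. The snippet below is a simplified GPH-style estimator applied to a raw series rather than cointegration residuals; the bandwidth rule m = n^0.5 and the use of log λ_j (instead of log(4 sin²(λ_j/2))) as the regressor are assumptions of this sketch.

```python
import numpy as np

def log_periodogram_d(u, m=None):
    """GPH-style estimate of the memory parameter d.

    Near zero frequency the spectral density behaves like
    f(lambda) ~ c * lambda^(-2d), so log I(lambda_j) ~ const - 2d log lambda_j
    and -slope/2 estimates d.
    """
    u = np.asarray(u, dtype=float)
    n = len(u)
    if m is None:
        m = int(np.sqrt(n))                  # common rule-of-thumb bandwidth
    lam = 2.0 * np.pi * np.arange(1, m + 1) / n
    dft = np.fft.fft(u - u.mean())[1:m + 1]
    I = np.abs(dft) ** 2 / (2.0 * np.pi * n)  # periodogram ordinates
    slope = np.polyfit(np.log(lam), np.log(I), 1)[0]
    return -slope / 2.0

rng = np.random.default_rng(2)
e = rng.standard_normal(4096)
print(log_periodogram_d(e))             # short memory: estimate near 0
print(log_periodogram_d(np.cumsum(e)))  # unit root: estimate near 1
```

In the paper's setting the input would be the (differenced) residuals û_t, and the conditions on the bandwidth and on d − δ govern when this regression remains valid.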
The Long-Range Dependence Paradigm for Macroeconomics and Finance
, 2002
Abstract

Cited by 12 (1 self)
The long-range dependence paradigm appears to be a suitable description of the data-generating process for many observed economic time series. This is mainly due to the fact that it naturally characterizes time series displaying a high degree of persistence, in the form of a long-lasting effect of unanticipated shocks, yet exhibiting mean reversion. Whereas linear long-range dependent time series models have been extensively used in macroeconomics, empirical evidence from financial time series prompted the development of nonlinear long-range dependent time series models, in particular models of changing volatility. We discuss empirical evidence of long-range dependence as well as the theoretical issues, both for economics and econometrics, that such evidence has stimulated.
An approximate wavelet MLE of short and long memory parameters
 Studies in Nonlinear Dynamics and Econometrics
, 1999
Abstract

Cited by 11 (3 self)
By design, a wavelet's strength rests in its ability to localize a process simultaneously in time-scale space. This ability directly leads to the computational efficiency of the wavelet representation of an N × N matrix operator, by allowing the N largest elements of the wavelet-represented operator to represent the matrix operator [DeVore et al. (1992a, 1992b)]. This property allows many dense matrices to have a sparse representation when transformed by wavelets. In this paper we generalize the long-memory parameter estimator of McCoy and Walden (1996) to estimate the short- and long-memory parameters simultaneously. Using the sparse wavelet representation of a matrix operator, we are able to approximate an ARFIMA model's likelihood function with the series' wavelet coefficients and their variances. Maximization of this approximate likelihood function over the short- and long-memory parameter space results in the approximate wavelet maximum likelihood estimates of the ARFIMA model. By simultaneously maximizing the likelihood function over both the short- and long-memory parameters and using only the wavelet coefficients' variances, the approximate wavelet MLE provides a fast alternative to the frequency-domain MLE. Furthermore, the simulation studies found herein reveal the approximate wavelet MLE to be robust over the invertible parameter region of the ARFIMA model's moving average parameter, whereas the frequency-domain MLE deteriorates dramatically as the moving average parameter approaches the boundaries of invertibility.
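A rough flavor of likelihood-based wavelet estimation can be given for the pure long-memory case (the paper's generalization additionally handles the short-memory parameters, which is not reproduced here). The sketch assumes level-j Haar coefficients are uncorrelated with variance c·2^(2jd), i.e. the diagonal-covariance approximation, then profiles out c and grid-searches d; both the scaling convention and the grid search are assumptions of this sketch.

```python
import numpy as np

def haar_details(x, max_level):
    """Orthonormal Haar DWT detail coefficients, level 1 = finest."""
    details, approx = [], np.asarray(x, dtype=float)
    for _ in range(max_level):
        n = len(approx) // 2 * 2
        a, b = approx[:n:2], approx[1:n:2]
        details.append((a - b) / np.sqrt(2.0))
        approx = (a + b) / np.sqrt(2.0)
    return details

def wavelet_mle_d(x, max_level=8):
    """Approximate wavelet MLE for a pure long-memory model."""
    details = haar_details(x, max_level)
    Nj = np.array([len(w) for w in details], dtype=float)
    s2 = np.array([np.mean(w ** 2) for w in details])
    js = np.arange(1, max_level + 1, dtype=float)

    def neg_loglik(d):
        scale = 2.0 ** (2.0 * d * js)          # model variance ratios
        c = np.sum(Nj * s2 / scale) / np.sum(Nj)  # profiled scale c
        return np.sum(Nj * (np.log(c * scale) + s2 / (c * scale)))

    grid = np.linspace(-0.45, 1.45, 381)
    return grid[np.argmin([neg_loglik(d) for d in grid])]

rng = np.random.default_rng(3)
e = rng.standard_normal(2**14)
print(wavelet_mle_d(e))             # near 0 for white noise
print(wavelet_mle_d(np.cumsum(e)))  # roughly 0.9-1 for a random walk
```

Because only per-level variances enter the likelihood, each evaluation is cheap, which is the computational point the abstract makes relative to the frequency-domain MLE.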
Long-range Dependence in Daily Stock Volatilities
 J. Bus. Ec. Stat
, 2000
Abstract

Cited by 8 (0 self)
Recent empirical studies show that the squares of high-frequency stock returns are long-range dependent and can be modeled as fractionally integrated processes, using, for example, long-memory stochastic volatility models. Are such long-range dependencies common among stocks? Are they caused by the same sources of variation? In this paper, we classify daily stock returns of S&P 500 companies on the basis of the company's size and its business or industrial sector, and estimate the strength of long-range dependence in the stock volatilities using two different methods. Almost all of the companies analyzed exhibit strong persistence in volatility. We then use a canonical correlation method to identify common long-range dependent components in groups of companies, finding strong evidence in support of common persistence in volatility. Finally, we use a chi-squared test to study the effects of company size and sector on the number of common long-range dependent volatility components.
Estimating the Fractional Order of Integration of Interest Rates Using a Wavelet OLS Estimator
 Proceedings of the International Conference on Neural Information Processing, Hong Kong
, 2001
Identifying Common Long-range Dependence in a Vector Time Series
 Statistics Research Center, Graduate School of Business, University of Chicago
, 1997
Abstract

Cited by 3 (1 self)
We propose a method to identify common persistent components in a k-dimensional time series. Assuming that the individual series of the vector process have long-range dependence, we apply canonical correlation analysis to the series and its lagged values. A zero canonical correlation implies the existence of a short-memory linear combination, and hence the existence of common long-range dependence; its associated eigenvector provides an estimate of the combination. We illustrate the technique using several real examples, including squared returns of various stock prices. Power and size of the proposed method are investigated via simulation using series generated from fractionally integrated models and from long-memory stochastic volatility models. Keywords: Canonical correlation; Cointegration; Common components; Long-range dependence. 1 Introduction: Stationary processes exhibiting long-term, persistent fluctuations have been observed in many areas, such as hydrology, meteorology, economics...
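The core computation, canonical correlation analysis between the vector series and its lag, can be sketched directly with sample covariance matrices. The snippet below is an illustration under assumed data (two series sharing one persistent random-walk factor), not the paper's test procedure or its critical values.

```python
import numpy as np

def canonical_correlations(X, Y):
    """Sample canonical correlations between the columns of X and Y,
    from the eigenvalues of Sxx^-1 Sxy Syy^-1 Syx."""
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    Sxx = X.T @ X / len(X)
    Syy = Y.T @ Y / len(Y)
    Sxy = X.T @ Y / len(X)
    M = np.linalg.solve(Sxx, Sxy) @ np.linalg.solve(Syy, Sxy.T)
    eigvals = np.sort(np.linalg.eigvals(M).real)[::-1]
    return np.sqrt(np.clip(eigvals, 0.0, 1.0))

rng = np.random.default_rng(4)
n = 5000
common = np.cumsum(rng.standard_normal(n))   # shared persistent factor
y = np.column_stack([common + rng.standard_normal(n),
                     common + rng.standard_normal(n)])

# Series against its own lag: one correlation near 1 (persistent direction),
# one near 0, signalling a short-memory combination (here y1 - y2).
rho = canonical_correlations(y[1:], y[:-1])
print(rho)
```

A near-zero smallest canonical correlation is the signature of common long-range dependence described in the abstract; its eigenvector recovers the short-memory combination.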