Experimental Queueing Analysis with Long-Range Dependent Packet Traffic
 IEEE/ACM Transactions on Networking
, 1996
Abstract

Cited by 346 (14 self)
Recent traffic measurement studies from a wide range of working packet networks have convincingly established the presence of significant statistical features that are characteristic of fractal traffic processes, in the sense that these features span many time scales. Of particular interest in packet traffic modeling is a property called long-range dependence, which is marked by the presence of correlations that can extend over many time scales. In this paper, we demonstrate empirically that, beyond its statistical significance in traffic measurements, long-range dependence has considerable impact on queueing performance, and is a dominant characteristic for a number of packet traffic engineering problems. In addition, we give conditions under which the use of compact and simple traffic models that incorporate long-range dependence in a parsimonious manner (e.g., fractional Brownian motion) is justified and can lead to new insights into the traffic management of high-speed networks. ...
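A common diagnostic behind such measurement studies can be sketched with the aggregated-variance method (an illustrative sketch, not this paper's queueing analysis): for a long-range dependent series the variance of block means of size m decays like m^(2H-2) with Hurst parameter H > 1/2, while short-range dependent data give H ≈ 1/2.

```python
import numpy as np

def aggregated_variance_hurst(x, block_sizes):
    """Estimate the Hurst parameter H via the aggregated-variance method:
    Var(block mean of size m) ~ m^(2H - 2), so the log-log slope s
    gives H = 1 + s / 2."""
    log_m, log_var = [], []
    for m in block_sizes:
        n_blocks = len(x) // m
        means = x[: n_blocks * m].reshape(n_blocks, m).mean(axis=1)
        log_m.append(np.log(m))
        log_var.append(np.log(means.var()))
    slope, _ = np.polyfit(log_m, log_var, 1)
    return 1.0 + slope / 2.0

rng = np.random.default_rng(0)
noise = rng.standard_normal(2 ** 16)   # i.i.d. noise: no long-range dependence
H = aggregated_variance_hurst(noise, [2, 4, 8, 16, 32, 64, 128])
print(round(H, 2))                     # close to 0.5 for short-range data
```

For genuinely long-range dependent traffic (e.g., sampled from fractional Gaussian noise) the same estimator would return H well above 0.5.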
On the Detection and Estimation of Long Memory in Stochastic Volatility
, 1995
Abstract

Cited by 214 (6 self)
Recent studies have suggested that stock markets' volatility has a type of long-range dependence that is not appropriately described by the usual Generalized Autoregressive Conditional Heteroskedastic (GARCH) and Exponential GARCH (EGARCH) models. In this paper, different models for describing this long-range dependence are examined and the properties of a Long-Memory Stochastic Volatility (LMSV) model, constructed by incorporating an Autoregressive Fractionally Integrated Moving Average (ARFIMA) process in a stochastic volatility scheme, are discussed. Strongly consistent estimators for the parameters of this LMSV model are obtained by maximizing the spectral likelihood. The distribution of the estimators is analyzed by means of a Monte Carlo study. The LMSV is applied to daily stock market returns providing an improved description of the volatility behavior. In order to assess the empirical relevance of this approach, tests for long-memory volatility are described and applied to an e...
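As a minimal illustration of the long-memory diagnostic underlying such studies (not the paper's spectral-likelihood estimator), one can inspect the sample autocorrelations of squared returns: slow hyperbolic decay over many lags suggests long memory in volatility, while i.i.d. returns give autocorrelations near zero.

```python
import numpy as np

def acf(x, max_lag):
    """Sample autocorrelation of x at lags 1..max_lag."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    denom = np.dot(x, x)
    return np.array([np.dot(x[:-k], x[k:]) / denom
                     for k in range(1, max_lag + 1)])

rng = np.random.default_rng(1)
returns = rng.standard_normal(10_000)   # i.i.d. placeholder "returns"
rho = acf(returns ** 2, 50)
print(rho[:3])  # near zero here; real volatility series decay much more slowly
```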
A Wavelet-Based Joint Estimator of the Parameters of Long-Range Dependence.
, 1998
Abstract

Cited by 89 (14 self)
A joint estimator is presented for the two parameters that define the long-range dependence phenomenon in the simplest case. The estimator is based on the coefficients of a discrete wavelet decomposition, improving a recently proposed wavelet-based estimator of the scaling parameter [4], as well as extending it to include the associated power parameter. An important feature is its conceptual and practical simplicity, consisting essentially in measuring the slope and the intercept of a linear fit after a discrete wavelet transform is performed, a very fast (O(n)) operation. Under well justified technical idealisations the estimator is shown to be unbiased and of minimum or close to minimum variance for the scale parameter, and asymptotically unbiased and efficient for the second parameter. Through theoretical arguments and numerical simulations it is shown that in practice, even for small data sets, the bias is very small and the variance close to optimal for both parameters. Closed-form ...
How markets slowly digest changes in supply and demand
, 2008
Abstract

Cited by 82 (10 self)
In this article we revisit the classic problem of tâtonnement in price formation from a microstructure point of view, reviewing a recent body of theoretical and empirical work explaining how fluctuations in supply and demand are slowly incorporated into prices. Because revealed market liquidity is extremely low, large orders to buy or sell can only be traded incrementally, over periods of time as long as months. As a result order flow is a highly persistent long-memory process. Maintaining compatibility with market efficiency has profound consequences on price formation, on the dynamics of liquidity, and on the nature of impact. We review a body of theory that makes detailed quantitative predictions about the volume and time dependence of market impact, the bid-ask spread, order book dynamics, and volatility. Comparisons to data yield some encouraging successes. This framework suggests a novel interpretation of financial information, in which agents are at best only weakly informed and all have a similar and extremely noisy impact on prices. Most of the processed information appears to come from supply and demand itself, rather than from ...
Modeling Daily Value-at-Risk Using Realized Volatility and ARCH Type Models
 Journal of Empirical Finance
, 2003
Abstract

Cited by 68 (2 self)
In this paper we show how to compute a daily VaR measure for two stock indexes (CAC40 and SP500) using the one-day-ahead forecast of the daily realized volatility. The daily realized volatility is equal to the sum of the squared intraday returns over a given day and thus uses intraday information to define an aggregated daily volatility measure. While the VaR specification based on an ARFIMAX(0,d,1)-skewed-Student model for the daily realized volatility provides adequate one-day-ahead VaR forecasts, it does not really improve on the performance of a VaR model based on the skewed Student APARCH model and estimated using daily data. Thus, for the two financial assets considered in a univariate framework, both methods seem to be equivalent. This paper also shows that daily returns standardized by the square root of the one-day-ahead forecast of the daily realized volatility are not normally distributed.
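The realized-volatility construction described above is straightforward to compute. The sketch below uses synthetic placeholder data and, for simplicity, a Gaussian quantile where the paper uses a skewed Student density:

```python
import numpy as np

def realized_volatility(intraday_returns):
    """Daily realized variance is the sum of squared intraday returns;
    its square root is the daily realized volatility."""
    r = np.asarray(intraday_returns, dtype=float)
    return np.sqrt(np.sum(r ** 2))

rng = np.random.default_rng(2)
# 78 five-minute returns for one synthetic trading day (placeholder data)
r_intraday = 0.001 * rng.standard_normal(78)
sigma_day = realized_volatility(r_intraday)

# One-day 99% VaR under a simplifying Gaussian assumption; the paper
# instead standardizes returns with a skewed Student density.
z_99 = 2.326  # absolute value of the standard normal 1% quantile
var_99 = z_99 * sigma_day
print(sigma_day > 0, var_99 > sigma_day)
```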
On estimation of the wavelet variance
 Biometrika
, 1995
Abstract

Cited by 68 (7 self)
The wavelet variance provides a scale-based decomposition of the process variance for a time series or random field. It has seen increasing use in geophysics, astronomy, genetics, hydrology, medical imaging, oceanography, soil science, signal processing and texture analysis. In practice, however, data collected in the form of a time series or random field often suffer from various types of contamination. We discuss the difficulties and limitations of existing contamination models (pure replacement models, additive outliers, level shift models and innovation outliers that hide themselves in the original time series) for robust nonparametric estimates of second-order statistics. We then introduce a new model based upon the idea of scale-based multiplicative contamination. This model supposes that contamination can occur and affect data at certain scales and thus arises naturally in multiscale processes and in the wavelet variance context. For this new contamination model, we develop a full M-estimation theory for the wavelet variance and derive its large sample theory when the underlying time series or random field is Gaussian. Our approach treats the wavelet variance as a scale parameter and offers protection against contamination that operates additively on the log of squared wavelet coefficients and acts independently at different scales.
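A minimal sketch of the plain (non-robust) wavelet variance estimator at a single scale, using the Haar MODWT filter on synthetic data; the paper's M-estimation machinery is not reproduced here. For unit-variance white noise, the Haar wavelet variance at unit scale equals 1/2.

```python
import numpy as np

def haar_modwt_variance(x, level):
    """Wavelet variance at scale tau_j = 2**(level - 1) using the Haar
    MODWT filter: each coefficient is half the difference of two
    adjacent block means of length tau_j (a standard construction;
    sketch only, no boundary-coefficient handling)."""
    tau = 2 ** (level - 1)
    x = np.asarray(x, dtype=float)
    # block means of length tau at every shift (non-decimated)
    csum = np.concatenate(([0.0], np.cumsum(x)))
    means = (csum[tau:] - csum[:-tau]) / tau
    w = 0.5 * (means[tau:] - means[:-tau])
    return np.mean(w ** 2)

rng = np.random.default_rng(3)
white = rng.standard_normal(2 ** 15)
nu2 = haar_modwt_variance(white, level=1)
print(round(nu2, 2))  # close to 0.5 for unit-variance white noise
```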
A wavelet-based joint estimator of the parameters of long-range dependence
 IEEE Trans. Inform. Theory
, 1999
Abstract

Cited by 66 (13 self)
Abstract—A joint estimator is presented for the two parameters that define the long-range dependence phenomenon in the simplest case. The estimator is based on the coefficients of a discrete wavelet decomposition, improving a recently proposed wavelet-based estimator of the scaling parameter [4], as well as extending it to include the associated power parameter. An important feature is its conceptual and practical simplicity, consisting essentially in measuring the slope and the intercept of a linear fit after a discrete wavelet transform is performed, a very fast (O(n)) operation. Under well-justified technical idealizations the estimator is shown to be unbiased and of minimum or close to minimum variance for the scale parameter, and asymptotically unbiased and efficient for the second parameter. Through theoretical arguments and numerical simulations it is shown that in practice, even for small data sets, the bias is very small and the variance close to optimal for both parameters. Closed-form expressions are given for the covariance matrix of the estimator as a function of data length, and are shown by simulation to be very accurate even when the technical idealizations are not satisfied. Comparisons are made against two maximum-likelihood estimators. In terms of robustness and computational cost the wavelet estimator is found to be clearly superior and statistically its performance is comparable. We apply the tool to the analysis of Ethernet teletraffic data, completing an earlier study on the scaling parameter alone. Index Terms—Hurst parameter, long-range dependence, packet traffic, parameter estimation, telecommunications networks, time-scale analysis, wavelet decomposition.
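The slope-and-intercept procedure the abstract describes can be sketched as follows (a simplified Haar-based illustration, not the authors' implementation; the weighted regression and initialization details of the paper are omitted). For white noise the detail-coefficient variance is flat across octaves, so the fitted slope is near 0, corresponding to H = 0.5.

```python
import numpy as np

def haar_dwt_detail_variances(x, levels):
    """Variance of Haar DWT detail coefficients at octaves j = 1..levels."""
    a = np.asarray(x, dtype=float)
    variances = []
    for _ in range(levels):
        a = a[: len(a) // 2 * 2]
        d = (a[0::2] - a[1::2]) / np.sqrt(2)   # detail coefficients
        a = (a[0::2] + a[1::2]) / np.sqrt(2)   # approximation, next octave
        variances.append(np.mean(d ** 2))
    return np.array(variances)

def logscale_slope(x, levels):
    """Slope of log2(detail variance) versus octave j: the scaling
    exponent alpha, from which H = (alpha + 1) / 2 for long-range
    dependent data."""
    v = haar_dwt_detail_variances(x, levels)
    j = np.arange(1, levels + 1)
    slope, _ = np.polyfit(j, np.log2(v), 1)
    return slope

rng = np.random.default_rng(4)
alpha = logscale_slope(rng.standard_normal(2 ** 16), levels=8)
print(round(alpha, 1))  # near 0 for white noise (H = 0.5)
```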
An Extensible Toolkit for Resource Prediction in Distributed Systems
, 1999
Abstract

Cited by 59 (22 self)
Abstract—RPS is a publicly available toolkit that allows a practitioner to straightforwardly create flexible online and offline resource prediction systems in which resources are represented by independent, periodically sampled, scalar-valued measurement streams. The systems predict the future values of such streams from past values and are composed at runtime out of a large and extensible set of communicating components that are in turn constructed using RPS’s extensible sensor, prediction, wavelet, and communication libraries. This paper describes the design, implementation, and performance of RPS. We have used RPS extensively to evaluate predictive models and build online prediction systems for host load, Windows performance data, and network bandwidth. The computation and communication overheads involved in such systems are quite low. Index Terms—Distributed systems, performance of systems.
An Evaluation of Linear Models for Host Load Prediction
, 1998
Abstract

Cited by 58 (8 self)
This paper evaluates linear models for predicting the Digital Unix five-second load average from 1 to 30 seconds into the future. A detailed statistical study of a large number of load traces leads to consideration of the Box-Jenkins models (AR, MA, ARMA, ARIMA), and the ARFIMA models (due to self-similarity). These models, as well as a simple windowed-mean scheme, are evaluated by running a large number of randomized test cases on the load traces. The main conclusions are that load is consistently predictable to a useful degree, and that the simpler models such as AR are sufficient for doing this prediction.
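A minimal sketch of the AR-prediction idea, assuming an ordinary-least-squares fit on a synthetic AR(1) "load" trace (placeholder data, not the Digital Unix traces the paper uses):

```python
import numpy as np

def fit_ar(x, p):
    """Fit AR(p) coefficients by ordinary least squares."""
    x = np.asarray(x, dtype=float)
    # row t of X holds [x[t-1], x[t-2], ..., x[t-p]] for target x[t]
    X = np.column_stack([x[p - k - 1 : len(x) - k - 1] for k in range(p)])
    y = x[p:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef

def predict_one_step(x, coef):
    """One-step-ahead prediction from the last p observations."""
    p = len(coef)
    return float(np.dot(coef, x[-1 : -p - 1 : -1]))

# Synthetic AR(1) "load" trace with coefficient 0.9 (placeholder data)
rng = np.random.default_rng(5)
load = np.zeros(5000)
for t in range(1, 5000):
    load[t] = 0.9 * load[t - 1] + 0.1 * rng.standard_normal()

coef = fit_ar(load, p=2)
forecast = predict_one_step(load, coef)
print(round(coef[0], 1))  # first coefficient close to the true 0.9
```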
On Resource Management and QoS Guarantees For Long-Range Dependent Traffic
 in Proc. IEEE INFOCOM '95
, 1994
Abstract

Cited by 57 (10 self)
It has been known for several years now that variable-bit-rate video sources are strongly autocorrelated. Recently, several studies have indicated that the resulting stochastic processes exhibit long-range dependence properties. This implies that large buffers at intermediate switching points may not provide adequate delay performance for such classes of traffic in Broadband packet-switched networks (such as ATM). In this paper, we study the effect of long-memory processes on queue length statistics of a single queue system through a controlled fractionally differenced ARIMA(1, d, 0) input process. This process has two parameters, φ1 (0 < φ1 < 1) and d (0 < d < 1/2), representing an autoregressive component and a long-range dependent component, respectively. Results show that the queue length statistics studied (mean, variance and the 0.999 quantile) are proportional to e^(c1 φ1) e^(c2 d), where (c1, c2) are positive constants and c2 > c1. The effect of the autocorrelation...
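The single-queue setup can be sketched with a discrete-time Lindley recursion. For brevity this illustration drives the queue with only the AR(1) (φ1) component of the arrival process; the fractionally differenced d component the paper studies requires a fractional-differencing filter and is omitted.

```python
import numpy as np

def queue_lengths(arrivals, service_rate):
    """Discrete-time Lindley recursion: Q[t] = max(0, Q[t-1] + A[t-1] - c)."""
    q = np.zeros(len(arrivals))
    for t in range(1, len(arrivals)):
        q[t] = max(0.0, q[t - 1] + arrivals[t - 1] - service_rate)
    return q

rng = np.random.default_rng(6)
phi1, n = 0.5, 50_000
# AR(1) arrival workload around a mean of 1.0 (the phi1 component alone;
# the paper's input additionally has a long-range dependent d component)
a = np.empty(n)
a[0] = 1.0
for t in range(1, n):
    a[t] = 1.0 + phi1 * (a[t - 1] - 1.0) + 0.2 * rng.standard_normal()

q = queue_lengths(a, service_rate=1.1)
# summary statistics of the kind the abstract studies
print(q.mean(), q.var(), np.quantile(q, 0.999))
```

Sweeping phi1 (and, with a fractional-differencing filter, d) and refitting these statistics against e^(c1 φ1) e^(c2 d) would reproduce the kind of experiment the abstract describes.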