Results 11–20 of 142
Colored noise and computational inference in neurophysiological (fMRI) time series analysis: resampling methods in time and wavelet domains
 Human Brain Mapping
, 2001
Abstract

Cited by 69 (6 self)
Abstract: Even in the absence of an experimental effect, functional magnetic resonance imaging (fMRI) time series generally demonstrate serial dependence. This colored noise or endogenous autocorrelation typically has disproportionate spectral power at low frequencies, i.e., its spectrum is 1/f-like. Various prewhitening and precoloring strategies have been proposed to make valid inference on standardised test statistics estimated by time series regression in this context of residually autocorrelated errors. Here we introduce a new method based on random permutation after orthogonal transformation of the observed time series to the wavelet domain. This scheme exploits the general whitening or decorrelating property of the discrete wavelet transform and is implemented using a Daubechies wavelet with four vanishing moments to ensure exchangeability of wavelet coefficients within each scale of decomposition. For 1/f-like or fractal noises, e.g., realisations of fractional Brownian motion (fBm) parameterised by Hurst exponent 0 < H < 1, this resampling algorithm exactly preserves wavelet-based estimates of the second order stochastic properties of the (possibly nonstationary) time series. Performance of the method is assessed empirically using 1/f-like noise simulated by ...
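The resampling scheme described in this abstract can be sketched in a few lines: transform to the wavelet domain, permute coefficients within each scale, and invert. The sketch below is a minimal illustration using the Haar wavelet rather than the Daubechies wavelet with four vanishing moments the paper uses; the function names (`haar_dwt`, `wavestrap`) are illustrative, not the authors'.

```python
import numpy as np

rng = np.random.default_rng(0)

def haar_dwt(x):
    """Full Haar DWT of a length-2^J signal: list of detail arrays (fine to
    coarse) plus the final approximation coefficient(s)."""
    approx = np.asarray(x, dtype=float)
    details = []
    while len(approx) > 1:
        even, odd = approx[0::2], approx[1::2]
        details.append((even - odd) / np.sqrt(2.0))
        approx = (even + odd) / np.sqrt(2.0)
    return details, approx

def haar_idwt(details, approx):
    """Invert haar_dwt exactly (the transform is orthonormal)."""
    a = approx
    for d in reversed(details):
        up = np.empty(2 * len(d))
        up[0::2] = (a + d) / np.sqrt(2.0)
        up[1::2] = (a - d) / np.sqrt(2.0)
        a = up
    return a

def wavestrap(x):
    """Surrogate series: permute wavelet coefficients within each scale, then
    invert. Energy per scale (hence second-order structure, as estimated in
    the wavelet domain) is preserved exactly."""
    details, approx = haar_dwt(x)
    shuffled = [rng.permutation(d) for d in details]
    return haar_idwt(shuffled, approx)
```

Because permutation only reorders coefficients within a scale, each surrogate has exactly the same wavelet energy at every scale as the original series, which is the property the abstract highlights.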
A Wavelet-Based Joint Estimator of the Parameters of Long-Range Dependence.
, 1998
Abstract

Cited by 60 (10 self)
A joint estimator is presented for the two parameters that define the long-range dependence phenomenon in the simplest case. The estimator is based on the coefficients of a discrete wavelet decomposition, improving a recently proposed wavelet-based estimator of the scaling parameter [4], as well as extending it to include the associated power parameter. An important feature is its conceptual and practical simplicity, consisting essentially in measuring the slope and the intercept of a linear fit after a discrete wavelet transform is performed, a very fast (O(n)) operation. Under well-justified technical idealisations the estimator is shown to be unbiased and of minimum or close to minimum variance for the scale parameter, and asymptotically unbiased and efficient for the second parameter. Through theoretical arguments and numerical simulations it is shown that in practice, even for small data sets, the bias is very small and the variance close to optimal for both parameters. Closed for...
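The "slope of a linear fit after a wavelet transform" idea can be made concrete with a short sketch. Assuming fractional Gaussian noise, the mean squared detail coefficient at octave j behaves like 2^(j(2H-1)), so a weighted fit of log2 of the per-scale means against j recovers H. Haar details, the weighting scheme (coefficient counts), and the names `detail_log_means`/`estimate_hurst` are all simplifying choices for illustration, not the paper's exact estimator.

```python
import numpy as np

def detail_log_means(x):
    """log2 of the mean squared Haar detail coefficients per octave,
    together with the number of coefficients at each octave."""
    approx = np.asarray(x, dtype=float)
    ys, ns = [], []
    while len(approx) >= 32:        # keep at least 16 details per scale
        even, odd = approx[0::2], approx[1::2]
        d = (even - odd) / np.sqrt(2.0)
        ys.append(np.log2(np.mean(d**2)))
        ns.append(len(d))
        approx = (even + odd) / np.sqrt(2.0)
    return np.array(ys), np.array(ns, dtype=float)

def estimate_hurst(x):
    """Weighted least-squares slope of log2(mu_j) against octave j;
    for fGn the slope is 2H - 1."""
    ys, w = detail_log_means(x)     # weight by coefficient count per scale
    js = np.arange(1, len(ys) + 1, dtype=float)
    W, Sx, Sy = w.sum(), (w * js).sum(), (w * ys).sum()
    Sxx, Sxy = (w * js * js).sum(), (w * js * ys).sum()
    slope = (W * Sxy - Sx * Sy) / (W * Sxx - Sx * Sx)
    return (slope + 1.0) / 2.0
```

For white noise the per-scale means are flat, so the slope is near 0 and the estimate is near H = 0.5, which is a quick sanity check.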
Fast parametric elastic image registration
 IEEE Transactions on Image Processing
, 2003
Abstract

Cited by 60 (4 self)
Abstract—We present an algorithm for fast elastic multidimensional intensity-based image registration with a parametric model of the deformation. It is fully automatic in its default mode of operation. In the case of hard real-world problems, it is capable of accepting expert hints in the form of soft landmark constraints. Far fewer landmarks are needed and the results are far superior compared to pure landmark registration. Particular attention has been paid to the factors influencing the speed of this algorithm. The B-spline deformation model is shown to be computationally more efficient than other alternatives. The algorithm has been successfully used for several two-dimensional (2-D) and three-dimensional (3-D) registration tasks in the medical domain, involving MRI, SPECT, CT, and ultrasound image modalities. We also present experiments in a controlled environment, permitting an exact evaluation of the registration accuracy. Test deformations are generated automatically using a random hierarchical fractional wavelet-based generator. Index Terms—Elastic registration, image registration, landmarks, splines.
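A B-spline deformation model of the kind this abstract refers to represents the displacement field as a sum of shifted cubic B-splines weighted by control-point coefficients. The 1-D sketch below shows only that parameterisation, not the paper's registration algorithm; `cubic_bspline`, `deform_1d`, and the knot spacing are illustrative names and choices.

```python
import numpy as np

def cubic_bspline(t):
    """Centered cubic B-spline beta^3(t); support is |t| < 2."""
    t = np.abs(np.asarray(t, dtype=float))
    out = np.zeros_like(t)
    m = t < 1
    out[m] = 2.0 / 3.0 - t[m]**2 + 0.5 * t[m]**3
    m = (t >= 1) & (t < 2)
    out[m] = (2.0 - t[m])**3 / 6.0
    return out

def deform_1d(x, coeffs, spacing):
    """Deformed coordinates g(x) = x + sum_k c_k * beta^3(x/spacing - k),
    with control points on a uniform knot grid of the given spacing."""
    x = np.asarray(x, dtype=float)
    disp = np.zeros_like(x)
    for k, c in enumerate(coeffs):
        disp += c * cubic_bspline(x / spacing - k)
    return x + disp
```

The computational appeal the abstract claims comes from the spline's compact support: each sample point is influenced by only four control points per dimension.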
Fast, Approximate Synthesis of Fractional Gaussian Noise for Generating Self-Similar Network Traffic
 ACM SIGCOMM, Computer Communication Review
, 1997
Abstract

Cited by 58 (2 self)
Recent network traffic studies argue that network arrival processes are much more faithfully modeled using statistically self-similar processes instead of traditional Poisson processes [LTWW94, PF95]. One difficulty in dealing with self-similar models is how to efficiently synthesize traces (sample paths) corresponding to self-similar traffic. We present a fast Fourier transform method for synthesizing approximate self-similar sample paths for one type of self-similar process, Fractional Gaussian Noise, and assess its performance and validity. We find that the method is as fast or faster than existing methods and appears to generate close approximations to true self-similar sample paths. We also discuss issues in using such synthesized sample paths for simulating network traffic, and how an approximation used by our method can dramatically speed up evaluation of Whittle's estimator for H, the Hurst parameter giving the strength of long-range dependence present in a self-similar time series.
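The FFT synthesis idea can be illustrated compactly: shape complex Gaussian noise by the square root of a target spectrum and invert with a real FFT. The sketch below uses the crude power-law approximation S(f) ∝ f^(1-2H) to the fGn spectrum rather than the paper's more careful spectral estimate, so it is a much rougher approximation than the published method; `synth_fgn` is an illustrative name.

```python
import numpy as np

rng = np.random.default_rng(1)

def synth_fgn(n, H):
    """Very rough approximate fGn of length n via spectral shaping:
    complex Gaussian noise times sqrt(S(f)) with S(f) ~ f^(1 - 2H)."""
    freqs = np.fft.rfftfreq(n)                       # 0, 1/n, ..., 0.5
    amp = np.zeros_like(freqs)
    amp[1:] = freqs[1:] ** ((1.0 - 2.0 * H) / 2.0)   # sqrt of the spectrum
    noise = rng.standard_normal(len(freqs)) + 1j * rng.standard_normal(len(freqs))
    spec = amp * noise
    spec[0] = 0.0                                    # zero-mean output
    x = np.fft.irfft(spec, n=n)
    return x / np.std(x)                             # normalise to unit variance
```

The cost is dominated by a single length-n inverse FFT, which is the source of the speed advantage the abstract describes over exact (e.g., Cholesky-based) synthesis.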
Self-Similarity and Long-Range Dependence Through the Wavelet Lens
, 2000
Abstract

Cited by 43 (7 self)
Self-similar and long-range dependent processes are the most important kinds of random processes possessing scale invariance. We describe how to analyze them using the discrete wavelet transform. We have chosen a didactic approach, useful to practitioners. Focusing on the Discrete Wavelet Transform, we describe the nature of the wavelet coefficients and their statistical properties. Pitfalls in understanding and key features are highlighted and we sketch some proofs to provide additional insight. The Logscale Diagram is introduced as a natural means to study scaling data and we show how it can be used to obtain unbiased semiparametric estimates of the scaling exponent. We then focus on the case of long-range dependence and address the problem of defining a lower cutoff scale corresponding to where scaling starts. We also discuss some related problems arising from the application of wavelet analysis to discrete time series. Numerical examples using many discrete time models are th...
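A Logscale Diagram of the kind described here is just the per-octave quantity y_j = log2(mean squared detail coefficients) plotted against octave j, with confidence intervals that widen at coarse scales where few coefficients remain. The sketch below uses Haar details and a simple delta-method Gaussian approximation for the interval widths; both are simplifications of the treatment in the paper, and `logscale_diagram` is an illustrative name.

```python
import numpy as np

def logscale_diagram(x):
    """Return (j, y_j, ci_j): octave indices, log2 mean squared Haar
    details, and approximate 95% CI half-widths assuming Gaussian data."""
    approx = np.asarray(x, dtype=float)
    js, ys, cis = [], [], []
    j = 1
    while len(approx) >= 8:
        even, odd = approx[0::2], approx[1::2]
        d = (even - odd) / np.sqrt(2.0)
        n = len(d)
        js.append(j)
        ys.append(np.log2(np.mean(d**2)))
        # delta method: Var[log2(mean of n squared Gaussians)] ~ 2/(n ln(2)^2)
        cis.append(1.96 * np.sqrt(2.0 / n) / np.log(2.0))
        approx = (even + odd) / np.sqrt(2.0)
        j += 1
    return np.array(js), np.array(ys), np.array(cis)
```

Choosing the lower cutoff scale then amounts to fitting a line only over the octaves j ≥ j1 where the (y_j, ci_j) points are consistent with linear behaviour.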
A Wavelet-Based Analysis of Fractal Image Compression
 IEEE Trans. Image Processing
, 1997
Abstract

Cited by 42 (2 self)
Why does fractal image compression work? What is the implicit image model underlying fractal block coding? How can we characterize the types of images for which fractal block coders will work well? These are the central issues we address. We introduce a new wavelet-based framework for analyzing block-based fractal compression schemes. Within this framework we are able to draw upon insights from the well-established transform coder paradigm in order to address the issue of why fractal block coders work. We show that fractal block coders of the form introduced by Jacquin [1] are a Haar wavelet subtree quantization scheme. We examine a generalization of this scheme to smooth wavelets with additional vanishing moments. The performance of our generalized coder is comparable to the best results in the literature for a Jacquin-style coding scheme. Our wavelet framework gives new insight into the convergence properties of fractal block coders, and leads us to develop an unconditionally convergen...
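The "Haar wavelet subtree" view rests on the 2-D Haar decomposition of image blocks into an approximation (LL) subband and three detail subbands whose coefficients form the subtrees. As a minimal, self-contained illustration of that decomposition only (not of the fractal coder itself), one level of the 2-D Haar transform can be written as follows; `haar2d_level` is an illustrative name.

```python
import numpy as np

def haar2d_level(img):
    """One level of the orthonormal 2-D Haar transform.
    Returns the (LL, LH, HL, HH) subbands of an even-sized image."""
    a = np.asarray(img, dtype=float)
    # transform rows: lowpass / highpass pairs
    lo = (a[:, 0::2] + a[:, 1::2]) / np.sqrt(2.0)
    hi = (a[:, 0::2] - a[:, 1::2]) / np.sqrt(2.0)
    # transform columns of each half
    ll = (lo[0::2, :] + lo[1::2, :]) / np.sqrt(2.0)
    lh = (lo[0::2, :] - lo[1::2, :]) / np.sqrt(2.0)
    hl = (hi[0::2, :] + hi[1::2, :]) / np.sqrt(2.0)
    hh = (hi[0::2, :] - hi[1::2, :]) / np.sqrt(2.0)
    return ll, lh, hl, hh
```

Because the transform is orthonormal, energy is preserved across the four subbands, which is what makes quantization arguments in the wavelet domain carry back to the pixel domain.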
Long-Range Dependence: revisiting Aggregation with Wavelets.
 Journal of Time Series Analysis
, 1998
Abstract

Cited by 40 (12 self)
The aggregation procedure is a natural way to analyse signals which exhibit long-range dependent features and has been used as a basis for estimation of the Hurst parameter, H. In this paper it is shown how aggregation can be naturally rephrased within the wavelet transform framework, being directly related to approximations of the signal in the sense of a Haar multiresolution analysis. A natural wavelet-based generalisation of traditional aggregation is then proposed: "a-aggregation". It is shown that a-aggregation cannot lead to good estimators of H, and so a new kind of aggregation, "d-aggregation", is defined, which is related to the details rather than the approximations of a multiresolution analysis. An estimator of H based on d-aggregation has excellent statistical and computational properties, whilst preserving the spirit of aggregation. The estimator is applied to telecommunications network data.
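Traditional aggregation, the starting point of this paper, replaces the series by non-overlapping block means X^(m) and exploits Var[X^(m)] ~ m^(2H-2) for long-range dependent data. A minimal sketch of that classical variance-time estimator (not the paper's d-aggregation estimator) might look like this; `aggregate` and `variance_time_hurst` are illustrative names.

```python
import numpy as np

def aggregate(x, m):
    """Non-overlapping block means at aggregation level m."""
    x = np.asarray(x, dtype=float)
    n = (len(x) // m) * m
    return x[:n].reshape(-1, m).mean(axis=1)

def variance_time_hurst(x, levels=(1, 2, 4, 8, 16, 32)):
    """Fit log Var[X^(m)] against log m; for LRD the slope is 2H - 2."""
    logs_m = np.log([float(m) for m in levels])
    logs_v = np.log([aggregate(x, m).var() for m in levels])
    slope = np.polyfit(logs_m, logs_v, 1)[0]
    return 1.0 + slope / 2.0
```

For independent data, Var[X^(m)] decays as 1/m, so the slope is -1 and the estimate returns to H = 0.5; the paper's point is that moving from block means (Haar approximations) to Haar details yields a far better-behaved estimator in the same spirit.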
On estimation of the wavelet variance
 Biometrika
, 1995
Abstract

Cited by 34 (4 self)
The wavelet variance provides a scale-based decomposition of the process variance for a time series or random field. It has seen increasing use in geophysics, astronomy, genetics, hydrology, medical imaging, oceanography, soil science, signal processing and texture analysis. In practice, however, data collected in the form of a time series or random field often suffer from various types of contamination. We discuss the difficulties and limitations of existing contamination models (pure replacement models, additive outliers, level shift models and innovation outliers that hide themselves in the original time series) for robust nonparametric estimates of second-order statistics. We then introduce a new model based upon the idea of scale-based multiplicative contamination. This model supposes that contamination can occur and affect data at certain scales and thus arises naturally in multiscale processes and in the wavelet variance context. For this new contamination model, we develop a full M-estimation theory for the wavelet variance and derive its large sample theory when the underlying time series or random field is Gaussian. Our approach treats the wavelet variance as a scale parameter and offers protection against contamination that operates additively on the log of squared wavelet coefficients and acts independently at different scales.
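The object being robustified here is simple to compute: the wavelet variance at scale j is the average squared wavelet coefficient at that scale. The sketch below contrasts the standard mean-of-squares estimate with a crude median-based robust alternative, rescaled by the median of a chi-square(1) variable so the two agree for Gaussian data. This is an illustration of the idea only, not the paper's M-estimator; `haar_details` and `wavelet_variances` are illustrative names.

```python
import numpy as np

def haar_details(x):
    """Haar detail coefficients per scale (decimated transform), fine to coarse."""
    approx = np.asarray(x, dtype=float)
    details = []
    while len(approx) >= 2:
        even, odd = approx[0::2], approx[1::2]
        details.append((even - odd) / np.sqrt(2.0))
        approx = (even + odd) / np.sqrt(2.0)
    return details

def wavelet_variances(x, robust=False):
    """Per-scale wavelet variance estimates. The robust variant uses a
    median of squares divided by the chi-square(1) median (~0.4549) so
    that it is consistent with the mean for Gaussian data."""
    out = []
    for d in haar_details(x):
        if robust:
            out.append(np.median(d**2) / 0.4549)
        else:
            out.append(np.mean(d**2))
    return np.array(out)
```

A handful of gross outliers inflates the mean-of-squares estimate at the fine scales but barely moves the median-based one, which is the kind of protection the contamination models in this abstract formalise.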
A Statistical Test for the Time Constancy of Scaling Exponents
 IEEE Transactions on Signal Processing
, 1999
Abstract

Cited by 34 (8 self)
A wavelet-based statistical test is described for distinguishing true time variation of the scaling exponent describing scaling behaviour from statistical fluctuations of estimates across time of a constant exponent. The test is applicable to diverse scaling phenomena, including long-range dependence and exactly self-similar processes, in a uniform framework, without the need for prior knowledge of the type in question. It is based on the special properties of wavelet-based estimates of the scaling exponent over adjacent blocks of data, strongly motivating an idealised inference problem: the equality or otherwise of means of independent Gaussian variables with known variances. A uniformly most powerful invariant (UMPI) test exists for this problem and is described. A separate UMPI test is also described for when the scaling exponent undergoes a level change. The power functions of the two tests are given explicitly and compared. Using simulation, the effect in practice of deviations from the ide...
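The idealised inference problem named in this abstract (equality of means of independent Gaussians with known variances) has a classical test statistic: the precision-weighted sum of squared deviations from the weighted mean, which under the null is chi-square with m-1 degrees of freedom for m blocks. A minimal sketch of that statistic, applied to per-block exponent estimates and their (assumed known) variances, follows; `constancy_statistic` is an illustrative name, and this is the generic statistic rather than the paper's full procedure.

```python
import numpy as np

def constancy_statistic(estimates, variances):
    """Test statistic for H0: all block-wise exponent estimates share one mean.
    Returns (statistic, degrees of freedom); under H0 with Gaussian estimates
    of known variance, the statistic is chi-square with m - 1 dof."""
    h = np.asarray(estimates, dtype=float)
    w = 1.0 / np.asarray(variances, dtype=float)   # precisions as weights
    hbar = np.sum(w * h) / np.sum(w)               # precision-weighted mean
    stat = np.sum(w * (h - hbar) ** 2)
    return stat, len(h) - 1
```

Rejecting when the statistic exceeds the chi-square(m-1) quantile then flags genuine time variation of the exponent, as opposed to ordinary estimation scatter.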