Denoising Nonlinear Time Series by Adaptive Filtering and Wavelet Shrinkage: A Comparison
Abstract

Cited by 6 (5 self)
Abstract—Time series measured in the real world are often nonlinear, even chaotic. To effectively extract desired information from measured time series, it is important to preprocess the data to reduce noise. In this Letter, we propose an adaptive denoising algorithm. Using chaotic Lorenz data and calculating the root-mean-square error, Lyapunov exponent, and correlation dimension, we show that our adaptive algorithm reduces noise in the chaotic Lorenz system more effectively than wavelet denoising with three different thresholding choices. We further analyze an electroencephalogram (EEG) signal in sleep apnea and show that the adaptive algorithm again more effectively reduces the electrocardiogram (ECG) and other types of noise contaminating the EEG than the wavelet approaches. Index Terms—Adaptive denoising algorithm, EEG signal, Lorenz, wavelet.
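The wavelet-shrinkage baseline that this abstract compares against can be illustrated with a minimal sketch: a one-level Haar transform whose detail coefficients are soft-thresholded, then inverted. This is an assumption-laden stand-in (NumPy only, single decomposition level, hand-picked threshold), not the paper's multi-level method or its adaptive algorithm.

```python
import numpy as np

def haar_soft_denoise(x, threshold):
    """One-level Haar wavelet shrinkage: transform, soft-threshold
    the detail coefficients, invert. A minimal stand-in for the
    multi-level wavelet denoising the abstract compares against."""
    a = (x[0::2] + x[1::2]) / np.sqrt(2)   # approximation coefficients
    d = (x[0::2] - x[1::2]) / np.sqrt(2)   # detail coefficients
    d = np.sign(d) * np.maximum(np.abs(d) - threshold, 0.0)  # soft threshold
    y = np.empty_like(x)
    y[0::2] = (a + d) / np.sqrt(2)         # inverse Haar transform
    y[1::2] = (a - d) / np.sqrt(2)
    return y

rng = np.random.default_rng(0)
t = np.linspace(0, 4 * np.pi, 1024)
clean = np.sin(t)                          # smooth "signal" (Lorenz data in the paper)
noisy = clean + 0.3 * rng.standard_normal(t.size)

denoised = haar_soft_denoise(noisy, threshold=0.3)
rmse = lambda a, b: np.sqrt(np.mean((a - b) ** 2))
print(rmse(noisy, clean), rmse(denoised, clean))
```

The paper's comparison hinges on the threshold choice: soft thresholding shrinks every detail coefficient toward zero, so noise in the details is suppressed while the smooth signal, whose detail coefficients are small, is nearly untouched.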
CONSISTENCY OF SUPPORT VECTOR MACHINES FOR FORECASTING THE EVOLUTION OF AN UNKNOWN ERGODIC DYNAMICAL SYSTEM FROM OBSERVATIONS WITH UNKNOWN NOISE
, 2007
Abstract
We consider the problem of forecasting the next (observable) state of an unknown ergodic dynamical system from a noisy observation of the present state. Our main result shows, for example, that support vector machines (SVMs) using Gaussian RBF kernels can learn the best forecaster from a sequence of noisy observations if (a) the unknown observational noise process is bounded and has a summable α-mixing rate and (b) the unknown ergodic dynamical system is defined by a Lipschitz continuous function on some compact subset of R^d and has a summable decay of correlations for Lipschitz continuous functions. In order to prove this result we first establish a general consistency result for SVMs and all stochastic processes that satisfy a mixing notion that is substantially weaker than α-mixing. Let us assume that we have an ergodic dynamical system described by the sequence (F^n)_{n≥0} of iterates of an (essentially) unknown map F : M → M, where M ⊂ R^d is compact and the corresponding ergodic measure µ is assumed to be unique. Furthermore, assume that all observations x̃ of this dynamical system are corrupted by some stationary, R^d-valued, additive noise process E = (ε_n)_{n≥0} whose distribution ν we assume to be independent of the state, but otherwise unknown, too. In other words, all possible observations of the system at time n ≥ 0 are of the form (1) x̃_n = F^n(x_0) + ε_n, where x_0 is a true but unknown state at time 0. Now, given an observation of the system at some arbitrary time, our goal is to forecast the next observable …
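The setting above can be sketched in code: an ergodic map iterated from an unknown initial state, observations corrupted by bounded state-independent noise as in (1), and an RBF-kernel SVM trained to forecast the next observation from the current one. This is a hedged illustration, not the paper's construction: it assumes scikit-learn's SVR is available, uses the logistic map F(x) = 4x(1−x) as the ergodic system, and picks hyperparameters by hand.

```python
import numpy as np
from sklearn.svm import SVR  # SVM regression with a Gaussian RBF kernel

rng = np.random.default_rng(1)

def F(x):
    """Logistic map: a simple ergodic dynamical system on [0, 1]."""
    return 4.0 * x * (1.0 - x)

# Trajectory x_n = F^n(x_0), corrupted by bounded additive noise
# as in (1): observations are state-independent noisy states.
n = 1200
x = np.empty(n)
x[0] = 0.3
for i in range(1, n):
    x[i] = F(x[i - 1])
noisy = x + rng.uniform(-0.05, 0.05, size=n)

# Learn the one-step forecaster noisy[i] -> noisy[i+1] with an RBF SVM.
X_train, y_train = noisy[:999].reshape(-1, 1), noisy[1:1000]
svr = SVR(kernel="rbf", C=10.0, gamma="scale", epsilon=0.01)
svr.fit(X_train, y_train)

# Evaluate against the clean next states on held-out data.
X_test = noisy[1000:-1].reshape(-1, 1)
pred = svr.predict(X_test)
err = np.mean(np.abs(pred - x[1001:]))
print(f"mean absolute forecast error: {err:.3f}")
```

Note that the learned forecaster cannot beat the noise floor: even the best predictor of the next *observation* carries the observational noise, which is why the paper phrases consistency in terms of learning the best forecaster rather than recovering F exactly.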