Results 1–10 of 45
Time Series Prediction by Using a Connectionist Network with Internal Delay Lines
 Time Series Prediction
, 1994
Abstract
Cited by 62 (4 self)
A neural network architecture, which models synapses as Finite Impulse Response (FIR) linear filters, is discussed for use in time series prediction. Analysis and methodology are detailed in the context of the Santa Fe Institute Time Series Prediction Competition. Results of the competition show that the FIR network performed remarkably well on a chaotic laser intensity time series. 1 Introduction. The goal of time series prediction or forecasting can be stated succinctly as follows: given a sequence y(1), y(2), ..., y(N) up to time N, find the continuation y(N+1), y(N+2), .... The series may arise from the sampling of a continuous-time system, and be either stochastic or deterministic in origin. The standard prediction approach involves constructing an underlying model which gives rise to the observed sequence. In the oldest and most studied method, which dates back to Yule [1], a linear autoregression (AR) is fit to the data: y(k) = sum_{n=1}^{T} a(n) y(k-n) + e(k) = ŷ(k) + e(k), where ŷ(k) is the prediction and e(k) the residual error.
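The autoregression in the abstract above can be fit by ordinary least squares on lagged values. A minimal NumPy sketch (the series, order T, and coefficients here are illustrative, not from the paper):

```python
import numpy as np

def fit_ar(y, T):
    """Fit AR coefficients a(1..T) by least squares:
    y(k) ~ sum_{n=1}^{T} a(n) * y(k-n)."""
    # Design matrix: column n-1 holds y(k-n) for k = T .. N-1.
    X = np.column_stack([y[T - n:len(y) - n] for n in range(1, T + 1)])
    a, *_ = np.linalg.lstsq(X, y[T:], rcond=None)
    return a

def predict_next(y, a):
    """One-step-ahead prediction ŷ(N+1) from the last T observations."""
    T = len(a)
    return float(np.dot(a, y[-1:-T - 1:-1]))

# Example: recover the coefficients of a synthetic AR(2) process.
rng = np.random.default_rng(0)
y = np.zeros(500)
for k in range(2, 500):
    y[k] = 0.6 * y[k - 1] - 0.3 * y[k - 2] + 0.1 * rng.standard_normal()
a = fit_ar(y, T=2)
```

The recovered `a` should land near (0.6, -0.3); the residual e(k) is whatever the linear model cannot explain.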
Spatio-Spectral Filters for Improving the Classification of Single Trial EEG
 IEEE Trans. Biomed. Eng
, 2005
Abstract
Cited by 45 (13 self)
Data recorded in EEG-based Brain-Computer Interface experiments is generally very noisy, nonstationary, and contaminated with artifacts that can deteriorate discrimination/classification methods. In this work we extend the Common Spatial Pattern (CSP) algorithm with the aim of alleviating these adverse effects. In particular, we suggest an extension of CSP to the state space, which utilizes the method of time delay embedding. As we will show, this allows for individually tuned frequency filters at each electrode position and thus yields an improved and more robust machine learning procedure. The advantages of the proposed method over the original CSP method are verified in terms of an improved information transfer rate (bits per trial) on a set of EEG recordings from experiments of imagined limb movements.
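The two ingredients named in the abstract can be sketched compactly: time-delay embedding stacks delayed copies of each channel, so that a spatial filter over the embedded array acts as an FIR filter per electrode; CSP then jointly diagonalises the two class covariances. This is the textbook CSP formulation in plain NumPy, not the paper's code, and the array shapes are illustrative:

```python
import numpy as np

def delay_embed(X, delays):
    """Stack delayed copies of a channels-by-samples array X.
    With len(delays) lags, each channel contributes len(delays)+1 rows."""
    C, S = X.shape
    d = max(delays)
    taus = [0] + sorted(delays)
    return np.vstack([X[:, d - tau:S - tau] for tau in taus])

def csp_filters(Xa, Xb):
    """Common Spatial Patterns via whitening plus eigendecomposition:
    extreme columns of W maximise variance for one class while
    minimising it for the other."""
    Ca, Cb = np.cov(Xa), np.cov(Xb)
    vals, U = np.linalg.eigh(Ca + Cb)
    P = U / np.sqrt(vals)          # whitens the summed covariance
    _, V = np.linalg.eigh(P.T @ Ca @ P)
    return P @ V                   # columns ordered by eigenvalue
```

Applying `csp_filters` to `delay_embed`-ed trials rather than raw trials is what moves CSP "to the state space" in the sense described above.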
Computational mechanics: Pattern and prediction, structure and simplicity
 Journal of Statistical Physics
, 1999
Abstract
Cited by 43 (8 self)
Computational mechanics, an approach to structural complexity, defines a process's causal states and gives a procedure for finding them. We show that the causal-state representation—an ε-machine—is the minimal one consistent with
Equations of motion from a data series
 Complex Systems
, 1987
Abstract
Cited by 41 (14 self)
Abstract. Temporal pattern learning, control and prediction, and chaotic data analysis share a common problem: deducing optimal equations of motion from observations of time-dependent behavior. Each desires to obtain models of the physical world from limited information. We describe a method to reconstruct the deterministic portion of the equations of motion directly from a data series. These equations of motion represent a vast reduction of a chaotic data set's observed complexity to a compact, algorithmic specification. This approach employs an informational measure of model optimality to guide searching through the space of dynamical systems. As corollary results, we indicate how to estimate the minimum embedding dimension, extrinsic noise level, metric entropy, and Lyapunov spectrum. Numerical and experimental applications demonstrate the method's feasibility and limitations. Extensions to estimating parametrized families of dynamical systems from bifurcation data and to spatial pattern evolution are presented. Applications to predicting chaotic data and the design of forecasting, learning, and control systems are discussed.
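For a one-dimensional map, the deterministic portion of the dynamics can be recovered by regressing the next value on basis functions of the current one. A toy sketch under strong assumptions (noise-free logistic-map data, a fixed polynomial basis; the paper's method additionally searches model space with an informational criterion):

```python
import numpy as np

# Generate data from the logistic map x' = r x (1 - x).
r = 3.9
x = np.empty(400)
x[0] = 0.3
for k in range(399):
    x[k + 1] = r * x[k] * (1.0 - x[k])

# Regress x(k+1) on the polynomial basis {1, x, x^2} of x(k).
A = np.vander(x[:-1], 3, increasing=True)    # columns: 1, x, x^2
coef, *_ = np.linalg.lstsq(A, x[1:], rcond=None)
# For deterministic data the recovered coefficients are (0, r, -r).
```

With observational noise the regression returns only the deterministic portion, which is exactly the separation the abstract describes.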
Finite Impulse Response Neural Networks for Autoregressive Time Series Prediction
, 1993
Abstract
Cited by 40 (3 self)
A neural network architecture, which models synapses as Finite Impulse Response (FIR) linear filters, is discussed for use in time series prediction. Analysis and methodology are detailed in the context of the Santa Fe Institute Time Series Prediction Competition. Results of the competition show that the FIR network performed remarkably well on a chaotic laser intensity time series. 1 Introduction. The goal of time series prediction or forecasting can be stated succinctly as follows: given a sequence y(1), y(2), ..., y(N) up to time N, find the continuation y(N+1), y(N+2), .... The series may arise from the sampling of a continuous-time system, and be either stochastic or deterministic in origin. The standard prediction approach involves constructing an underlying model which gives rise to the observed sequence. In the oldest and most studied method, which dates back to Yule [1], a linear autoregression (AR) is fit to the data: y(k) = sum_{n=1}^{T} a(n) y(k-n) + e(k) = ŷ(k) + e(k), where ŷ(k) is the prediction and e(k) the residual error.
Some extensions of radial basis functions and their applications in artificial intelligence
 Computers Math. Applic
, 1992
Abstract
Cited by 22 (2 self)
In recent years approximation theory has found interesting applications in the fields of Artificial Intelligence and Computer Science. For instance, a problem that fits very naturally in the framework of approximation theory is the problem of learning to perform a particular task from a set of examples. The examples are sparse data points in a
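Learning from examples as approximation can be made concrete with plain radial basis functions: place one Gaussian at each example and solve a linear system so the interpolant passes through every data point. A minimal sketch, assuming a Gaussian basis and a hand-picked width `sigma` (both illustrative choices, not the paper's extensions):

```python
import numpy as np

def rbf_interpolate(centers, values, query, sigma=1.0):
    """Exact Gaussian-RBF interpolation: solve for weights w with
    sum_j w_j * phi(|x_i - x_j|) = f(x_i) at every example x_i."""
    def phi(a, b):
        d = a[:, None, :] - b[None, :, :]          # pairwise differences
        return np.exp(-np.sum(d * d, axis=-1) / (2 * sigma ** 2))
    w = np.linalg.solve(phi(centers, centers), values)
    return phi(query, centers) @ w
```

Between the sparse examples the interpolant fills in smoothly, which is the sense in which a learned task is an approximation problem.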
Predicting the Stock Market
, 1998
Abstract
Cited by 16 (1 self)
This paper presents a tutorial introduction to prediction of stock time series. The various approaches of technical and fundamental analysis are presented, and the prediction problem is formulated as a special case of inductive learning. The problems with performance evaluation of near-random-walk processes are illustrated with examples, together with guidelines for avoiding the risk of data-snooping. The connections to concepts like "the bias-variance dilemma", overtraining, and model complexity are further covered. Existing benchmarks and testing metrics are surveyed and some new measures are introduced.
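The evaluation pitfall with near-random-walk series is that an impressive-looking error can still lose to the trivial predictor. A common safeguard (my illustration, not a measure from the paper) is to always report the baseline RMSE of predicting "tomorrow equals today":

```python
import numpy as np

def naive_rmse(y):
    """RMSE of the random-walk baseline y_hat(k+1) = y(k).
    A model that cannot beat this on held-out data has learned
    nothing beyond the walk itself."""
    return float(np.sqrt(np.mean(np.diff(y) ** 2)))
```

Comparing every candidate model against this single number, on data never touched during model selection, is the simplest defence against data-snooping.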
Observing Complexity and the Complexity of Observation
, 1993
Cited by 13 (3 self)
Reasoning About Sensor Data for Automated System Identification
 In Advances in Intelligent Data
, 1998
Abstract
Cited by 11 (7 self)
The computer program pret automatically constructs mathematical models of physical systems. A critical part of this task is automating the processing of sensor data. pret's intelligent data analyzer uses geometric reasoning to infer qualitative information from quantitative data; if critical variables are either unknown or cannot be measured, it uses delay-coordinate embedding to reconstruct the internal dynamics from the external sensor measurements. Successful modeling results for two sensor-equipped systems, a driven pendulum and a radio-controlled car, demonstrate the effectiveness of these techniques.
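Delay-coordinate embedding itself is a one-liner once the dimension and lag are chosen: each reconstructed state is a window of lagged scalar measurements. A minimal sketch (the dimension `dim` and lag `tau` are assumed given; choosing them well is the hard part that tools like pret automate):

```python
import numpy as np

def takens_embed(y, dim, tau):
    """Delay-coordinate reconstruction of a scalar series:
    row k = (y(k), y(k+tau), ..., y(k+(dim-1)*tau))."""
    n = len(y) - (dim - 1) * tau
    return np.column_stack([y[i * tau:i * tau + n] for i in range(dim)])
```

Under Takens' theorem, for a generic observable and large enough `dim`, these rows trace out a faithful copy of the system's internal dynamics.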
Unreconstructible At Any Radius
, 1992
Abstract
Cited by 11 (7 self)
Modeling pattern data series with cellular automata fails for a wide range of deterministic nonlinear spatial processes. If the latter have finite spatially local memory, reconstructed cellular automata with infinite radius may be required. In some cases, even this is not adequate: an irreducible stochasticity remains on the shortest time scales. The underlying problem is illustrated and quantitatively analyzed using an alternative model class called cellular transducers.