### Citations

5729 | Neural networks: a comprehensive foundation
- Haykin
- 1999
Citation Context ...of the system, to use them when prediction of the time series is taking place. Training of all SRNN is performed using the algorithm "Real-Time Recurrent Learning based on extended Kalman filter (RTRL-EKF)" [16]. This algorithm contains two parts: gradient estimation and weight adjustment. The first part is done using the Real-Time Recurrent Learning algorithm proposed by Williams and Zipser [32]; the second part is ...

3446 | A theory for multiresolution signal decomposition: the wavelet representation
- Mallat
- 1989
Citation Context ...-dimensional representation: time and frequency [1]. In this work, a multi-scale decomposition of the training signal is performed using the sub-band coding algorithm of the Discrete Wavelet Transform [22]. This algorithm uses a filter bank to analyze a discrete signal x(t). This bank is made of low-pass L(z) and high-pass H(z) filters, separating the frequency content of the input signal into spectral bands...
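The sub-band coding step described here (low-pass/high-pass filtering followed by downsampling by two) can be sketched as follows. This is an illustrative one-level decomposition with Haar filters in NumPy, not the Daubechies 'db10' filter bank the paper actually uses:

```python
import numpy as np

def dwt_step(x, lo, hi):
    """One level of sub-band coding: filter with L(z)/H(z), then downsample by 2."""
    a = np.convolve(x, lo)[1::2]   # approximation: low-frequency band
    d = np.convolve(x, hi)[1::2]   # detail: high-frequency band
    return a, d

# Haar analysis filters (illustrative choice)
lo = np.array([1.0, 1.0]) / np.sqrt(2.0)   # low-pass L(z)
hi = np.array([1.0, -1.0]) / np.sqrt(2.0)  # high-pass H(z)

x = np.array([4.0, 6.0, 10.0, 12.0, 8.0, 6.0, 5.0, 5.0])
a, d = dwt_step(x, lo, hi)   # each band has half the samples of x
```

Iterating `dwt_step` on successive approximation bands `a` yields the multi-scale decomposition the context refers to.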

745 | Nonlinear time series analysis
- Kantz, Schreiber
- 2003
Citation Context ...d the "reduced set" in such competition. In order to determine if these series were chaotic, the maximum Lyapunov Exponent (LE) of each one was calculated using the method proposed by Sano and Sawada [18]. Table 1 shows the maximum LE of each time series; notice that all are positive, an indication of chaos. Fig. 5. 735 points of the time series sumsin(). Table 1. Maximum LE of reduced set series NN5 [...
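As a reminder of what a positive maximum LE means, for a map with a known derivative the exponent can be estimated by averaging log|f'(x)| along an orbit. A minimal illustration with the logistic map (this is not the Sano-Sawada method, which estimates the LE from observed data via local linear fits):

```python
import math

def logistic_le(r=4.0, x0=0.2, n=50000, burn=1000):
    """Estimate the maximum Lyapunov exponent of the logistic map
    x -> r*x*(1-x) as the orbit average of log|f'(x)| = log|r*(1-2x)|."""
    x = x0
    for _ in range(burn):          # discard the transient
        x = r * x * (1.0 - x)
    total = 0.0
    for _ in range(n):
        total += math.log(abs(r * (1.0 - 2.0 * x)))
        x = r * x * (1.0 - x)
    return total / n
```

For r = 4 the exact value is ln 2 ≈ 0.693; a positive estimate is exactly the "indication of chaos" mentioned in the context.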

524 | A learning algorithm for continually running fully recurrent neural networks
- Williams, Zipser
- 1989
Citation Context ...filter (RTRL-EKF)" [16]. This algorithm contains two parts: gradient estimation and weight adjustment. The first part is done using the Real-Time Recurrent Learning algorithm proposed by Williams and Zipser [32]; the second part is done using an extended Kalman filter. RTRL-EKF has a complexity of O(n^4), where n is the number of neurons in the neural network [12]. 2.3 Phase 3: Training the HWRN. After training al...

248 | Forecasting: methods and applications
- Makridakis, Wheelwright, et al.
- 1978
Citation Context ...techniques used to build predictors; they may be linear or non-linear, statistical or based on computational or artificial intelligence. For example, ARMA, ARIMA and Kalman filters are linear methods [21]; k-nearest neighbors, genetic algorithms and artificial neural networks are examples of non-linear methods. Only non-linear methods are useful to forecast non-linear time series. The use of fully-con...

231 | Wavelet networks
- Zhang, Benveniste
- 1992
Citation Context ...30,31] among others. From the vast number of strategies used to improve the long-term prediction ability of neural networks, Wavelet Theory is used either to modify neuron architectures (for example [2,6,12,31,33]) or as a pre-processing aid applied to training data (for example [11,26,28,29]). When wavelet theory is used to modify the neuron architecture, normally it is done using...

141 | The Illustrated Wavelet Transform Handbook, Institute of Physics Publishing
- Addison
- 2002
Citation Context ...time series with enough information of the dynamical behavior in order to be trained. Such time series may contain integer or real values and the magnitude of each element must be scaled to the interval [0,1]. This is required in order to use sigmoid transfer functions for the nodes in the network. To achieve this, the time series may be normalized or linearly scaled; in this research a linear scale trans...

74 | Recurrent Neural Networks for Prediction: Learning Algorithms, Architectures and Stability
- Mandic, Chambers
- 2001
Citation Context ...methods are useful to forecast non-linear time series. The use of fully-connected, recurrent neural networks for long-term prediction of highly-dynamical or chaotic time series has been deeply studied [23]. In spite of the powerful capabilities of these models to represent dynamical systems, their practical use is still limited, due to constraints found in defining an optimal number of hidden nodes for...

34 | Dynamic recurrent neural networks
- Pearlmutter
- 1990
Citation Context ...sigmoid for all layers except the output layer, for which the transfer function is linear. In order to be solved, equation 3 may be approximated as [27]: y_i(t+Δt) = (1−Δt)·y_i(t) + Δt·σ(x_i(t)) + Δt·I_i(t) (5), for a small Δt, where: x_i(t) = Σ_{j=1..m} y_j(t)·w_ji (6). For the results reported here, initial conditions of each node, y_i(t=0), are set as small ran...
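Equations (5)-(6) amount to one explicit-Euler step of equation 3. A minimal NumPy sketch, assuming a logistic sigmoid, zero external inputs, and illustrative random weights (not the paper's trained network):

```python
import numpy as np

def euler_step(y, W, I, dt=0.01):
    """One step of eq. (5): y_i(t+dt) = (1-dt)*y_i(t) + dt*sigma(x_i(t)) + dt*I_i(t),
    with eq. (6): x_i(t) = sum_j y_j(t) * w_ji."""
    x = y @ W                          # x_i = sum_j y_j * W[j, i], taking W[j, i] = w_ji
    sigma = 1.0 / (1.0 + np.exp(-x))   # logistic sigmoid transfer function
    return (1.0 - dt) * y + dt * sigma + dt * I

rng = np.random.default_rng(0)
m = 4                                   # number of neurons (illustrative)
W = rng.normal(scale=0.5, size=(m, m))  # weights w_ji (illustrative, untrained)
I = np.zeros(m)                         # external inputs, zero for this sketch
y = rng.uniform(-0.1, 0.1, size=m)      # small random initial conditions y_i(0)
for _ in range(1000):
    y = euler_step(y, W, I)
```

Because σ is bounded in (0, 1), the discretized states stay bounded for small Δt, which is what makes this simple scheme usable during training.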

21 | Time-series prediction using a local linear wavelet neural network
- Chen, Yang, et al.
Citation Context ...30,31] among others. From the vast number of strategies used to improve the long-term prediction ability of neural networks, Wavelet Theory is used either to modify neuron architectures (for example [2,6,12,31,33]) or as a pre-processing aid applied to training data (for example [11,26,28,29]). When wavelet theory is used to modify the neuron architecture, normally it is done using...

17 | Chaotic time series Part II: System identification and prediction. Modelling, Identification and control
- Lillekjendlie, Kugiumtzis, et al.
- 1994
Citation Context ...non-stationary, extremely sensitive to initial conditions of the system and contains at least one positive Lyapunov Exponent [15]. It is claimed that chaotic time series may only be short-term predicted [20]. Even so, in some cases it is possible to approximate a dynamical model with characteristics similar to those found in the nonlinear time series and to use it for long-term prediction. There are m...

17 | On the use of the wavelet decomposition for time series prediction
- Soltani
- 2002
Citation Context ...long-term prediction ability of neural networks, Wavelet Theory is used either to modify neuron architectures (for example [2,6,12,31,33]) or as a pre-processing aid applied to training data (for example [11,26,28,29]). When wavelet theory is used to modify the neuron architecture, normally it is done using a wavelet function as the activation functi...

15 | Time series prediction with recurrent neural networks trained by a hybrid PSO-EA algorithm. Neurocomputing 70(13-15
- Cai, Zhang, et al.
- 2007
Citation Context ...information of the dynamical system. The last layer acts as a function-approximator builder. The output of each node i at HWRN and SRNN is defined as: dy_i/dt = −y_i + σ(x_i) + I_i (3), where: x_i = Σ_{j=1..m} y_j·w_ji (4) represents the inputs to the i-th neuron coming from other m neurons, I_i is an external input to the i-th neuron, w_ji is the weight connecting neuron i to neuron j, and σ(x) is the node's transfer function; ...

14 | Another look at forecast-accuracy metrics for intermittent demand
- Hyndman
- 2006
Citation Context ...MSE = (1/n)·Σ_{t=1..n} (x_t − x̂_t)² (8). The "Symmetrical Mean Absolute Percentage Error" is scale-independent; therefore it is frequently used to compare performances when different time series are involved [17]. This is the official metric used by the "NN5 forecasting competition for artificial neural networks & computational Intelligence" [8]. SMAPE is defined as: ...
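Equation (8) is the standard Mean Squared Error; a direct transcription for plain Python sequences:

```python
def mse(x, x_hat):
    """Mean Squared Error, eq. (8): (1/n) * sum of (x_t - x_hat_t)^2."""
    n = len(x)
    return sum((a - b) ** 2 for a, b in zip(x, x_hat)) / n
```

Unlike SMAPE, the MSE is scale-dependent, which is why the context notes that a scale-independent metric is preferred when comparing across different time series.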

10 | Computational Intelligence in Time Series Forecasting: Theory and Engineering Applications
- Palit, Popović
- 2005
Citation Context ...defined as an ordered sequence of values observed from a measurable phenomenon: x_1, x_2, ..., x_n; such observations are sensed at uniform time intervals and may be represented as integer or real numbers [25]. Once defined, an approximated model may be used to predict the trend of the system behavior or to predict as many specific values of the time series as desired. As usual, such a model will be just as ...

5 | Time series prediction using chaotic neural networks on the CATS benchmark. Neurocomputing 70(13-15
- Beliaev, Kozma
- 2007
Citation Context ...is able to memorize time information of the dynamical system. The last layer acts as a function-approximator builder. The output of each node i at HWRN and SRNN is defined as: dy_i/dt = −y_i + σ(x_i) + I_i (3), where: x_i = Σ_{j=1..m} y_j·w_ji (4) represents the inputs to the i-th neuron coming from other m neurons, I_i is an external input to the i-th neuron, w_ji is the weight connecting neuron i to neuron j, and σ(x) i...

5 | Spiral Recurrent Neural Network for Online
- Gao, Kriegel
Citation Context ...long-term prediction ability of neural networks, Wavelet Theory is used either to modify neuron architectures (for example [2,6,12,31,33]) or as a pre-processing aid applied to training data (for example [11,26,28,29]). When wavelet theory is used to modify the neuron architecture, normally it is done using a wavelet function as the activation functi...

5 | Fast bootstrap applied to LSSVM for long term prediction of time series
- Lendasse, Wertz, et al.
- 2004
Citation Context ...or short-term prediction occurs when several past values are used to predict the next unknown value of the time series. If no exogenous variables are considered, one-step prediction may be defined as [19]: x_{t+1} = φ(x_t, x_{t−1}, x_{t−2}, ..., x_{t−p}) (1), where φ is an approximation function used to predict. Similarly, long-term prediction may be defined as: x_{t+h}, ..., x_{t+2}, x_{t+1} = φ(x_t, x_{t−1}, x_{t−2}, ..., x_{t−p}) (2), where h ...
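Long-term (h-step) prediction in the sense of equation (2) is commonly realized by iterating the one-step predictor of equation (1), feeding each prediction back into the input window. A sketch, where `phi` stands for any trained one-step model (a hypothetical placeholder, not a function from the paper):

```python
def predict_long_term(history, phi, p, h):
    """Iterated one-step prediction, eq. (2): obtain h future values by
    repeatedly applying the one-step predictor phi of eq. (1) to a window
    of the last p+1 known values, feeding each prediction back as input."""
    window = list(history[-(p + 1):])
    predictions = []
    for _ in range(h):
        nxt = phi(window)                # x_{t+1} = phi(x_t, ..., x_{t-p})
        predictions.append(nxt)
        window = window[1:] + [nxt]      # slide the window over the prediction
    return predictions
```

With a toy linear-extrapolation `phi = lambda w: 2*w[-1] - w[-2]`, `predict_long_term([1, 2, 3], phi, 1, 3)` continues the trend for three steps; with a recurrent network as `phi`, this is exactly the recursive prediction scheme the experiments use.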

3 | Lattice Dynamical Wavelet Neural Networks Implemented Using Particle Swarm Optimization for Spatio-Temporal System Identification
- Wei, Billings, et al.
- 2009
Citation Context ...30,31] among others. From the vast number of strategies used to improve the long-term prediction ability of neural networks, Wavelet Theory is used either to modify neuron architectures (for example [2,6,12,31,33]) or as a pre-processing aid applied to training data (for example [11,26,28,29]). When wavelet theory is used to modify the neuron architecture, normally it is done using...

2 | Learning and approximation of chaotic time series using wavelet networks
- Alarcon-Aquino, Garcia-Treviño, et al.
- 2005

2 | Short term chaotic time series prediction using symmetric LS-SVM regression
- Espinoza, Suykens, et al.
- 2005
Citation Context ...point out that SMAPE cannot be applied over time series with negative values. Another popular metric is the "Mean Absolute Scaled Error," defined as: MASE = (1/n)·Σ_{t=1..n} |x_t − x̂_t| / ((1/(n−1))·Σ_{i=2..n} |x_i − x_{i−1}|) (10), where x_t is the original time series and x̂_t is the predicted time series. 4 Experiments and Results. The proposed architecture and training scheme were tested using two benchmark time series; they ar...
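Equation (10) scales the forecast's mean absolute error by the mean absolute error of the one-step naive forecast on the same series, so values below 1 beat the naive predictor. A direct transcription:

```python
def mase(x, x_hat):
    """Mean Absolute Scaled Error, eq. (10): the mean absolute error of the
    forecast, scaled by the mean absolute one-step naive-forecast error."""
    n = len(x)
    mae = sum(abs(a - b) for a, b in zip(x, x_hat)) / n
    naive = sum(abs(x[i] - x[i - 1]) for i in range(1, n)) / (n - 1)
    return mae / naive
```

Unlike SMAPE, MASE remains well-defined for series with negative values, which is why it is listed here as the complementary metric.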

2 | Experiments with a Hybrid-Complex Neural Networks for Long Term Prediction of Electrocardiograms
- Gomez-Gil, Ramirez-Cortes
Citation Context ...series is of particular interest in this research. A chaotic time series is non-stationary, extremely sensitive to initial conditions of the system and contains at least one positive Lyapunov Exponent [15]. It is claimed that chaotic time series may only be short-term predicted [20]. Even so, in some cases it is possible to approximate a dynamical model with characteristics similar to those found in...

1 | Cernansky's homepage downloads (2008), http://www2.fiit.stuba.sk/~cernans/main/download.html (last accessed
- Cernansky
- 2009
Citation Context ...forecasting competition for artificial neural networks & computational Intelligence" [8]. The architecture was implemented using Matlab V7.4, C++, and public libraries for the training algorithm available at [5]. For both cases, four reconstructed signals were generated using the DWT with the wavelet function Daubechies 'db10' available in Matlab. Three of the reconstructed signals were selected using the strategy d...

1 | The impact of preprocessing on support vector regression and neural networks in time series prediction
- Crone, Guajardo, et al.
- 2006
Citation Context ...the nodes in the network. To achieve this, the time series may be normalized or linearly scaled; in this research a linear scale transformation was applied, as recommended for financial time series by [7]. The linear transformation is defined as: z_t = lb + ((x_t − min(x)) / (max(x) − min(x)))·(ub − lb) (7), where: ub is the desired upper bound, in this case ub = 1; lb is the desired lower bound, in this case lb = ...
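Equation (7) is a standard min-max rescaling into [lb, ub]; a direct transcription:

```python
def linear_scale(x, lb=0.0, ub=1.0):
    """Linear scale transformation, eq. (7):
    z_t = lb + (x_t - min(x)) / (max(x) - min(x)) * (ub - lb)."""
    lo, hi = min(x), max(x)
    return [lb + (v - lo) / (hi - lo) * (ub - lb) for v in x]
```

With lb = 0 and ub = 1 (the values used in the text), every element lands in [0, 1], the range required by the sigmoid transfer functions of the network's nodes.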

1 | NN5 forecasting competition for artificial neural networks & computational Intelligence (2008), http://www.neural-corecasting-competition.com/ (last consulted at
- Crone
- 2009
Citation Context ...to compare performances when different time series are involved [17]. This is the official metric used by the "NN5 forecasting competition for artificial neural networks & computational Intelligence" [8]. SMAPE is defined as: SMAPE = (1/n)·Σ_{t=1..n} ( |x_t − x̂_t| / ((x_t + x̂_t)/2) )·(100%) (9). It is important to point out that SMAPE cannot be applied over time series with negative values. Another popula...
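Equation (9) can be transcribed directly; note the caveat from the context that it assumes strictly positive series, since a term is undefined whenever x_t + x̂_t is zero:

```python
def smape(x, x_hat):
    """Symmetric Mean Absolute Percentage Error, eq. (9). Assumes positive
    series: each term divides by the pairwise mean (x_t + x_hat_t) / 2."""
    n = len(x)
    return 100.0 * sum(abs(b - a) / ((a + b) / 2.0)
                       for a, b in zip(x, x_hat)) / n
```

Because each error is divided by the local level of the series, SMAPE is scale-independent, which is what makes it usable as the single official metric across the NN5 competition's many series.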

1 | NN5 forecasting competition results (2009), http://www.neuralforecasting-competition.com/NN5/results.htm (last consulted at
- Crone
- 2009
Citation Context ...with respect to the other two models. The last 56 values of the series were used as a validation set in order to compare the performance of this architecture with respect to the competition results published by [9]. Table 2 shows the results obtained using recursive prediction of 56 values (validation set) by the 12 experiments over series sumsim(); the metric MAPE is not shown because it is not valid for negat...

1 | Arquitectura Neuronal Apoyada en Señales Reconstruidas con Wavelets para Predicción de Series de Tiempo Caóticas (A neural architecture supported by wavelet-reconstructed signals for chaotic time series prediction)
- García-Pedrero
- 2009

1 | Chaotic time series approximation using iterative wavelet-networks
- García-Treviño, Alarcon-Aquino
Citation Context ...ed to train such networks. As a way to tackle these problems, complex architectures with a reduced number of connections, better learning abilities and special training strategies have been developed [13]; examples of such works are found at [2,3,4,10,11,13,15,25,26,29,30,31] among others. From the vast number of strategies used to improve the long-term prediction ability of neural networks, Wavele...

1 | Term Prediction, Chaos and Artificial Neural Networks. Where is the meeting point - Gomez-Gil

1 | Multiscale BiLinear Recurrent Neural Networks and Their Application to the Long-Term Prediction of Network Traffic
- Dong-Chul, Chung, et al.
- 2006
Citation Context ...long-term prediction ability of neural networks, Wavelet Theory is used either to modify neuron architectures (for example [2,6,12,31,33]) or as a pre-processing aid applied to training data (for example [11,26,28,29]). When wavelet theory is used to modify the neuron architecture, normally it is done using a wavelet function as the activation functi...

1 | Artificial intelligence in forecasting demands for electricity: an application in optimization of energy resources. Revista Colombiana de Tecnologías de Avanzada 2(12
- Sarmiento, Villa
- 2008

1 | Time series prediction and the wavelet transform - Soltani, Canu, et al. - 1998