## Evolving Predictors for Chaotic Time Series (1998)

Venue: Proc. SPIE: Application and Science of Computational Intelligence

Citations: 7 (0 self)

### BibTeX

```bibtex
@INPROCEEDINGS{Angeline98evolvingpredictors,
  author    = {Peter J. Angeline},
  title     = {Evolving Predictors for Chaotic Time Series},
  booktitle = {Proc. SPIE: Application and Science of Computational Intelligence},
  year      = {1998},
  pages     = {170--180}
}
```

### Abstract

Neural networks are a popular representation for inducing single-step predictors for chaotic time series. For complex time series, a large number of hidden units must often be used to reliably acquire appropriate predictors. This paper describes an evolutionary method that evolves a class of dynamic systems similar in form to neural networks but requiring fewer computational units. Results from experiments on two popular chaotic time series are described, and the method's performance is shown to compare favorably with that of larger neural networks.

Keywords: evolutionary computation, evolutionary programming, genetic programming, neural networks, chaotic time series prediction

1. INTRODUCTION: What were once thought to be random, unpredictable sequences in science, technology, and nature are now recognized as complex yet deterministic, and consequently predictable. Chaos, the science of non-linear systems, has provided new tools and understanding...

### Citations

661 | Evolutionary Computation: Toward a New Philosophy of Machine Intelligence - Fogel - 1995

Citation Context: "...tem of equations. Evolutionary programming is a form of evolutionary computation. Evolutionary computations can be succinctly described using the following equation:

x' = s(m(x), f(x))   (5)

where x is a vector of candidate solutions called the population, f is an evaluation function, termed the fitness function, that returns a vector of values corresponding to the fitness of each elemen..."
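The mutate-evaluate-select update quoted in this context can be sketched as a generic loop. This is a minimal illustration, not the paper's algorithm: the toy fitness, Gaussian mutation, and truncation selection below are all assumptions chosen for brevity.

```python
import random

def evolve(init_pop, fitness, mutate, select, generations=100):
    """Generic evolutionary loop: mutate the population, score it with
    the fitness function, then select survivors for the next generation."""
    x = list(init_pop)
    for _ in range(generations):
        offspring = [mutate(ind) for ind in x]      # m(x): variation
        pool = x + offspring
        scores = [fitness(ind) for ind in pool]     # f(x): evaluation
        x = select(pool, scores, len(init_pop))     # s(...): selection
    return x

# Toy usage: maximize -(v - 3)^2, i.e. evolve scalars toward v = 3.
random.seed(0)
pop = [random.uniform(-10, 10) for _ in range(20)]
final = evolve(
    pop,
    fitness=lambda v: -(v - 3.0) ** 2,
    mutate=lambda v: v + random.gauss(0, 0.5),
    select=lambda pool, s, n: [ind for ind, _ in
                               sorted(zip(pool, s),
                                      key=lambda p: p[1], reverse=True)[:n]],
)
```

Because `select` here keeps parents in the pool, the best solution never degrades across generations; the paper's evolutionary program uses its own operators in place of these placeholders.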

224 | An evolutionary algorithm that constructs recurrent neural networks - Angeline, Saunders, et al. - 1993

Citation Context: "...he described method to induce single-step predictors using comparatively few units. The first experiment modeled the familiar sunspot data while the

NMSE = (1 / (vN)) Σ_k ( x(T_k) − x̂(T_k) )²   (7)

[Figure: Sunspots vs. Time (Years)]

second investigated a chaotic time series previously used in a time series prediction competition. These time ser..."
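The score in equation (7) is a normalized mean squared error. A plain-Python sketch, assuming the standard definition (mean squared error divided by the variance of the actual series, so that always predicting the series mean scores exactly 1.0):

```python
def nmse(actual, predicted):
    """Normalized mean squared error: MSE of the predictions divided by
    the variance of the actual series (a mean predictor scores 1.0)."""
    n = len(actual)
    mean = sum(actual) / n
    variance = sum((a - mean) ** 2 for a in actual) / n
    mse = sum((a - p) ** 2 for a, p in zip(actual, predicted)) / n
    return mse / variance

# Sanity check: predicting the mean of the series yields NMSE == 1.
series = [1.0, 2.0, 3.0, 4.0]
print(nmse(series, [2.5] * 4))  # → 1.0
```

This normalization makes scores comparable across series of different scales, which is why it is a common yardstick in time series prediction competitions.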

133 | Genetic programming and emergent intelligence - Angeline - 1994

Citation Context: "...the year 1700, when observations began, and ending in 1993. The dashed line marks the division between the training and test sets.

Σ = { +, −, *, /, ℜ },  ι = { d1, d2, d4, d8 },  κ = { T1, T2, T3 },  θ = { T0 }   (8)

[Figure: training and test series plots]

...data and three times in the test data. Following Zhang et al.14, the even points from the original time series w..."
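Equation (8) lists the primitive sets: arithmetic functions plus constants (ℜ), delayed inputs d1, d2, d4, d8, and variables T0 through T3. A hypothetical sketch of how an expression over these primitives could be evaluated against a time series; the nested-tuple encoding and the protected division are illustrative assumptions, not the paper's representation:

```python
import operator

# Arithmetic primitives; division is "protected" so the tree never
# raises on a zero denominator (a common genetic programming convention).
OPS = {'+': operator.add, '-': operator.sub, '*': operator.mul,
       '/': lambda a, b: a / b if b else 1.0}

def evaluate(tree, series, t):
    """Evaluate a prefix expression tree at time t.
    tree is a float constant, ('d', k) for the input delayed k steps
    (series[t - k]), or (op, left, right) for an arithmetic node."""
    if isinstance(tree, (int, float)):
        return float(tree)
    if tree[0] == 'd':
        return series[t - tree[1]]
    return OPS[tree[0]](evaluate(tree[1], series, t),
                        evaluate(tree[2], series, t))

series = [1.0, 2.0, 3.0, 5.0, 8.0]
expr = ('+', ('d', 1), ('d', 2))   # predict x[t] from x[t-1] + x[t-2]
print(evaluate(expr, series, 4))   # → 8.0
```

An evolutionary program would mutate and recombine such trees, scoring each by its prediction error over the training window.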

59 | Fractal market analysis: applying chaos theory to investment and economics - Peters - 1994

Citation Context: "...ctions appropriate for the

α_i = σ( Σ_j ω_ji α_j + β_i )   (1)

σ(x) = 1 / (1 + e^(−x))   (2)

task at hand should minimize the number of nodes, hidden and input, required to adequately model a given function including chaotic time series. Evolutionary computations6, search and optimization..."

(Additional author information: Email: angeline@natural-selection.com, WWW: http://www.natural-selection.com)
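Equations (1) and (2) in this context describe the standard sigmoid unit: a weighted sum of incoming activations plus a bias, passed through the logistic function. A minimal sketch of those two formulas:

```python
import math

def sigmoid(x):
    """Equation (2): sigma(x) = 1 / (1 + e^(-x))."""
    return 1.0 / (1.0 + math.exp(-x))

def unit_activation(inputs, weights, bias):
    """Equation (1): alpha_i = sigma(sum_j w_ji * alpha_j + beta_i)."""
    return sigmoid(sum(w * a for w, a in zip(weights, inputs)) + bias)

print(sigmoid(0.0))  # → 0.5  (zero net input sits at the curve's midpoint)
```

The paper's point is that choosing activation functions suited to the task can shrink the number of such units a predictor needs.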

46 | Back Propagation, weight-elimination and Time Series Prediction - Weigend, Rumelhart, et al. - 1991 |

35 | Multilayer feedforward networks are universal approximators, Neural Networks 2 - Hornik, Stinchcombe, et al. - 1989

Citation Context: "...ith the definition of a feed-forward neural network. A recurrent MIPs net is a MIPs net where the parse tree associated with any variable can refer to any other variable. Formally:

Γ( T_i ) = Σ ∪ ι ∪ θ ∪ κ   (3)

which states that the language used to express how a variable is updated is the union of all defined language elements. This permits a variable's expression to reference the value of variables th..."
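Equation (3) can be read directly as a set union. A hypothetical sketch using the element sets from equation (8); the Python names are illustrative stand-ins for the paper's symbols:

```python
# Element sets of the MIPs-net expression language (equation 8).
SIGMA = {'+', '-', '*', '/', 'R'}   # functions and real constants
IOTA = {'d1', 'd2', 'd4', 'd8'}     # delayed inputs
KAPPA = {'T1', 'T2', 'T3'}          # other internal variables
THETA = {'T0'}                      # output variable

def gamma(_ti):
    """Equation (3): Gamma(T_i) = Sigma ∪ iota ∪ theta ∪ kappa.
    Every variable's update expression may use every language element,
    which is what makes the net recurrent."""
    return SIGMA | IOTA | THETA | KAPPA

# Any variable's expression may reference any other variable's value.
print('T3' in gamma('T0'))  # → True
```

The recurrence arises because `gamma` includes κ and θ for every variable, so update expressions can reference one another's previous values.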

4 | Ensemble encoding for time series forecasting with MLP networks - Aerrabotu, Tagliarini, et al. - 1997

Citation Context: "...or the laser test set and the difference between the predicted value and the actual value for each point in the test set. The evolved network has some difficulty predicting values around the catas...

[Figure 7: Neural network connectivity induced by the evolutionary program for predicting values in the laser time series. Circles denote nodes that apply the corresponding activation ...]"

3 | On design and evaluation of tapped-delay neural architectures - Svarer, Hansen, et al. - 1992 |

2 | Evolutionary Induction of Sparse Neural Trees - Zhang, Ohm, Mühlenbein - 1997

1 | Multiple interacting programs: A hybrid representation for evolving complex behavior (submitted) - Angeline - 1997

Citation Context: "...ten symbols or fewer. All expressions were limited to a maximum of 40 symbols. No restrictions on the organization of the symbols within an expression, other than syntactic correctness, were imposed.

[Figure 4: NMSE vs. Generations]

4. RESULTS: Figure 4a shows the results of the first experiment for evolving a MIPs net for ..."