## Nonparametric Neural Network Estimation of Lyapunov Exponents and a Direct Test for Chaos (2000)

Citations: 10 (2 self)

### BibTeX

```bibtex
@MISC{Shintani00nonparametricneural,
  author = {Mototsugu Shintani and Oliver Linton},
  title  = {Nonparametric Neural Network Estimation of Lyapunov Exponents and a Direct Test for Chaos},
  year   = {2000}
}
```

### Abstract

This paper derives the asymptotic distribution of the nonparametric neural network estimator of the Lyapunov exponent in a noisy system. Positivity of the Lyapunov exponent is an operational definition of chaos. We introduce a statistical framework for testing the chaotic hypothesis based on the estimated Lyapunov exponents and a consistent variance estimator. A simulation study evaluating small-sample performance is reported. We also apply our procedures to daily stock return data. In most cases, the hypothesis of chaos in the stock return series is rejected at the 1% level, with the exception of some higher-power transformed absolute returns.
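The estimator behind the abstract — fit the conditional-mean map nonparametrically, then average the log absolute derivative along the sample — can be sketched in a few lines. This is a minimal illustration, not the paper's procedure: it uses a polynomial least-squares fit as a stand-in sieve for the single-hidden-layer neural network, and the deterministic logistic map in place of a noisy system.

```python
import numpy as np

# Simulate the deterministic skeleton X_t = a * X_{t-1} * (1 - X_{t-1}).
a = 4.0
T = 2000
x = np.empty(T)
x[0] = 0.3
for t in range(1, T):
    x[t] = a * x[t - 1] * (1.0 - x[t - 1])

# Stand-in sieve: polynomial least squares for the map theta
# (the paper fits a single-hidden-layer neural network instead).
coef = np.polyfit(x[:-1], x[1:], deg=5)
dcoef = np.polyder(coef)  # derivative D theta_hat

# Jacobian-based estimate: average of ln |D theta_hat(X_{t-1})|.
lam_hat = np.mean(np.log(np.abs(np.polyval(dcoef, x[:-1]))))
print(lam_hat)  # close to ln 2 ≈ 0.693, the map's true Lyapunov exponent
```

A positive estimate is the signature of chaos; the paper's contribution is the asymptotic distribution of this estimator, which turns the point estimate into a formal test.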

### Citations

811 | Heteroskedasticity and Autocorrelation Consistent Covariance Matrix Estimation - Andrews - 1991

Citation Context: ...estimation of $\Phi$. Since the $\eta_t$ are serially dependent and not identically distributed, we need to employ a heteroskedasticity and autocorrelation consistent (HAC) covariance matrix estimator (see, e.g., Andrews, 1991) for $\Phi$. For the one-dimensional case, the covariance estimator $\hat{\Phi}$ is defined as follows: $\hat{\Phi} = \sum_{j=-M+1}^{M-1} w(j/S_M)\,\hat{\gamma}(j)$ and $\hat{\gamma}(j) = \frac{1}{M}\sum_{t=|j|+1}^{M} \hat{\eta}_t \hat{\eta}_{t-|j|}$, where $\hat{\eta}_t = \ln\left|D\hat{\theta}(X_{t-1})\right| \ldots$
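The kernel-weighted autocovariance sum in this excerpt is straightforward to implement. A minimal sketch, assuming a scalar series, using the Bartlett kernel for brevity (the paper's empirical work uses the QS kernel); `hac_variance` is an illustrative name, not the paper's code.

```python
import numpy as np

def hac_variance(eta, bandwidth):
    """Kernel-weighted sum of sample autocovariances of a scalar series.

    Bartlett weights are used here for brevity; the paper employs
    the QS kernel instead.
    """
    e = np.asarray(eta, dtype=float)
    m = e.shape[0]
    e = e - e.mean()
    phi = np.dot(e, e) / m                    # gamma_hat(0)
    for j in range(1, m):
        w = 1.0 - j / (bandwidth + 1.0)       # Bartlett kernel weight
        if w <= 0.0:
            break
        gamma_j = np.dot(e[j:], e[:-j]) / m   # gamma_hat(j)
        phi += 2.0 * w * gamma_j              # gamma_hat(j) = gamma_hat(-j)
    return phi

# Sanity check on i.i.d. data, where the long-run variance equals the variance.
rng = np.random.default_rng(1)
print(hac_variance(rng.standard_normal(5000), bandwidth=10))  # near 1.0
```

The estimate feeds into the standard error of the Lyapunov exponent estimator, which is what makes the direct test for chaos operational.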

397 | Universal Approximation Bounds for Superpositions of a Sigmoidal Function - Barron - 1993 |

397 | Ergodic theory of chaos and strange attractors - Eckmann, Ruelle - 1985 |

394 | A long memory property of stock market returns and a new model - Ding, Granger, et al. - 1993 |

381 | Modeling Financial Time Series - Taylor - 1986 |

220 | Heterogeneous beliefs and routes to chaos in a simple asset pricing model - Brock, Hommes - 1998 |

173 | Abstract Inference - Grenander - 1981 |

120 | Products of random matrices - Furstenberg, Kesten - 1960 |

94 | Artificial Neural Networks: An Econometric Perspective, Econometric Reviews - Kuan, White - 1994 |

82 | A Test for Independence Based on the Correlation Dimension - Brock, Dechert, et al. - 1996 |

74 | Convergence rate of sieve estimates - Shen, Wong - 1994 |

70 | On Learning the Derivatives of an Unknown Mapping With Multilayer Feedforward Networks, Neural Networks - Gallant, White - 1992 |


67 | Sieve extremum estimation for weakly dependent data - Chen, Shen - 1998

Citation Context: ...$\delta^{-2}$ criterion differences induced by $\theta \in \Theta_T$ ... $\int_{\delta^2}^{\delta} [H(\varepsilon, \mathcal{F}_T)]^{1/2}\, d\varepsilon \le \mathrm{const} \times n^{1/2}$, with $\delta = \max\left(\delta_T, \|\theta_0 - \pi_T\theta_0\|_{2,2}\right)$, a metric entropy with bracketing condition which controls the size of the space (see Chen and Shen, 1998, for the definition). Formally, the bracketing $L_2$ metric entropy of the space of measurable functions indexed by $\Theta_T$, given by $\mathcal{F}_T = \{h(\theta, z) = l(\theta, z) - l(\theta_0, z) : \theta \in \Theta_T\}$, is defined as follows. ...

54 | Finding chaos in noisy systems - Nychka, Ellner, et al. - 1992 |

50 | Lyapunov exponents from time series - Eckmann - 1986 |

49 | On Methods of Sieves and Penalization - Shen - 1997 |

44 | A single-blind controlled competition among tests for nonlinearity and chaos - Barnett, Gallant, et al. - 1997 |

42 | Estimating the Dimension of a Model, The Annals of Statistics 6(2) - Schwarz - 1978

Citation Context: ...Lyapunov exponent. For this subsection and the empirical part of this paper, the lag length ($d$) and the number of hidden units ($r$) will be jointly determined by minimizing the BIC criterion (Schwarz, 1978), defined by $\mathrm{BIC}(d, r) = \ln\hat{\sigma}^2 + [1 + r(d+2)]\frac{\ln T}{T}$, where $\hat{\sigma}^2 = T^{-1}\sum_{t=1}^{T}\bigl(X_t - \hat{\theta}(X_{t-1}, \ldots, X_{t-d})\bigr)^2$. For the HAC estimation required for the standard error, we employ the QS kernel w...
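The selection rule in this excerpt — minimize BIC jointly over the lag length d and the number of hidden units r — amounts to a small grid search. A sketch under stated assumptions: the residual variances below are made-up placeholders standing in for fitted networks, and `bic` is a hypothetical helper implementing the displayed formula.

```python
import numpy as np

def bic(sigma2_hat, d, r, T):
    """BIC from the excerpt: ln(sigma2_hat) + [1 + r(d + 2)] * ln(T) / T.

    1 + r(d + 2) counts the free parameters of a single-hidden-layer
    network with d inputs and r hidden units.
    """
    return np.log(sigma2_hat) + (1 + r * (d + 2)) * np.log(T) / T

# Made-up residual variances standing in for fitted networks at each (d, r):
T = 500
sigma2 = {(1, 2): 0.80, (2, 2): 0.55, (2, 4): 0.54, (3, 4): 0.53}
best = min(sigma2, key=lambda k: bic(sigma2[k], k[0], k[1], T))
print(best)  # → (2, 2): the larger models' extra parameters are not worth it
```

The penalty term grows with both d and r, so BIC trades a lower residual variance against network size, exactly as in any parametric model-selection problem.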

34 | Degree of approximation results for feedforward networks approximating unknown mappings and their derivatives, Neural Computation 6 - Stinchcombe, White, et al. - 1994

Citation Context: ...approximation properties of the neural networks. For this reason, we employ the condition of Hornik et al. (1994) and allow a more general class of functions for our activation function. Assumption B2 (Hornik et al., 1994). The activation function $\psi$ is a possibly nonsigmoid function satisfying $\psi \in \mathcal{B}_\infty^2$ and is $k$-finite for some $k \ge 2$, namely, $0 < \int_{\mathbb{R}} |D^k \psi(u)|\, du < \infty$. We can replace the condition above by the stronger...

30 | Estimating the Lyapunov exponent of a chaotic system with nonparametric regression - McCaffrey, Ellner, et al. - 1992 |

29 | Improved Rates and Asymptotic Normality for Nonparametric Neural Network Estimators - Chen, White - 1999

Citation Context: ...$\sup_{z \in \mathcal{Z}} \bigl|\Delta\hat{\theta}(z) - \Delta\theta_0(z)\bigr| = O_p([T/\log T]^{-1/4})$. In order to obtain the improved rate for the derivative estimator, we introduce a Hölder condition on the activation function. Assumption B3 (Chen and White, 1999). For any $(a', b), (a_1', b_1) \in \mathbb{R}^d \times \mathbb{R}$, there exists an $\alpha \in (0, 1]$ associated with $\psi \in \mathcal{B}_\infty^3$ such that for all $z$ in the compact support $S$, $\|\psi_{a,b} - \psi_{a_1,b_1}\|_{\mathcal{B}_\infty^3} \le \mathrm{const} \times \bigl[(a - a_1)'(a - a_1)\ldots$

28 | Random approximants and neural networks - Makovoz - 1996 |

27 | An algorithm for the n Lyapunov exponents of an n-dimensional unknown dynamical system - Gencay, Dechert - 1992 |

26 | FUNFITS: Data analysis and statistical tools for estimating functions - Nychka, Bailey, et al. - 1996 |

19 | Convergence rates and data requirements for Jacobian-based estimates of Liapunov exponents from data, Physics Letters A 153 - Ellner, Gallant, et al. - 1991 |

17 | Lyapunov Exponents as a Nonparametric Diagnostic for Stability Analysis - Dechert, Gençay - 1992

Citation Context: ...important information related to the stability of the system, including the directions of divergence and contraction of trajectories (see Nychka et al., 1992) and the types of non-chaotic attractors (see Dechert and Gençay, 1992). In the following subsection, we introduce the single hidden layer networks to obtain the nonparametric estimates of (1) and (4). 2.2 Nonparametric neural network estimation. Let $\hat{\theta}(\cdot)$ be the estima...
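A single-hidden-layer network of the kind referred to in this excerpt has the form θ(x) = Σ_j β_j ψ(a_j x + b_j). The sketch below is a simplification, not the paper's estimator: hidden weights are drawn at random and only the output layer is fit by least squares, which keeps the example short while showing that the derivative Dθ needed for the Lyapunov exponent is available in closed form.

```python
import numpy as np

# Single-hidden-layer network: theta(x) = sum_j beta_j * tanh(a_j * x + b_j).
# Simplification (not the paper's estimator): hidden weights (a_j, b_j)
# are drawn at random and only beta is fit by least squares.
rng = np.random.default_rng(2)
r = 50                                   # number of hidden units
a = rng.normal(scale=3.0, size=r)
b = rng.normal(scale=3.0, size=r)

def hidden(x):
    return np.tanh(np.outer(x, a) + b)   # T x r matrix of activations

x = np.linspace(-1.0, 1.0, 400)
y = np.sin(3.0 * x)                      # smooth target map
beta, *_ = np.linalg.lstsq(hidden(x), y, rcond=None)

theta_hat = hidden(x) @ beta
# The derivative D theta_hat, needed for the Lyapunov exponent, is closed-form:
# D theta(x) = sum_j beta_j * a_j * (1 - tanh(a_j x + b_j)^2).
dtheta_hat = (1.0 - hidden(x) ** 2) @ (a * beta)

print(np.max(np.abs(theta_hat - y)))     # small approximation error
```

Because the derivative of the fitted network is available analytically, no numerical differentiation is needed when the estimated Jacobians enter the Lyapunov exponent formula.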

16 | Uncovering nonlinear structure in realtime stock market indexes - Abhyankar, Copeland, et al. - 1997 |

13 | Robustness of nonlinearity and chaos tests to measurement error, inference method, and sample size - Barnett, Gallant, et al. - 1995 |

13 | A statistical framework for testing chaotic dynamics via Lyapunov exponents - Gencay - 1996 |

12 | Semiparametric ARX Neural Network Models with an Application to Forecasting Inflation. Neural Networks 12 - Chen, Racine, et al. - 2001 |

11 | Estimating Lyapunov exponents with nonparametric regression - McCaffrey, Nychka, et al. - 1992 |

10 | Random Walks, Breaking Trend Functions, and the Chaotic Structure of the Velocity of Money - Serletis - 1995 |

8 | Local Lyapunov exponents: predictability depends on where you are. Nonlinear Dynamics and Economics, Kirman et al. Eds - Bailey - 1997 |

5 | An invariance principle for weakly dependent sequences of random variables, Annals of Probability 12 - Herrndorf - 1984 |

5 | Is there chaos in the world economy? A nonparametric test using consistent standard errors - Shintani, Linton - 2003 |

3 | Characterizing the degree of stability of non-linear dynamic models - Bask, Luna - 2002 |

3 | Convergence rate of sieve estimates, Annals of Statistics 22(2) - Shen, Wong - 1994

Citation Context: ...ult to the multidimensional case. For $d = 1$, we denote $Z = \mathcal{X}$ and our goal is to obtain the convergence rate for $\sup_{x \in \mathcal{X}} \bigl|D\hat{\theta}(x) - D\theta_0(x)\bigr|$. Note that the interpolation inequality (see Gabushin, 1967, and Shen and Wong, 1994) implies $\|g(x) - g_0(x)\|_\infty \le K\, \|g(x) - g_0(x)\|^{2(m-1)/2m}\, \|D^m g(x) - D^m g_0(x)\|^{1/2m}$, where $K$ is a fixed constant. Substituting $g(x) = D\hat{\theta}(x)$, $g_0(x) = D\theta_0(x)$, $m = 1$ yields $\|D\hat{\theta}(x) - D\theta_0(x)\|\ldots$

2 | Inequalities for the norms of a function and its derivatives in metric Lp, Matematicheskie Zametki 1(3) - Gabushin - 1967

Citation Context: ...then extend the result to the multidimensional case. For $d = 1$, we denote $Z = \mathcal{X}$ and our goal is to obtain the convergence rate for $\sup_{x \in \mathcal{X}} \bigl|D\hat{\theta}(x) - D\theta_0(x)\bigr|$. Note that the interpolation inequality (see Gabushin, 1967, and Shen and Wong, 1994) implies $\|g(x) - g_0(x)\|_\infty \le K\, \|g(x) - g_0(x)\|^{2(m-1)/2m}\, \|D^m g(x) - D^m g_0(x)\|^{1/2m}$, where $K$ is a fixed constant. Substituting $g(x) = D\hat{\theta}(x)$, $g_0(x) = D\theta_0(x)$, $m = 1$ yield...

2 | The identification of spurious Lyapunov exponents in Jacobian algorithms - Gençay, Dechert - 1996 |

2 | New resampling method to assess the accuracy of the maximal Lyapunov exponent estimation - Giannerini, Rosa - 2001 |

1 | Liapunov exponents from time series, Physical Review A 34 - Eckmann, Kamphorst, et al. - 1986 |

1 | Estimating Lyapunov exponents with nonparametric regression and convergence rates for feedforward single hidden layer networks - McCaffrey - 1991