Results 1–10 of 12
Chaos and Nonlinear Forecastability in Economics and Finance
 Philosophical Transactions of the Royal Society of London
, 1994
"... Both academic and applied researchers studying nancial markets and other economic series have become interested in the topic of chaotic dynamics. The possibility ofchaos in nancial markets opens important questions for both economic theorists as well as nancial market participants. This paper will c ..."
Abstract

Cited by 9 (1 self)
Both academic and applied researchers studying financial markets and other economic series have become interested in the topic of chaotic dynamics. The possibility of chaos in financial markets opens important questions for both economic theorists and financial market participants. This paper will clarify the empirical evidence for chaos in financial markets and macroeconomic series. It will also compare these two concepts from a financial market perspective, contrasting the objectives of the practitioner with those of economic researchers. Finally, the paper will speculate on the impact of chaos and nonlinear modeling on future economic research. The author is grateful to the Alfred P. Sloan Foundation and the University of Wisconsin Graduate School for support. It has now been almost ten years since economists began searching for chaotic dynamics in economic time series. This search has yielded deeper understandings of the dynamics of many different series, and has led to the development of several useful tests for nonlinear structure. However, the direct evidence for deterministic chaos in many economic series remains weak. This paper will survey the ...
Existence and Learning of Oscillations in Recurrent Neural Networks
, 1999
"... In this paper we study a particular class of nnode recurrent neural networks (RNNs). In the 3node case we use monotone dynamical systems theory to show, for a welldefined set of parameters, that, generically, every orbit of the RNN is asymptotic to a periodic orbit. Then, within the usual `learni ..."
Abstract

Cited by 8 (0 self)
In this paper we study a particular class of n-node recurrent neural networks (RNNs). In the 3-node case we use monotone dynamical systems theory to show, for a well-defined set of parameters, that, generically, every orbit of the RNN is asymptotic to a periodic orbit. Then, within the usual 'learning' context of Neural Networks, we investigate whether RNNs of this class can adapt their internal parameters so as to 'learn' and then replicate autonomously certain external periodic signals. Our learning algorithm is similar to identification algorithms in adaptive control theory. The main feature of the adaptation algorithm is that global exponential convergence of parameters is guaranteed. We also obtain partial convergence results in the n-node case.
Synchronous Chaos in High-dimensional Modular Neural Networks
 Int. J. Bifurcat. Chaos
, 1996
"... The relationship between certain types of highdimensional neural networks and lowdimensional prototypical equations (neuromodules) is investigated. The highdimensional systems consist of nitely many pools containing identical, dissipative and nonlinear singleunits operating in discrete time. ..."
Abstract

Cited by 5 (5 self)
The relationship between certain types of high-dimensional neural networks and low-dimensional prototypical equations (neuromodules) is investigated. The high-dimensional systems consist of finitely many pools containing identical, dissipative and nonlinear single units operating in discrete time. Under the assumption of random connections inside and between pools, the system can be reduced to a set of only a few equations, which, asymptotically in time and system size, describe the behavior of every single unit arbitrarily well. This result can be viewed as synchronization of the single units in each pool. It is stated as a theorem on systems of nonlinear coupled maps, which gives explicit conditions on the single-unit dynamics and the nature of the random connections. As an application we compare a 2-pool network with the corresponding 2-dimensional dynamics. The bifurcation diagrams of both systems become very similar even for moderate system size (N=50) and large disorder ...
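The pool-reduction idea described in this abstract can be illustrated with a minimal sketch. It assumes tanh single-unit dynamics and Gaussian random couplings whose block means define the reduced 2-dimensional map; the paper's exact model, noise level and coupling values may differ, and the numbers below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 50                          # units per pool (moderate size, as in the example above)
w = np.array([[1.2, -0.8],      # hypothetical mean coupling between the two pools
              [0.9,  0.5]])

# Random block (p, q) couples pool q into pool p: mean w[p, q] / N plus noise,
# so each row sum concentrates around w[p, q] as N grows.
J = {(p, q): w[p, q] / N + rng.normal(0.0, 0.1 / np.sqrt(N), (N, N))
     for p in range(2) for q in range(2)}

x = rng.uniform(-1.0, 1.0, (2, N))   # high-dimensional state: 2 pools of N units
m = x.mean(axis=1)                   # reduced 2-dimensional neuromodule state

for _ in range(300):
    # full network: every unit takes tanh of its summed random inputs
    x = np.tanh(np.stack([J[(p, 0)] @ x[0] + J[(p, 1)] @ x[1] for p in range(2)]))
    # reduced map driven by the two pool means only
    m = np.tanh(w @ m)

print("intra-pool spread:", x.std(axis=1))     # small => units synchronize per pool
print("pool means:", x.mean(axis=1), "reduced state:", m)
```

After transients, the spread of states inside each pool stays small relative to the pool mean, which is the synchronization effect the theorem formalizes.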
Routes To Chaos In Neural Networks With Random Weights
, 1998
"... this paper we use words like "typical" or "on average" with respect to the space of neural networks we are considering. ..."
Abstract

Cited by 3 (1 self)
In this paper we use words like "typical" or "on average" with respect to the space of neural networks we are considering.
Random Recurrent Neural Networks for Autonomous System Design
 SAB 2000,Paris
, 2000
"... In this article, we stress the need for using dynamical systems properties in autonomous architecture design. We first study the dynamics of random recurrent neural networks (RRNN). Such systems are known to spontaneously exhibits various dynamical regimes, as they always tries to remain on an ..."
Abstract

Cited by 3 (0 self)
In this article, we stress the need for using dynamical systems properties in autonomous architecture design. We first study the dynamics of random recurrent neural networks (RRNN). Such systems are known to spontaneously exhibit various dynamical regimes, as they always try to remain on an attractor, thus achieving stable dynamical behaviors. Second, we try to characterize the adaptive properties of such a system in an open environment, i.e. in a system which always interacts with external signals. Under these conditions, a change in the behavior corresponds to the switch from one attractor to another one. Such bifurcations occur for very small changes in the environment signal; our system is thus unstable on its inputs. We propose a local Hebbian learning rule which tends to stabilize the response of the system for given inputs. After training, the system is able to perform recognition, i.e. to produce a specific regular cyclic attractor while the learned input ...
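A local Hebbian rule on a random recurrent network can be sketched as follows. The network form, gain, learning rate and the weight-decay term are all our assumptions (the paper's exact rule may differ); the decay is added only to keep the weights bounded during training:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 40
g = 1.5                                  # gain g > 1: random nets sit in a rich dynamical regime
W = rng.normal(0.0, g / np.sqrt(N), (N, N))
w_in = rng.normal(0.0, 1.0, N)           # input weights (hypothetical)
eta = 0.005                              # Hebbian learning rate (assumed)
u = 0.5                                  # external input pattern to be learned

x = np.zeros(N)
for _ in range(1000):                    # training along the orbit
    x_new = np.tanh(W @ x + w_in * u)
    # local Hebbian update: strengthen connections between co-active units;
    # the -0.5 * W decay (our addition) bounds the weights
    W += eta * (np.outer(x_new, x) - 0.5 * W)
    x = x_new

# after training, replay the input and observe the settled response
for _ in range(200):
    x = np.tanh(W @ x + w_in * u)
print("response norm:", np.linalg.norm(x))
```

With the decay term the recurrent weights are driven toward a low-rank structure shaped by the trained activity pattern, so the driven response settles rather than wandering between attractors.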
Probability of a local bifurcation type from a fixed point: A random matrix perspective. Submitted: http://arxiv.org/abs/nlin.CD/0510060
"... Results regarding probable bifurcations from fixed points are presented in the context of general dynamical systems (real, random matrices), timedelay dynamical systems (companion matrices), and a set of mappings known for their properties as universal approximators (neural networks). The eigenvalu ..."
Abstract

Cited by 2 (0 self)
Results regarding probable bifurcations from fixed points are presented in the context of general dynamical systems (real, random matrices), time-delay dynamical systems (companion matrices), and a set of mappings known for their properties as universal approximators (neural networks). The eigenvalue spectrum is considered both numerically and analytically using previous work of Edelman et al. Based upon the numerical evidence, various conjectures are presented. The conclusion is that in many circumstances, most bifurcations from fixed points of large dynamical systems will be due to complex eigenvalues. Nevertheless, surprising situations are presented for which the aforementioned conclusion does not hold, e.g., real random matrices with Gaussian elements with a large positive mean and finite variance.
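The numerical side of this question is easy to reproduce in a small sketch: estimate how often the largest-modulus eigenvalue of a real Gaussian matrix is complex. The matrix ensemble and sample sizes here are illustrative choices, not the paper's exact experiment:

```python
import numpy as np

rng = np.random.default_rng(2)

def dominant_is_complex_fraction(n, mean=0.0, trials=500):
    """Fraction of n-by-n real Gaussian matrices whose largest-modulus
    eigenvalue lies off the real axis (i.e. belongs to a conjugate pair)."""
    hits = 0
    for _ in range(trials):
        A = rng.normal(mean, 1.0, (n, n))
        lam = np.linalg.eigvals(A)
        top = lam[np.argmax(np.abs(lam))]
        hits += abs(top.imag) > 1e-9
    return hits / trials

for n in (2, 5, 20):
    print(n, dominant_is_complex_fraction(n))
```

For n = 2 the answer is known in closed form: a 2-by-2 standard Gaussian matrix has complex eigenvalues with probability 1 − 1/√2 ≈ 0.29, which the Monte Carlo estimate should approach.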
On the Probability of Chaos in Large Dynamical Systems: A Monte Carlo Study
"... In this paper we report the result of a Monte Carlo study on the probability of chaos in large dynamical systems. We use neural networks as the basis functions for the system dynamics and choose parameter values for the networks randomly. Our results show that as the dimension of the system and the ..."
Abstract

Cited by 1 (1 self)
In this paper we report the result of a Monte Carlo study on the probability of chaos in large dynamical systems. We use neural networks as the basis functions for the system dynamics and choose parameter values for the networks randomly. Our results show that as the dimension of the system and the complexity of the network increase, the probability of chaotic dynamics increases to 100%. Since neural networks are dense in the set of dynamical systems, our conclusion is that most large systems are chaotic.
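A study in this spirit can be sketched as follows: draw single-hidden-layer tanh networks at random, iterate each as a map, and estimate the largest Lyapunov exponent by renormalizing a tangent vector via Jacobian-vector products. The network form, weight scalings and sample sizes are our assumptions, not the paper's exact protocol:

```python
import numpy as np

rng = np.random.default_rng(3)

def largest_lyapunov(W1, b, W2, x0, steps=1500, burn=200):
    """Largest Lyapunov exponent of the map x -> W2 @ tanh(W1 @ x + b),
    estimated by renormalizing a tangent vector along the orbit."""
    x, v = x0, rng.normal(size=x0.shape)
    v /= np.linalg.norm(v)
    total = 0.0
    for t in range(steps):
        u = W1 @ x + b
        th = np.tanh(u)
        x = W2 @ th
        v = W2 @ ((1.0 - th ** 2) * (W1 @ v))   # Jacobian-vector product at old x
        norm = np.linalg.norm(v)
        if norm == 0.0:
            return -np.inf
        v /= norm
        if t >= burn:                            # skip transients
            total += np.log(norm)
    return total / (steps - burn)

def chaos_fraction(dim, hidden, trials=30, scale=2.5):
    """Fraction of randomly drawn networks with a positive largest exponent."""
    chaotic = 0
    for _ in range(trials):
        W1 = rng.normal(0.0, scale / np.sqrt(dim), (hidden, dim))
        b = rng.normal(0.0, 1.0, hidden)
        W2 = rng.normal(0.0, 1.0 / np.sqrt(hidden), (dim, hidden))
        lam = largest_lyapunov(W1, b, W2, rng.normal(size=dim))
        chaotic += lam > 0.0
    return chaotic / trials

for dim in (2, 8, 32):
    print(dim, chaos_fraction(dim, hidden=4 * dim))
```

The exponent estimator can be sanity-checked on a contracting special case: with W1 = I, b = 0, W2 = 0.5 I the map is x → 0.5 tanh(x), whose exponent is at most log 0.5 < 0.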
On the probability of chaos in large dynamical systems: A Monte Carlo study
"... In this paper we report the result of a Monte Carlo study on the probability of chaos in large dynamical systems. We use neural networks as the basis functions for the system dynamics and choose parameter values for the networks randomly. Our results show that as the dimension of the system and the ..."
Abstract
In this paper we report the result of a Monte Carlo study on the probability of chaos in large dynamical systems. We use neural networks as the basis functions for the system dynamics and choose parameter values for the networks randomly. Our results show that as the dimension of the system and the complexity of the network increase, the probability of chaotic dynamics increases to 100%. Since neural networks are dense in the set of dynamical systems, our conclusion is that most large systems are chaotic. © 1999 Elsevier Science B.V. All rights reserved.
Keywords: Lyapunov exponents; Frequency of chaos; Neural networks
1. Introduction
In Brock (1993) there is an argument that the larger the dimension of a nonlinear dynamical system, the larger the probability that the system dynamics have a positive Lyapunov exponent. (Clearly this argument can only hold in a probabilistically generic sense.) From this the conclusion is drawn that it is not absurd to expect that the chances are ...