Results 1 - 9 of 9
Parameter Space Structure of Continuous-Time Recurrent Neural Networks
, 2006
"... this article (see Figure 1). By transforming equation 2.1 to the output space defined by o #) and setting the time derivative to 0, we find that the SSIO curve of a neuron with selfweight w is given by I # 1 (o) w o. A single additive model neuron can exhibit either unistable or bista ..."
Abstract

Cited by 27 (3 self)
 Add to MetaCart
this article (see Figure 1). By transforming equation 2.1 to the output space defined by o = σ(y + θ) and setting the time derivative to 0, we find that the SSIO curve of a neuron with self-weight w is given by I⁺(o) = σ⁻¹(o) − w·o. A single additive model neuron can exhibit either unistable or bistable dynamics, depending on the strength of its self-weight and its net input (Cowan & Ermentrout, 1978). In a single CTRNN neuron, only unistable dynamics are possible when w < 4 (see Figure 1A). When w > 4, bistable dynamics occurs when I_L(w) < I⁺ < I_R(w) (see Figure 1B). Figure 1: Representative steady-state input-output (SSIO) diagrams of a single CTRNN for (A) w = 2 and (B) w = 8. The solid line shows the output space location of the neuron's equilibrium points as a function of the net input I⁺. Note that the SSIO becomes folded for w > 4, indicating the existence of three equilibrium points. When the SSIO is folded, the left and right edges of the fold are given by I_L(w) and I_R(w), respectively (black points in B). The ranges of synaptic inputs received from other neurons are indicated by gray rectangles. The lower (I_min) and upper (I_max) limits of this range play an important role in the analysis described in this article. In both plots, two synaptic input ranges are shown: one for which the neuron is saturated off (left rectangle) and one for which the neuron is saturated on (right rectangle). The dashed line in A shows the piecewise-linear SSIO approximation used in section 4.2, which suggests using the intersections of the linear pieces (black points) as the analog of the fold edges in part B
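The SSIO relation described in this abstract can be sketched numerically. A minimal Python sketch, assuming the standard logistic σ(x) = 1/(1 + e^{-x}) and the curve I(o) = σ⁻¹(o) − w·o as reconstructed above (the function names here are illustrative, not from the paper's code):

```python
import math

def sigma_inv(o):
    # inverse logistic: if sigma(x) = 1/(1 + exp(-x)), then sigma_inv(o) = ln(o/(1-o))
    return math.log(o / (1.0 - o))

def ssio(o, w):
    # steady-state net input required to hold output o for self-weight w
    return sigma_inv(o) - w * o

def fold_edges(w):
    # dI/do = 1/(o*(1-o)) - w = 0  =>  o = (1 +/- sqrt(1 - 4/w)) / 2,
    # real only for w > 4: below that the SSIO is monotone (unistable)
    if w <= 4.0:
        return None
    r = math.sqrt(1.0 - 4.0 / w)
    o_lo, o_hi = (1.0 - r) / 2.0, (1.0 + r) / 2.0
    i_a, i_b = ssio(o_lo, w), ssio(o_hi, w)
    return min(i_a, i_b), max(i_a, i_b)  # (I_L(w), I_R(w))
```

For w = 8 this yields a folded SSIO with two fold edges, matching panel B of the figure; for w = 2 there is no fold, matching panel A.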
Incremental Training of First Order Recurrent Neural Networks to Predict a Context-Sensitive Language
, 2003
"... In recent years it has been shown that first order recurrent neural networks trained by gradientdescent can learn not only regular but also simple contextfree and contextsensitive languages. However, the success rate was generally low and severe instability issues were encountered. The present st ..."
Abstract

Cited by 9 (2 self)
 Add to MetaCart
In recent years it has been shown that first order recurrent neural networks trained by gradient descent can learn not only regular but also simple context-free and context-sensitive languages. However, the success rate was generally low and severe instability issues were encountered. The present study examines the hypothesis that a combination of evolutionary hill climbing with incremental learning and a well-balanced training set enables first order recurrent networks to reliably learn context-free and mildly context-sensitive languages. In particular, we trained the networks to predict symbols in string sequences of the context-sensitive language {a^n b^n c^n | n ≥ 1}. Comparative experiments with and without incremental learning indicated that incremental learning can accelerate and facilitate training. Furthermore, incrementally trained networks generally resulted in monotonic trajectories in hidden unit activation space, while the trajectories of non-incrementally trained networks were oscillating. The non-incrementally trained networks were more likely to generalise.
Mathematical Aspects of Neural Networks
 European Symposium on Artificial Neural Networks 2003
, 2003
"... In this tutorial paper about mathematical aspects of neural networks, we will focus on two directions: on the one hand, we will motivate standard mathematical questions and well studied theory of classical neural models used in machine learning. On the other hand, we collect some recent theoretic ..."
Abstract

Cited by 6 (4 self)
 Add to MetaCart
In this tutorial paper about mathematical aspects of neural networks, we will focus on two directions: on the one hand, we will motivate standard mathematical questions and well studied theory of classical neural models used in machine learning. On the other hand, we collect some recent theoretical results (as of the beginning of 2003) in the respective areas. Thereby, we follow the dichotomy offered by the overall network structure and restrict ourselves to feedforward networks, recurrent networks, and self-organizing neural systems, respectively.
Tutorial: Perspectives on Learning with RNNs
 in: Proc. ESANN, 2002
"... We present an overview of current lines of research on learning with recurrent neural networks (RNNs). Topics covered are: understanding and unification of algorithms, theoretical foundations, new efforts to circumvent gradient vanishing, new architectures, and fusion with other learning methods ..."
Abstract

Cited by 5 (1 self)
 Add to MetaCart
We present an overview of current lines of research on learning with recurrent neural networks (RNNs). Topics covered are: understanding and unification of algorithms, theoretical foundations, new efforts to circumvent gradient vanishing, new architectures, and fusion with other learning methods and dynamical systems theory. The structuring guideline is to understand many new approaches as different efforts to regularize and thereby improve recurrent learning.
Input Space Bifurcation Manifolds of RNNs
"... Abstract. We derive analytical expressions of local codim1bifurcations for a fully connected, additive, discretetime RNN, where we regard the external inputs as bifurcation parameters. The complexity of the bifurcation diagrams obtained increases exponentially with the number of neurons. We show ..."
Abstract

Cited by 1 (1 self)
 Add to MetaCart
Abstract. We derive analytical expressions of local codim-1 bifurcations for a fully connected, additive, discrete-time RNN, where we regard the external inputs as bifurcation parameters. The complexity of the bifurcation diagrams obtained increases exponentially with the number of neurons. We show that a three-neuron cascaded network can serve as a universal oscillator, whose amplitude and frequency can be completely controlled by input parameters.
Stability of quasiperiodic orbit in Discrete Recurrent Neural Network
"... Abstract: A simple discrete recurrent neural network model is considered. The local stability is analyzed with the associated characteristic model. In order to study the quasiperiodic orbit dynamic behavior, it is necessary to determinate the NeimarkSacker bifurcation. In the case of two neurons, ..."
Abstract
 Add to MetaCart
Abstract: A simple discrete recurrent neural network model is considered. The local stability is analyzed with the associated characteristic model. In order to study the quasiperiodic-orbit dynamic behavior, it is necessary to determine the Neimark-Sacker bifurcation. In the case of two neurons, one necessary condition that produces the Neimark-Sacker bifurcation is found. In addition, the stability and direction of the Neimark-Sacker bifurcation are determined by applying normal form theory and the center manifold theorem. An example is given, and numerical simulations are performed to illustrate the obtained results. Phase-locking is analyzed, with experimental results on Arnold tongues given for particular weight configurations.
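The Neimark-Sacker mechanism this abstract refers to, a complex-conjugate eigenvalue pair of the fixed-point Jacobian crossing the unit circle, can be illustrated on a toy two-neuron map. This is a generic sketch under assumed model details (a tanh map with a scaled rotation as weight matrix), not the paper's specific model or condition:

```python
import math

def jacobian_eigs(s, theta):
    """Eigenvalues of the Jacobian at the origin for the two-neuron discrete
    map x_{t+1} = tanh(W x_t) with W = s * rotation(theta). Since tanh'(0) = 1,
    the Jacobian at the fixed point x = 0 is W itself, and the eigenvalues
    of [[a, -b], [b, a]] are a +/- i*b."""
    a, b = s * math.cos(theta), s * math.sin(theta)
    return complex(a, b), complex(a, -b)

def neimark_sacker_candidate(s, theta, tol=1e-9):
    # Neimark-Sacker requires a complex pair ON the unit circle:
    # |lambda| = 1 with nonzero imaginary part (otherwise fold/flip)
    lam, _ = jacobian_eigs(s, theta)
    return abs(abs(lam) - 1.0) < tol and abs(lam.imag) > tol
```

With this weight structure the eigenvalue modulus is simply s, so the bifurcation surface in parameter space is s = 1 (for theta not a multiple of pi), beyond which a quasiperiodic orbit can emerge.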
Input Space Bifurcation Manifolds of Recurrent Neural Networks
"... We derive analytical expressions of local codimension1 bifurcations for a fully connected, additive, discretetime recurrent neural network (RNN), where we regard the external inputs as bifurcation parameters. The complexity of the bifurcation diagrams obtained increases exponentially with the numb ..."
Abstract
 Add to MetaCart
We derive analytical expressions of local codimension-1 bifurcations for a fully connected, additive, discrete-time recurrent neural network (RNN), where we regard the external inputs as bifurcation parameters. The complexity of the bifurcation diagrams obtained increases exponentially with the number of neurons. We show that a three-neuron cascaded network can serve as a universal oscillator, whose amplitude and frequency can be completely controlled by input parameters. Key words: bifurcation manifolds, input space, dynamics, recurrent neural network
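For intuition on treating external inputs as bifurcation parameters, the simplest instance of an additive discrete-time model is a single neuron x_{t+1} = tanh(w·x_t + I). A saddle-node (fold), one of the codimension-1 bifurcations, occurs where the map is tangent to the identity. A hedged sketch under this one-neuron simplification (the paper's analytical expressions cover the fully connected multi-neuron case):

```python
import math

def fold_inputs(w):
    """Input values I at which x_{t+1} = tanh(w*x + I) has a saddle-node
    bifurcation: tanh(w*x + I) = x (fixed point) together with
    w * (1 - x**2) = 1 (unit slope), giving x = +/- sqrt(1 - 1/w)."""
    if w <= 1.0:
        return None  # slope of the map never reaches 1: no fold
    x = math.sqrt(1.0 - 1.0 / w)
    i = math.atanh(x) - w * x  # from w*x + I = atanh(x)
    return (i, -i)  # the two fold points form a symmetric pair in input space
```

Between the two fold inputs the neuron has three fixed points (bistable); outside them it has one, so these two points are the input-space bifurcation manifold of the one-neuron case.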
Novel Recurrent Neural Network Weight Initialization Strategy
"... Abstract — This paper proposes a weight initialization strategy for a discretetime recurrent neural network model. It is based on analyzing the recurrent network as a nonlinear system, and choosing its initial weights to put this system in the boundaries between different dynamics, i.e., its bifurc ..."
Abstract
 Add to MetaCart
Abstract — This paper proposes a weight initialization strategy for a discrete-time recurrent neural network model. It is based on analyzing the recurrent network as a nonlinear system, and choosing its initial weights to put this system at the boundaries between different dynamics, i.e., its bifurcations. The relationship between the change in dynamics and training error evolution is studied. Two simple examples of the application of this strategy are shown: the detection of a 2-pulse temporal pattern and the detection of a physiological signal, a feature of a visual evoked potential brain signal. Index Terms—DTRNN, nonlinear system, training, bifurcation.
Patterns, Memory and Periodicity in Two-Neuron Delayed Recurrent Inhibitory Loops
"... Abstract. We study the coexistence of multiple periodic solutions for an analogue of the integrateandfire neuron model of twoneuron recurrent inhibitory loops with delayed feedback, which incorporates the firing process and absolute refractory period. Upon receiving an excitatory signal from the ..."
Abstract
 Add to MetaCart
Abstract. We study the coexistence of multiple periodic solutions for an analogue of the integrate-and-fire neuron model of two-neuron recurrent inhibitory loops with delayed feedback, which incorporates the firing process and absolute refractory period. Upon receiving an excitatory signal from the excitatory neuron, the inhibitory neuron emits a spike with a pattern-related delay, in addition to the synaptic delay. We present a theoretical framework to view the inhibitory signal from the inhibitory neuron as a self-feedback of the excitatory neuron with this additional delay. Our analysis shows that inhibitory feedback with firing and the absolute refractory period can generate four basic types of oscillations, and the complicated interaction among these basic oscillations leads to a large class of periodic patterns and the occurrence of multistability in the recurrent inhibitory loop. We also introduce the average time of convergence to a periodic pattern to determine which periodic patterns have the potential to be used for neural information transmission and cognitive processing in the nervous system.