## Are neuronal networks that vicious? Or only their models? (Neural Computation, 2007)

Citations: 2 (1 self)

### BibTeX

@MISC{Cessac07areneuronal,
  author = {B. Cessac and T. Viéville},
  title = {Are neuronal networks that vicious? Or only their models? Neural Computation},
  year = {2007}
}


### Abstract

We present a mathematical analysis of a network of integrate and fire neurons, taking into account the realistic fact that the spike time is only known within some finite precision. This leads us to propose a model where spikes are effective at times that are multiples of a characteristic time scale δ, where δ can be mathematically arbitrarily small. We give a complete mathematical characterization of the model dynamics for conductance based integrate and fire models and obtain the following results. The asymptotic dynamics is composed of finitely many periodic orbits, whose number and period can be arbitrarily large and diverge in a region of the parameter space traditionally called the “edge of chaos”, a notion mathematically well defined in the present paper. Furthermore, except at the edge of chaos, there is a one-to-one correspondence between the membrane potential trajectories and the spike raster plots. This shows that the neural code is entirely “in the spikes” in this case. As a key tool, we introduce an order parameter, easy to compute numerically and closely related to a natural notion of entropy, which provides a relevant characterization of the computational capabilities of the network. This allows us to compare the computational capabilities of leaky integrate and fire models and conductance based models. The present study considers networks with constant input and without time-dependent plasticity, but the framework has been designed for both extensions.
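
To make the class of models concrete, here is a minimal illustrative sketch (not the authors' code) of a discrete-time integrate and fire network of the kind analyzed. The update rule follows the BMS form quoted in the citation contexts below, V_k(t+1) = γ V_k(t)[1 − Z(V_k(t))] + Σ_j W_kj Z(V_j(t)) + i_k^(ext); all parameter values, and the function name `simulate`, are arbitrary choices for illustration.

```python
import numpy as np

def simulate(W, I_ext, gamma=0.9, theta=1.0, T=200, rng=None):
    """Simulate a discrete-time leaky integrate-and-fire network.

    Z(V) = 1 if V >= theta (the neuron fires and is reset to 0), else 0.
    Returns the spike raster plot: a (T, N) array of 0/1 firing indicators.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    N = W.shape[0]
    V = rng.uniform(0.0, theta, size=N)   # random initial membrane potentials
    raster = np.zeros((T, N), dtype=int)
    for t in range(T):
        Z = (V >= theta).astype(float)    # firing indicator
        raster[t] = Z
        # leak (with reset to 0 on firing) + synaptic input + constant external input
        V = gamma * V * (1.0 - Z) + W @ Z + I_ext
    return raster

N = 10
rng = np.random.default_rng(1)
# i.i.d. synaptic weights with variance scaling ~ sigma^2 / N, as in the paper's setting
W = rng.normal(0.0, 0.3 / np.sqrt(N), size=(N, N))
raster = simulate(W, I_ext=np.full(N, 0.5), rng=rng)
print(raster.shape)
```

Since the input is constant and the dynamics is deterministic, the raster plot produced this way is, per the abstract, eventually periodic away from the edge of chaos.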

### Citations

1036 |
A quantitative description of membrane current and its application to conduction and excitation in nerve
- Hodgkin, Huxley
Citation Context: ...of neuron k. Without loss of generality we normalize the quantities and fix C = 1. In its most general form, the neuron k’s membrane conductance g_k > 0 depends on V_k (see e.g. the Hodgkin-Huxley equations [18]) and on time t, while the current i_k can also depend on V, the membrane potential vector, and on time t. The current i_k can include various phenomenological terms. Note that (2) deals with neurons consider...

874 |
Introduction to the Modern Theory of Dynamical Systems
- Katok, Hasselblatt
- 1997
Citation Context: ...symbolic sequences are easier to handle than continuous variables in many respects, such as the computation of topological or measure-theoretic quantities like the topological or Kolmogorov-Sinai entropy [21]. A natural related question is whether there is a one-to-one correspondence between the membrane potential trajectory and the raster plot (see theorem 2). Note that in the deterministic models that w...

356 |
Metabolic stability and epigenesis in randomly constructed genetic nets
- Kauffman
- 1969
Citation Context: ...is neither chaotic nor ordered but somewhere in between order and chaos. This has led to the idea of computation at the edge of chaos. Early evidence for this hypothesis has been reported by Kauffman [22] and Langton [27] considering cellular automata behavior, and Packard [32] using a genetic algorithm. See [1] for a review. In relation with these works, theoretical results by Derrida and co-authors...

297 |
Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems
- Dayan, Abbott
- 2001
Citation Context: ...neuronal networks to keep essential biological features, but also sufficiently simplified to achieve a characterization of their dynamics, most often numerically and, when possible, mathematically [17, 9]. There is always a delicate compromise between closeness to biology and model simplification. At one extreme, one reproduces all known features of ionic channels, neurons and synapses, and loses the hope...

258 |
Spiking neuron models
- Gerstner, Kistler
- 2002
Citation Context: ...neuronal networks to keep essential biological features, but also sufficiently simplified to achieve a characterization of their dynamics, most often numerically and, when possible, mathematically [17, 9]. There is always a delicate compromise between closeness to biology and model simplification. At one extreme, one reproduces all known features of ionic channels, neurons and synapses, and loses the hope...

233 |
Biophysics of Computation: Information Processing in Single Neurons
- Koch
- 1999
Citation Context: ...a hard question, but we would like to mention that, in the cortex, neurons are often in a high-conductance state and fire at a rate higher than 1 kHz. Thus, the firing time is not larger than 1 ms [23]. This could imply a high probability of being close to the “threshold”, since a neuron that fires often has more chances to be close to the threshold. The corresponding situation in our model could be...

201 |
Spin Glass Theory and Beyond. World Scientific, Singapore, 1987
- Mezard, Parisi, et al.
Citation Context: ...equivalent to Kauffman’s cellular automaton [22]. It has been shown by Derrida and coworkers [11, 12] that Kauffman’s model has a structure similar to the Sherrington-Kirkpatrick spin-glass model [31]. The situation is even more complex when γ ≠ 0. It is likely that we have in fact a situation very similar to discrete-time neural networks with firing rates, where a similar analogy has been exhibit...

142 |
Life at the Edge of Chaos
- Langton
- 1992
Citation Context: ...horizon, and to the neural network’s capability of producing distinct spike trains. In other words, this is a way to measure the ability of the system to emulate different input-output functions. See [27, 1] for a discussion of the link between the system dynamics and its related computational complexity. The smaller d(Ω, S), the larger is the set of distinct spike trains that the neural network is a...

105 |
Which Model to Use for Cortical Spiking Neurons
- Izhikevich
Citation Context: ...powerful generalized Integrate and Fire model (gIF) where time is discretized. This model was quite efficient with respect to its algorithmic complexity and its ability to reproduce spike responses (see [20]) for individual neurons. In the present paper we pursue this discussion at the level of the network. Thanks to a rigorous analysis of the time discretization scales involved, we propose a discrete-time i...

78 |
The high-conductance state of neocortical neurons in vivo. Nat Rev Neurosci
- Destexhe, Rudolph, et al.
- 2003
Citation Context: ...-waving argument nevertheless suggests that a close relationship between the so-called “high-conductance” states and high effective entropy could exist. The relation between “high-conductance” states [13] and the presence of ghost orbits as defined in (2) is an open issue: on one hand, it is observed that neurons in a high-conductance state have their potential close to the threshold, thus always subject...

74 |
Random networks of automata: A simple annealed approximation
- Derrida, Pomeau
- 1986
Citation Context: ...and Langton [27] considering cellular automata behavior, and Packard [32] using a genetic algorithm. See [1] for a review. In relation with these works, theoretical results by Derrida and co-authors [12, 11] allow one to characterize analytically the dynamics of random Boolean networks and of networks of threshold elements [10]. Recently, [1] have contributed to this question, considering numerical experim...

60 |
Prevalence: a translation-invariant “almost every” on infinite-dimensional spaces
- Hunt, Sauer, et al.
- 1992
Citation Context: ...of the statistical parameters to be characteristic of the region of G, i^(ext) that the probabilities P_G, P_{i^(ext)} weight (more precisely, one expects to observe a “prevalent” behavior in the sense of [19]). Imposing such a probability distribution has several consequences. First, the membrane potentials become random variables whose law is induced by the distribution P_G P_{i^(ext)}. But this has another...

59 |
Recherches quantitatives sur l’excitation electrique des nerfs traitee comme une polarisation
- Lapicque
- 1907
Citation Context: ...all M = [V_min, V_max]^N. This is the phase space of our dynamical system. We are interested in a specific class of dynamical systems called “Integrate and Fire models” in the neural networks community [28]. 1.1 General structure of Integrate and Fire models. Integrate and Fire models always incorporate two regimes for the evolution of the neuron membrane potential: the “integrate” regime and the “fire” regime...

55 |
Simulation of networks of spiking neurons: A review of tools and strategies
- Brette, Rudolph, et al.
Citation Context: ...to a mean-field model where γ(t, [ω̃]^t) is replaced by its average. This corresponds to what is called “current based” synapses instead of “conductance based” synapses in the literature (see e.g. [2]). 3. Model III (20): approximation with a fixed γ and simplified synapses. The evolution equation of the membrane potentials is given by: V_k(t + 1) = 〈γ〉 V_k(t)[1 − Z(V_k(t))] + Σ_j G^±_ij Z(V_j(t − δ^±))...

42 |
Real-time computation at the edge of chaos in recurrent neural networks
- Bertschinger, Natschläger
- 2004
Citation Context: ...horizon, and to the neural network’s capability of producing distinct spike trains. In other words, this is a way to measure the ability of the system to emulate different input-output functions. See [27, 1] for a discussion of the link between the system dynamics and its related computational complexity. The smaller d(Ω, S), the larger is the set of distinct spike trains that the neural network is a...

28 |
A discrete time neural network model with spiking neurons II. Dynamics with noise
- Cessac
Citation Context: ...V_k(t + 1) = γ V_k(t)[1 − Z(V_k(t))] + Σ_{j=1}^{N} W_kj Z(V_j(t)) + i_k^(ext). This model has been proposed as the Beslon-Mazet-Soula model in [39]. A mathematical analysis of its asymptotic dynamics has been done in [5], and we extend these results in the present paper. Note that having constant conductances leads to a dynamics which is independent of the past firing times (raster plot). In fact, the dynamics is ess...
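
The update rule quoted in this citation context lends itself to a quick numerical illustration of the abstract's claim that the asymptotic dynamics is composed of periodic orbits: simulate a small deterministic network and search the raster plot for an eventual period. This is an illustrative sketch, not code from the paper; the weights, inputs and the helper `eventual_period` are arbitrary choices of mine.

```python
import numpy as np

# Tiny deterministic BMS-type network:
# V_k(t+1) = gamma*V_k(t)*(1 - Z(V_k(t))) + sum_j W_kj*Z(V_j(t)) + I_k,
# with Z(V) = 1 if V >= theta else 0 (firing resets V to 0).
gamma, theta, T = 0.8, 1.0, 400
W = np.array([[ 0.0, 0.6, -0.2],
              [ 0.5, 0.0,  0.4],
              [-0.3, 0.7,  0.0]])
I = np.array([0.4, 0.3, 0.5])
V = np.array([0.1, 0.5, 0.9])
rows = []
for t in range(T):
    Z = (V >= theta).astype(float)
    rows.append(Z.copy())
    V = gamma * V * (1.0 - Z) + W @ Z + I
raster = np.array(rows, dtype=int)

def eventual_period(raster, transient=200):
    """Smallest p such that the post-transient raster is exactly p-periodic."""
    tail = raster[transient:]
    for p in range(1, len(tail) // 2 + 1):
        if (tail[p:] == tail[:-p]).all():
            return p
    return None

print(eventual_period(raster))  # prints 2 for these particular parameters
```

After a short transient, this particular network settles onto a period-2 firing pattern, consistent with the finitely-many-periodic-orbits picture; near the edge of chaos the period can instead become very large.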

19 |
Increase in complexity in random neural networks
- Cessac
- 1995
Citation Context: ...the situation is even more complex when γ ≠ 0. It is likely that we have in fact a situation very similar to discrete-time neural networks with firing rates, where a similar analogy has been exhibited [3, 4]. [Figure 4: Average current I for models I (top left), II (top right), III (bottom left) and IV (bottom right), with σ ∈ [0.01, 1], τ_L ∈ [10, 40] ms.] ...the dynamics; for σ = 0.05, where d(Ω, S) is “lar...

16 | Spontaneous dynamics of asymmetric random recurrent spiking neural networks
- Soula, Beslon, et al.
Citation Context: ...te on alternative II, though the mathematical results can be extended to alternative I in a straightforward way. This corresponds to the initial choice of the Beslon-Mazet-Soula model motivating the paper [39] and the present work. In this case, the reset corresponds to: V_k(t) = θ ⇒ V_k(t + δ) = J_k(t, [ω̃]^t), (19) (recall that V_reset = 0). The Integrate and Fire regime can now be included in a unique equatio...

15 |
Transition to topological chaos for circle maps
- MacKay, Tresser
- 1986
Citation Context: ...where the entropy is positive (see [25, 26] and the discussion below). The set of points where the entropy is positive can have a fractal structure even in the simplest examples of one-dimensional maps [30, 16]. Therefore, there is no hope of characterizing E rigorously in the near future. Instead, we shall use below a numerical characterization. The edge of chaos is a small set in the BMS model, and the same c...

13 | From neuron to neural networks dynamics
- Cessac, Samuelides
Citation Context: ...in the asymptotics (e.g. how it acts on d(Ω, S)). This question can be addressed by combining dynamical systems approaches, probabilistic methods and mean-field approaches from statistical physics (see [6, 35] for an example of such a combination applied to neural networks). A detailed description of this aspect will be developed in a separate work [7]. In the present paper we restrict ourselves to the following numer...

12 |
Multivalley Structure in Kauffman’s Model: Analogy with Spin Glasses, J. Phys. A: Math. Gen.
- Derrida, Flyvbjerg
- 1986

12 |
Methods in Neuronal Modelling: From Ions to Networks
- Koch, Segev (editors)
- 1998
Citation Context: ...preparations are less subject to these restrictions, but it is still difficult to design specific neuronal structures in order to investigate the role of such systems regarding information processing [24]. In this context one is often led to propose models sufficiently close to neuronal networks to keep essential biological features, but also sufficiently simplified to achieve a characterization of...

11 |
Analytical integrate and fire neuron models with conductance-based dynamics for event driven simulation strategies
- Rudolph, Destexhe
Citation Context: ...[40], in Integrate and Fire models conductances and currents depend on V only via the previous firing times of the neurons. This corresponds to the so-called conductance based Integrate and Fire models [34]. Namely, conductances (and currents) have the general form g_k ≡ g_k(t, {t_j^(n)}_t), where t_j^(n) is the n-th firing time of neuron j and {t_j^(n)}_t denotes the firing times of all neurons up to time t. Let us give two examples...

11 |
A Mathematical Analysis of the Effects of Hebbian Learning
- SIRI, BERRY, et al.
- 2008
Citation Context: ...largest learning capacities when this entropy is large. This aspect will be developed in a separate paper. (For the effect of Hebbian learning and entropy reduction in firing-rate neural networks see [37].) Finally, a positive effective entropy means that the system essentially behaves like a chaotic system during the time of the experiment. Indeed, the entropy is closely related to the distance d(Ω,...

10 |
Dynamical Phase Transitions in Non-Symmetric Spin Glasses
- Derrida
- 1987
- Derrida
Citation Context: ...In relation with these works, theoretical results by Derrida and co-authors [12, 11] allow one to characterize analytically the dynamics of random Boolean networks and of networks of threshold elements [10]. Recently, [1] have contributed to this question, considering numerical experiments in the context of real-time computation with recurrent neural networks. Fix a real number θ ∈ [V_min, V_max] called...

10 |
Effects of Hebbian learning on the dynamics and structure of random networks with inhibitory and excitatory neurons, Journal of Physiology-Paris
- SIRI, BERRY, et al.
- 2007
Citation Context: ...highly correlated, via synaptic plasticity mechanisms. What the effect of e.g. STDP or Hebbian learning on the effective entropy will be is a perspective for future work. Recent results in [38] and [37, 36] suggest that synaptic plasticity reduces the entropy by diminishing the variability of raster plots and increasing the robustness of the response to an input. Some general (variational) mechanism cou...

8 |
Spikes: Exploring the Neural Code. The M.I.T. Press
- Rieke, Warland, et al.
- 1996
Citation Context: ...Neuronal networks have the capacity to treat incoming information, performing complex computational tasks (see [33] for a deep review), including sensory-motor tasks. It is a crucial challenge to understand how this information is encoded and transformed. However, when considering in vivo neuronal networks, one is...

7 |
Occurrence of chaos and at line in random neural networks
- Cessac
- 1994
Citation Context: ...the situation is even more complex when γ ≠ 0. It is likely that we have in fact a situation very similar to discrete-time neural networks with firing rates, where a similar analogy has been exhibited [3, 4]. [Figure 4: Average current I for models I (top left), II (top right), III (bottom left) and IV (bottom right), with σ ∈ [0.01, 1], τ_L ∈ [10, 40] ms.] ...the dynamics; for σ = 0.05, where d(Ω, S) is “lar...

7 |
A piece-wise affine contracting map with positive entropy, Discrete Contin
- Kruglikov, Rypdal
Citation Context: ...in the simplest examples (e.g., the BMS model with Laplacian couplings [8]). There are good reasons to believe that this definition coincides with the set of points where the entropy is positive (see [25, 26] and the discussion below). The set of points where the entropy is positive can have a fractal structure even in the simplest examples of one-dimensional maps [30, 16]. Therefore, there is no hope to char...

7 |
Random recurrent neural networks
- Samuelides, Cessac
Citation Context: ...in the asymptotics (e.g. how it acts on d(Ω, S)). This question can be addressed by combining dynamical systems approaches, probabilistic methods and mean-field approaches from statistical physics (see [6, 35] for an example of such a combination applied to neural networks). A detailed description of this aspect will be developed in a separate work [7]. In the present paper we restrict ourselves to the following numer...

6 |
Dynamic Patterns in Complex Systems, chapter Adaptation toward the edge of chaos
- Packard
- 1988
Citation Context: ...This has led to the idea of computation at the edge of chaos. Early evidence for this hypothesis has been reported by Kauffman [22] and Langton [27] considering cellular automata behavior, and Packard [32] using a genetic algorithm. See [1] for a review. In relation with these works, theoretical results by Derrida and co-authors [12, 11] allow one to characterize analytically the dynamics of random Boolea...

6 |
Dynamique et plasticité dans les réseaux de neurones à impulsions [Dynamics and plasticity in spiking neural networks]
- Soula
- 2005
- Soula
Citation Context: ...in reality highly correlated, via synaptic plasticity mechanisms. What the effect of e.g. STDP or Hebbian learning on the effective entropy will be is a perspective for future work. Recent results in [38] and [37, 36] suggest that synaptic plasticity reduces the entropy by diminishing the variability of raster plots and increasing the robustness of the response to an input. Some general (variational)...

5 |
From synaptic rumours to low-level perception: an intracellular view of visual cortical dynamics
- Frégnac
- 2004
Citation Context: ...involved. Also, in vivo neuronal systems are not isolated objects; they are subject to strong interactions coming from the action of the external world, which hinder the study of a specific mechanism [15]. In vitro preparations are less subject to these restrictions, but it is still difficult to design specific neuronal structures in order to investigate the role of such systems regarding information...

5 |
Entropy via multiplicity, Discrete Contin
- Kruglikov, Rypdal
Citation Context: ...in the simplest examples (e.g., the BMS model with Laplacian couplings [8]). There are good reasons to believe that this definition coincides with the set of points where the entropy is positive (see [25, 26] and the discussion below). The set of points where the entropy is positive can have a fractal structure even in the simplest examples of one-dimensional maps [30, 16]. Therefore, there is no hope to char...

3 |
Le chaos, théorie et expériences [Chaos: theory and experiments]
- Gambaudo, Tresser
- 1988
Citation Context: ...where the entropy is positive (see [25, 26] and the discussion below). The set of points where the entropy is positive can have a fractal structure even in the simplest examples of one-dimensional maps [30, 16]. Therefore, there is no hope of characterizing E rigorously in the near future. Instead, we shall use below a numerical characterization. The edge of chaos is a small set in the BMS model, and the same c...

2 | Are biological neurons that vicious? Or only their models? Neural Computation
- Viéville, Cessac
- 2007
Citation Context: ...simplified models can lose important biological features. Besides, sharp simplifications may reveal exotic properties which are in fact induced by the model itself but do not exist in the real system [40]. This last aspect is particularly crucial in theoretical neuroscience. Having obtained a “reasonable” model whose dynamics can be well characterized, it is always tempting to extrapolate to biological...

1 |
Edge of chaos in integrate and fire models: a mean field approach
- Cessac, Touboul
- 2007
Citation Context: ...periodic with typical periods compatible with simulation times. This manifold can be characterized in the case where the synaptic weights are independent and identically distributed with variance σ²/N [7]. In the BMS model (e.g., the time-discretized gIF model with constant conductances) it can be proved that the chaotic situation is non-generic [5]. We now develop the same lines of investigation as in [5] an...

1 |
Edge of chaos in a simple neural network
- Cessac, Vasquez
- 2007
Citation Context: ...set of points E in the parameter space H where d(Ω, S) = 0. The topological structure of E can be quite complicated, as we checked in the simplest examples (e.g., the BMS model with Laplacian couplings [8]). There are good reasons to believe that this definition coincides with the set of points where the entropy is positive (see [25, 26] and the discussion below). The set of points where the entropy is pos...

1 |
Ergodic Theory of Chaos and Strange Attractors
- Eckmann, Ruelle
- 1985
Citation Context: ...the basin of A is the open set B = ∪_{t≥0} F^{−t}(U). The notion of attracting set is very close to the notion of attractor (though the definition of attractor requires an additional property of indecomposability [14]). This is therefore somehow a natural notion, since this is the set on which trajectories are attracted and where the asymptotic dynamics lives... in the “good cases”. Indeed, A may be empty. A mo...