Results 1 - 10 of 303
Synaptic Basis of Cortical Persistent Activity: the Importance of NMDA Receptors to Working Memory
- J. Neurosci, 1999
Cited by 149 (18 self)
Abstract:
In this paper I present a network model of spiking neurons in which synapses are endowed with realistic gating kinetics, based on experimentally measured dynamical properties of cortical synapses. I will focus on how delay-period activity could be generated by neuronally plausible mechanisms; the issue of memory field formation will be addressed in a separate study. A main problem to be investigated is that of "rate control" for a persistent state: if robust persistent activity necessitates strong recurrent excitatory connections, how can the network be prevented from runaway excitation in spite of the powerful positive feedback, so that neuronal firing rates remain low and comparable to those of PFC cells (10–50 Hz)? Moreover, a persistent state may be destabilized by network dynamics: for example, fast recurrent excitation followed by slower negative feedback may lead to network instability and a collapse of the persistent state. It is shown that persistent states at low firing rates are usually stable only in the presence of sufficiently slow excitatory synapses of the NMDA type. Functional implications of these results for the role of …
Received April 14, 1999; revised Aug. 12, 1999; accepted Aug. 12, 1999.
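The persistent-activity idea in this abstract can be illustrated with a minimal firing-rate sketch (not the paper's spiking model; all parameters below are illustrative): with strong enough recurrent excitation, a sigmoidal rate equation is bistable, so a brief cue switches the network from a low-rate spontaneous state to an elevated state that outlasts the cue.

```python
import numpy as np

def f(x):
    # sigmoidal transfer function (illustrative gain and threshold)
    return 1.0 / (1.0 + np.exp(-(x - 3.0)))

def simulate(w=6.0, tau=0.1, I_stim=2.0, stim=(0.5, 1.0), T=3.0, dt=0.001):
    """Rate model tau*dr/dt = -r + f(w*r + I(t)), where I(t) is a brief cue."""
    n = int(T / dt)
    r = np.zeros(n)
    for i in range(1, n):
        t = i * dt
        I = I_stim if stim[0] <= t < stim[1] else 0.0
        r[i] = r[i-1] + dt * (-r[i-1] + f(w * r[i-1] + I)) / tau
    return r

r = simulate()
# r stays near the low fixed point before the cue and remains elevated
# long after the cue is withdrawn (delay-period activity)
```

With weaker recurrence (e.g. w = 3) the high fixed point disappears and the rate decays back after the cue, which is the sense in which strong recurrent excitation is needed; the paper's point is that in a spiking network such strong feedback remains stable at low rates only when the recurrent synapses are slow, NMDA-like.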
What determines the frequency of fast network oscillations with irregular neural discharges? I. Synaptic dynamics and excitation-inhibition balance
- J Neurophysiol 90: 415–430, 2003
Cited by 130 (7 self)
Abstract:
When the local field potential of a cortical network displays coherent fast oscillations (~40-Hz gamma or ~200-Hz sharp-wave ripples), the spike trains of constituent neurons are typically irregular and sparse. The dichotomy between a rhythmic local field and stochastic spike trains presents a challenge to the theory of brain rhythms in the framework of coupled oscillators. Previous studies have shown that when noise is large and recurrent inhibition is strong, a coherent network rhythm can be generated while single neurons fire intermittently at low rates compared to the frequency of the oscillation. However, these studies used synaptic kinetics too simplified to allow quantitative predictions of the population rhythm's frequency. Here we show how to derive quantitatively the coherent …
Synchronization in networks of excitatory and inhibitory neurons with sparse, random connectivity
- Neural Computation, 2003
Cited by 75 (9 self)
Abstract:
In model networks of E-cells and I-cells (excitatory and inhibitory neurons), synchronous rhythmic spiking often comes about from the interplay between the two cell groups: the E-cells synchronize the I-cells and vice versa. Under ideal conditions (homogeneity in relevant network parameters and all-to-all connectivity, for instance), this mechanism can yield perfect synchronization.
Neocortical pyramidal cells respond as integrate-and-fire neurons to in vivo-like input currents
, 2003
Advancing the Boundaries of High-Connectivity Network Simulation with Distributed Computing
, 2005
Cited by 63 (23 self)
Abstract:
The availability of efficient and reliable simulation tools is one of the mission-critical technologies in the fast-moving field of computational neuroscience. Research indicates that higher brain functions emerge from large and complex cortical networks and their interactions. The large number of elements (neurons), combined with the high connectivity (synapses) of the biological network and the specific type of interactions, imposes severe constraints on the explorable system size that previously have been hard to overcome. Here we present a collection of new techniques, combined into a coherent simulation tool, removing the fundamental obstacle in the computational study of biological neural networks: the enormous number of synaptic contacts per neuron. Distributing an individual simulation over multiple computers enables the investigation of networks orders of magnitude larger than previously possible. …
Firing Rate of the Noisy Quadratic Integrate-and-Fire Neuron
, 2003
Cited by 48 (4 self)
Abstract:
We calculate the firing rate of the quadratic integrate-and-fire neuron in response to a colored noise input current. Such an input current is a good approximation to the noise due to the random bombardment of spikes, with the correlation time of the noise corresponding to the decay time of the synapses. The key parameter that determines the firing rate is the ratio of the correlation time of the colored noise, τs, to the neuronal time constant, τm. We calculate the firing rate exactly in two limits: when the ratio τs/τm goes to zero (white noise) and when it goes to infinity. The correction to the short correlation time limit is O(τs/τm), which is qualitatively different from that of the leaky integrate-and-fire neuron, where the correction is O(√(τs/τm)). The difference is due to the different boundary conditions of the probability density function of the membrane potential of the neuron at firing threshold. The correction to the long correlation time limit is O(τm/τs). By combining the short and long correlation time limits, we derive an expression that provides a good approximation to the firing rate over the whole range of τs/τm in the suprathreshold regime, that is, in a regime in which the average current is sufficient to make the cell fire. In the subthreshold regime, the expression breaks down somewhat when τs becomes large compared to τm.
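The setup analyzed in this abstract, a quadratic integrate-and-fire neuron driven by Ornstein-Uhlenbeck (colored) noise with correlation time τs, can be sketched with a Monte-Carlo simulation. The finite threshold and reset below are illustrative stand-ins for the ±∞ used in the theory, and all parameter values are assumptions, not the paper's:

```python
import numpy as np

def qif_rate(mu=1.0, sigma=0.5, tau_m=0.01, tau_s=0.002,
             v_th=10.0, v_reset=-10.0, T=20.0, dt=1e-4, seed=0):
    """Estimate the firing rate (Hz) of a QIF neuron,
    tau_m dv/dt = v^2 + mu + eta(t), driven by OU noise eta
    with standard deviation sigma and correlation time tau_s."""
    rng = np.random.default_rng(seed)
    v, eta, spikes = 0.0, 0.0, 0
    for _ in range(int(T / dt)):
        # Euler-Maruyama step of the OU process (colored noise)
        eta += dt * (-eta) / tau_s \
               + sigma * np.sqrt(2.0 * dt / tau_s) * rng.standard_normal()
        v += dt * (v * v + mu + eta) / tau_m
        if v >= v_th:
            v = v_reset
            spikes += 1
    return spikes / T
```

In the suprathreshold regime (mu > 0) the noiseless rate is √mu / (τm·(arctan(v_th/√mu) − arctan(v_reset/√mu))), about 34 Hz for these values; sweeping tau_s against tau_m explores the short- and long-correlation-time limits the paper treats analytically.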
Spike-Frequency Adaptation of a Generalized Leaky Integrate-and-Fire Model Neuron
- Journal of Computational Neuroscience, 2001
Cited by 43 (2 self)
Abstract:
Although spike-frequency adaptation is a commonly observed property of neurons, its functional implications are still poorly understood. In this work, using a leaky integrate-and-fire neural model that includes a Ca2+-activated K+ current (I_AHP), we develop a quantitative theory of adaptation temporal dynamics and compare our results with recent in vivo intracellular recordings from pyramidal cells in the cat visual cortex. Experimentally testable relations between the degree and the time constant of spike-frequency adaptation are predicted. We also contrast the I_AHP model with an alternative adaptation model based on a dynamical firing threshold. Possible roles of adaptation in temporal computation are explored, as a time-delayed neuronal self-inhibition mechanism. Our results include the following: (1) given the same firing rate, the variability of interspike intervals (ISIs) is either reduced or enhanced by adaptation, depending on whether the I_AHP dynamics is fast or slow compared with the mean ISI in the output spike train; (2) when the inputs are Poisson-distributed (uncorrelated), adaptation generates temporal anticorrelation between ISIs; we suggest that measurement of this negative correlation provides a probe to assess the strength of I_AHP in vivo; (3) the forward masking effect produced by the slow dynamics of I_AHP is nonlinear and effective at selecting the strongest input among competing sources of input signals.
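A reduced version of this mechanism, a leaky integrate-and-fire neuron with a spike-triggered adaptation current that decays slowly (a stand-in for I_AHP; the real current is gated by calcium, which this sketch collapses into a single variable), reproduces the basic effect: ISIs lengthen over the spike train. Parameters are illustrative, not taken from the paper:

```python
import numpy as np

def lif_ahp(I=2.0, tau_m=0.02, tau_ahp=0.1, b=0.3,
            v_th=1.0, v_reset=0.0, T=1.0, dt=1e-4):
    """LIF with adaptation current w: tau_m dv/dt = -v + I - w,
    tau_ahp dw/dt = -w, and w -> w + b at each spike."""
    v, w, spikes = 0.0, 0.0, []
    for i in range(int(T / dt)):
        v += dt * (-v + I - w) / tau_m
        w += dt * (-w) / tau_ahp
        if v >= v_th:
            v = v_reset
            w += b                 # spike-triggered AHP increment
            spikes.append(i * dt)
    return np.diff(spikes)

isis = lif_ahp()
# ISIs lengthen as w accumulates, then settle at an adapted steady state
```

Prediction (2) of the abstract can be probed on this model by replacing the constant drive with Poisson input and measuring the correlation between successive ISIs.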
Effects of noisy drive on rhythms in networks of excitatory and inhibitory neurons
- Neural Comp, 2005
Cited by 33 (4 self)
Abstract:
Synchronous rhythmic spiking in neuronal networks can be brought about by the interaction between E-cells and I-cells (excitatory and inhibitory cells): the I-cells gate and synchronize the E-cells, and the E-cells drive and synchronize the I-cells. We refer to rhythms generated in this way as "PING" (Pyramidal-Interneuronal Gamma) rhythms. The PING mechanism requires that the drive I_I to the I-cells be sufficiently low; the rhythm is lost when I_I gets too large. This can happen in (at least) two different ways. In the first mechanism, the I-cells spike in synchrony but get ahead of the E-cells, spiking without being prompted by the E-cells; we call this phase walkthrough of the I-cells. In the second mechanism, the I-cells fail to synchronize, and their activity leads to complete suppression of the E-cells. Noisy spiking in the E-cells, generated by noisy external drive, adds excitatory drive to the I-cells and may lead to phase walkthrough. Noisy spiking in the I-cells adds inhibition to the E-cells and may lead to suppression of the E-cells. An analysis of the conditions under which noise leads to phase walkthrough of the I-cells or suppression of the E-cells shows that PING rhythms at frequencies far below the gamma range are robust to noise only if network parameter values are tuned very carefully. Together with an argument explaining why the PING mechanism …
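The two regimes in this abstract can be caricatured with a single E cell and a single I cell standing in for the synchronized populations (a drastic simplification of the paper's network analysis; exponential current-based synapses and all parameter values are illustrative assumptions). With weak drive to the I cell, every I spike is prompted by an E spike, the PING loop; with strong drive, the I cell fires on its own and the E cell is suppressed:

```python
import numpy as np

def ping_pair(I_E=1.5, I_I=0.2, g_EI=3.5, g_IE=2.0,
              tau_m=0.01, tau_s=0.005, T=0.5, dt=5e-5):
    """Leaky integrate-and-fire E and I cells coupled through
    exponentially decaying synaptic variables sE (E->I) and sI (I->E)."""
    vE = vI = 0.0
    sE = sI = 0.0
    spkE, spkI = [], []
    for i in range(int(T / dt)):
        t = i * dt
        vE += dt * (-vE + I_E - g_IE * sI) / tau_m   # I inhibits E
        vI += dt * (-vI + I_I + g_EI * sE) / tau_m   # E excites I
        sE += dt * (-sE) / tau_s
        sI += dt * (-sI) / tau_s
        if vE >= 1.0:
            vE = 0.0; sE += 1.0; spkE.append(t)
        if vI >= 1.0:
            vI = 0.0; sI += 1.0; spkI.append(t)
    return spkE, spkI

spkE, spkI = ping_pair()           # PING: one I spike per E spike
spkE2, spkI2 = ping_pair(I_I=2.0)  # strong I drive: E cell suppressed
```

In the first run the I cell is subthreshold on its own, so it fires only after each E spike; raising I_I well above threshold makes it fire tonically, and the resulting inhibition keeps the E cell silent, the suppression scenario described above.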
Towards reproducible descriptions of neuronal network models
- PLoS Comput Biol, 2009
Cited by 32 (5 self)
Abstract:
Progress in science depends on the effective exchange of ideas among scientists. New ideas can be assessed and criticized in a meaningful manner only if they are formulated precisely. This applies to simulation studies as well as to experiments and theories. But after more than 50 years of neuronal network simulations, we still lack a clear and common understanding of the role of computational models in neuroscience as well as established practices for describing network models in publications. This hinders the critical evaluation of network models as well as their re-use. We analyze here 14 research papers proposing neuronal network models of different complexity and find widely varying approaches to model descriptions, with regard to both the means of description and the ordering and placement of material. We further observe great variation in the graphical representation of networks and the notation used in equations. Based on our observations, we propose a good model description practice, composed of guidelines for the organization of publications, a checklist for model descriptions, templates for tables presenting model structure, and guidelines for diagrams of networks. The main purpose of this good practice is to trigger a debate about the communication of neuronal network models in a manner comprehensible to humans, as opposed to machine-readable model description languages. We believe that the good model description practice proposed here, together with a number of other recent initiatives on data-, model-, and software-sharing, may lead to a deeper and more fruitful exchange of ideas among computational neuroscientists in years to come.