Results 1–10 of 71
Dynamics of Sparsely Connected Networks of Excitatory and Inhibitory Spiking Neurons
, 1999
Abstract

Cited by 147 (11 self)

The dynamics of networks of sparsely connected excitatory and inhibitory integrate-and-fire neurons is studied analytically. The analysis reveals a very rich repertoire of states, including: synchronous states in which neurons fire regularly; asynchronous states with stationary global activity and very irregular individual cell activity; and states in which the global activity oscillates but individual cells fire irregularly, typically at rates lower than the global oscillation frequency. The network can switch between these states, provided the external frequency, or the balance between excitation and inhibition, is varied. Two types of network oscillations are observed: in the 'fast' oscillatory state, the network frequency is almost fully controlled by the synaptic time scale; in the 'slow' oscillatory state, the network frequency depends mostly on the membrane time constant. Finite-size effects in the asynchronous state are also discussed.
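The class of network described in this abstract can be sketched in a few lines. The following is a minimal illustrative simulation of a sparsely connected excitatory/inhibitory integrate-and-fire network; all parameters and the simple Euler scheme are my own assumptions for the sketch, not the paper's calibration:

```python
import numpy as np

# Illustrative sparse E/I integrate-and-fire network (hypothetical
# parameters; Euler integration with instantaneous delta-pulse synapses).
rng = np.random.default_rng(0)

N_E, N_I = 400, 100                    # excitatory / inhibitory neurons
N = N_E + N_I
eps = 0.1                              # connection probability (sparse)
J, g = 0.1, 5.0                        # EPSP size, relative inhibition strength
tau_m, V_th, V_reset = 20.0, 1.0, 0.0  # membrane constant (ms), threshold, reset
I_ext = 1.2                            # constant external drive (suprathreshold)
dt, T = 0.1, 200.0                     # time step and duration (ms)

# Sparse random weights; column j holds the outgoing weights of neuron j.
W = (rng.random((N, N)) < eps) * J
W[:, N_E:] *= -g                       # inhibitory columns

V = rng.random(N) * V_th               # random initial membrane potentials
spike_counts = np.zeros(N)

for _ in range(int(T / dt)):
    fired = V >= V_th
    spike_counts += fired
    V[fired] = V_reset
    # leak + external drive + recurrent input from this step's spikes
    V = V + dt / tau_m * (-V + I_ext) + W @ fired

rates = spike_counts / (T / 1000.0)    # spikes/s per neuron
print(f"mean firing rate: {rates.mean():.1f} Hz")
```

Varying `I_ext` and `g` in such a sketch is the crude analogue of the state switching the abstract describes; the analytical state diagram itself requires the paper's mean-field treatment.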
Population Dynamics of Spiking Neurons: Fast Transients, Asynchronous States, and Locking
 NEURAL COMPUTATION
, 2000
Abstract

Cited by 134 (25 self)

An integral equation describing the time evolution of the population activity in a homogeneous pool of spiking neurons of the integrate-and-fire type is discussed. It is shown analytically that transients from a state of incoherent firing can be immediate. The stability of incoherent firing is analyzed in terms of the noise level and transmission delay, and a bifurcation diagram is derived. The response of a population of noisy integrate-and-fire neurons to an input current of small amplitude is calculated and characterized by a linear filter L. The stability of perfectly synchronized 'locked' solutions is analyzed.
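The claim that population transients can be immediate is easy to probe with a direct Monte Carlo sketch: simulate many noisy integrate-and-fire neurons and watch the population activity respond to a step in the input current. All parameters below are illustrative assumptions, not values from the paper:

```python
import numpy as np

# Monte Carlo sketch: population activity of noisy integrate-and-fire
# neurons before and after a step in the input current.
rng = np.random.default_rng(1)

N = 5000                           # neurons in the homogeneous pool
tau, V_th = 20.0, 1.0              # membrane time constant (ms), threshold
dt, T = 0.1, 400.0                 # time step, duration (ms)
I0, I1, t_step = 0.8, 1.5, 200.0   # sub- then suprathreshold drive
sigma = 0.3                        # noise amplitude

steps = int(T / dt)
V = rng.random(N) * V_th
A = np.zeros(steps)                # population activity (Hz)

for i in range(steps):
    I = I0 if i * dt < t_step else I1
    noise = sigma * np.sqrt(dt / tau) * rng.standard_normal(N)
    V += dt / tau * (-V + I) + noise
    fired = V >= V_th
    A[i] = fired.mean() / (dt / 1000.0)   # fraction firing per bin -> Hz
    V[fired] = 0.0

pre = A[: int(t_step / dt)].mean()
post = A[int(t_step / dt):].mean()
print(f"activity before step: {pre:.1f} Hz, after: {post:.1f} Hz")
```

Binning `A` finely around `t_step` shows the population rate reacting on the time scale of a few bins, much faster than the single-neuron membrane time constant, which is the qualitative point the integral-equation analysis makes rigorous.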
Computational analysis of the role of the hippocampus in memory
 Hippocampus
, 1994
Abstract

Cited by 134 (14 self)

The authors draw together the results of a series of detailed computational studies and show how they are contributing to the development of a theory of hippocampal function. A new part of the theory introduced here is a quantitative analysis of how backprojections from the hippocampus to the neocortex could lead to the recall of recent memories. The theory is then compared with other theories of hippocampal function. First, what is computed by the hippocampus is considered. The hypothesis the authors advocate, on the basis of the effects of damage to the hippocampus and neuronal activity recorded in it, is that it is involved in the formation of new memories by acting as an intermediate-term buffer store for information about episodes, particularly for spatial, but probably also for some nonspatial, information. The authors analyze how the hippocampus could perform this function by producing a computational theory of how it operates, based on neuroanatomical and neurophysiological information about the different neuronal systems contained within the hippocampus. Key hypotheses are that the CA3 pyramidal cells operate as a single autoassociation network to store new episodic information as it arrives via a number of specialized preprocessing stages from many association areas of the cerebral cortex, and that the dentate ...
Synaptic Basis of Cortical Persistent Activity: the Importance of NMDA Receptors to Working Memory
 J. Neurosci
, 1999
Abstract

Cited by 103 (15 self)

In this paper I present a network model of spiking neurons in which synapses are endowed with realistic gating kinetics, based on experimentally measured dynamical properties of cortical synapses. I focus on how delay-period activity could be generated by neuronally plausible mechanisms; the issue of memory-field formation will be addressed in a separate study. A main problem to be investigated is that of "rate control" for a persistent state: if robust persistent activity necessitates strong recurrent excitatory connections, how can the network be prevented from runaway excitation in spite of the powerful positive feedback, so that neuronal firing rates are low and comparable to those of PFC cells (10–50 Hz)? Moreover, a persistent state may be destabilized by network dynamics. For example, fast recurrent excitation followed by slower negative feedback may lead to network instability and a collapse of the persistent state. It is shown that persistent states at low firing rates are usually stable only in the presence of sufficiently slow excitatory synapses of the NMDA type. Functional implications of these results for the role of ...
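The core mechanism, persistent activity stabilized by slow recurrent excitation, can be caricatured with a one-variable rate model in which a slow NMDA-like synaptic gating variable feeds back through a steep transfer function. This is a hypothetical toy, not the paper's spiking network, and its transfer function and time constants are chosen only to exhibit bistability:

```python
import numpy as np

# Toy bistable rate model: slow NMDA-like gating variable s with
# steep recurrent feedback. A transient cue switches the system from
# a low-activity state to a self-sustained high-activity state.

def f(x):
    """Steep sigmoidal transfer function (arbitrary units)."""
    return 1.0 / (1.0 + np.exp(-10.0 * (x - 0.5)))

tau_s = 100.0          # slow synaptic time constant (ms)
dt = 1.0
s = 0.0
trace = []
for t in range(3000):
    cue = 1.0 if 500 <= t < 700 else 0.0   # transient input pulse
    s += dt / tau_s * (-s + f(s + cue))
    trace.append(s)

print(f"activity before cue: {trace[499]:.3f}, long after cue: {trace[-1]:.3f}")
```

Shrinking `tau_s` in a spiking implementation of the same loop is what destabilizes the high state; the toy only shows why a memory state can outlive its cue when the feedback variable is slow.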
Dynamics of learning and recall at excitatory recurrent synapses and cholinergic modulation in rat hippocampal region CA3
 J. Neurosci
, 1995
Abstract

Cited by 83 (10 self)

Hippocampal region CA3 contains strong recurrent excitation mediated by synapses of the longitudinal association fibers. These recurrent excitatory connections may play a dominant role in determining the information-processing characteristics of this region. However, they result in feedback dynamics that may cause both runaway excitatory activity and runaway synaptic modification. Previous models of recurrent excitation have prevented unbounded activity using biologically unrealistic techniques. Here, the activation of feedback inhibition is shown to prevent unbounded activity, allowing stable activity states during recall and learning. In the model, cholinergic suppression of synaptic transmission at excitatory feedback synapses is shown to determine the extent to which activity depends upon new features of the afferent input versus components ...
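The role of feedback inhibition in bounding recurrent activity during recall can be illustrated with a toy binary autoassociative network in which inhibition is abstracted as a k-winners-take-all constraint. Every detail below (network size, clipped-Hebbian rule, the k-WTA shortcut) is an illustrative assumption, not the authors' biophysical CA3 model:

```python
import numpy as np

# Toy autoassociator: recurrent excitation stores sparse binary
# patterns; "feedback inhibition" keeps only the k most excited units
# active, so total activity stays bounded at k during recall.
rng = np.random.default_rng(2)
N, P, k = 200, 5, 20           # units, stored patterns, active units per pattern

patterns = np.zeros((P, N))
for p in patterns:
    p[rng.choice(N, k, replace=False)] = 1.0

W = np.clip(patterns.T @ patterns, 0.0, 1.0)   # clipped Hebbian weights
np.fill_diagonal(W, 0.0)                       # no self-connections

def recall(cue, steps=5):
    x = cue.copy()
    for _ in range(steps):
        h = W @ x                  # recurrent excitation
        x = np.zeros(N)            # inhibition: keep top-k units only
        x[np.argsort(h)[-k:]] = 1.0
    return x

target = patterns[0]
cue = target.copy()
cue[np.flatnonzero(target)[: k // 2]] = 0.0    # degrade half the cue
overlap = (recall(cue) * target).sum() / k
print(f"overlap with stored pattern after recall: {overlap:.2f}")
```

Because exactly `k` units are active at every step, activity can neither run away nor die out, which is the functional point the abstract attributes to feedback inhibition.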
What determines the frequency of fast network oscillations with irregular neural discharges? I. Synaptic dynamics and excitation-inhibition balance
 J NEUROPHYSIOL 90: 415–430, 2003
Abstract

Cited by 64 (4 self)

When the local field potential of a cortical network displays coherent fast oscillations (~40 Hz gamma or ~200 Hz sharp-wave ripples), the spike trains of constituent neurons are typically irregular and sparse. The dichotomy between a rhythmic local field and stochastic spike trains presents a challenge to the theory of brain rhythms in the framework of coupled oscillators. Previous studies have shown that when noise is large and recurrent inhibition is strong, a coherent network rhythm can be generated while single neurons fire intermittently, at rates low compared to the frequency of the oscillation. However, these studies used oversimplified synaptic kinetics that do not allow quantitative predictions of the population rhythm's frequency. Here we show how to derive quantitatively the coherent ...
A population density approach that facilitates large-scale modeling of neural networks: Analysis and an application to orientation tuning
 J. Comp. Neurosci
, 2000
Abstract

Cited by 49 (1 self)

We explore a computationally efficient method of simulating realistic networks of neurons, introduced by Knight, Manin, and Sirovich (1996), in which integrate-and-fire neurons are grouped into large populations of similar neurons. For each population, we form a probability density that represents the distribution of neurons over all possible states. The populations are coupled via stochastic synapses in which the conductance of a neuron is modulated according to the firing rates of its presynaptic populations. The evolution equation for each of these probability densities is a partial differential-integral equation, which we solve numerically. Results obtained for several example networks are tested against conventional computations for groups of individual neurons. We apply this approach to modeling orientation tuning in the visual cortex. Our population density model is based on the recurrent feedback model of a hypercolumn in cat visual cortex of Somers et al. (1995). We simulate the response to oriented flashed bars. As in the Somers model, a weak orientation bias provided by feedforward lateral geniculate input is transformed by intracortical circuitry into sharper orientation tuning which is independent of stimulus contrast. The population density approach appears to be a viable method for simulating large neural networks. Its computational efficiency overcomes some of the restrictions imposed by computation time in individual ...
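As a minimal sketch of the population density idea, the following evolves the membrane-potential density of noise-free integrate-and-fire neurons under constant suprathreshold drive with a first-order upwind scheme, reinjecting the threshold flux at reset. The grid, parameters, and the omission of synaptic stochasticity are my simplifications of the method the abstract describes:

```python
import numpy as np

# Population density sketch: advect rho(v) for noise-free
# integrate-and-fire neurons; the flux crossing threshold is the
# population firing rate and is reinjected at the reset potential.
n = 200
v = np.linspace(0.0, 1.0, n, endpoint=False)   # grid on [V_reset, V_th)
dv = v[1] - v[0]
tau, mu = 20.0, 1.2         # membrane time constant (ms), mean drive > V_th
a = (mu - v) / tau          # drift velocity, positive everywhere
dt = 0.4 * dv / a.max()     # CFL-stable time step

rho = np.zeros(n)
rho[0] = 1.0 / dv           # all probability mass starts at reset

outflux = []
for _ in range(20000):
    flux = a * rho                       # upwind flux (drift > 0)
    out = flux[-1]                       # probability flux across threshold
    drho = np.empty(n)
    drho[0] = -(flux[0] - out) / dv      # reinjection at reset keeps mass = 1
    drho[1:] = -(flux[1:] - flux[:-1]) / dv
    rho += dt * drho
    outflux.append(out)

rate = np.mean(outflux[-5000:])          # time-averaged population rate (1/ms)
T_isi = tau * np.log(mu / (mu - 1.0))    # analytic interspike interval
print(f"density rate {rate:.4f}/ms vs analytic 1/T = {1.0 / T_isi:.4f}/ms")
```

The late-time averaged outflux approaches the analytic rate 1/T, and the whole population costs one PDE solve regardless of how many neurons it represents, which is the efficiency argument of the approach.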
Stationary Bumps in Networks of Spiking Neurons
Abstract

Cited by 46 (15 self)

Introduction: Neuronal activity due to recurrent excitation in the form of a spatially localized pulse, or bump, has been proposed as a mechanism for feature selectivity in models of the visual system (Somers, Nelson, & Sur, 1995; Hansel & Sompolinsky, 1998), the head-direction system (Skaggs, Knierim, Kudrimoti, & McNaughton, 1995; Zhang, 1996; Redish, Elga, & Touretzky, 1996), and working memory (Wilson & Cowan, 1973; Amit & Brunel, 1997; Camperi & Wang, 1998). Many of the previous mathematical formulations of such structures have employed population rate models (Wilson & Cowan, 1972, 1973; Amari, 1977; Kishimoto & Amari, 1979; Hansel & Sompolinsky, 1998). (See Ermentrout, 1998, for a recent review.) Here, we consider a network of spiking neurons that exhibits such structures and investigate their properties. In our network we find localized time-stationary states ...
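A stationary bump is easy to reproduce in the rate-model setting that this abstract contrasts with: a ring network with first-harmonic excitation and broad inhibition. This is an illustrative Amari / Hansel-Sompolinsky-style sketch with hypothetical parameters, not the spiking network of the paper:

```python
import numpy as np

# Ring rate model: local excitation + uniform inhibition sustains a
# spatially localized stationary bump of activity.
N = 100
x = np.linspace(-np.pi, np.pi, N, endpoint=False)
# connectivity kernel: uniform inhibition plus first-harmonic excitation
W = (-0.5 + 1.1 * np.cos(x[:, None] - x[None, :])) * (2 * np.pi / N)

def f(h):
    """Saturating threshold-linear transfer function."""
    return np.clip(h, 0.0, 1.0)

tau, dt, I = 10.0, 0.5, 0.2
r = 0.1 * np.maximum(0.0, np.cos(x))   # weak initial bump at x = 0
for _ in range(2000):
    r += dt / tau * (-r + f(W @ r + I))

center = np.angle((r * np.exp(1j * x)).sum())
print(f"bump peak {r.max():.2f}, baseline {r.min():.2f}, center {center:.2f} rad")
```

The weak initial bias is amplified by the unstable first-harmonic mode and pinned by rectification, leaving a localized time-stationary profile whose center marks the encoded feature.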
What Matters in Neuronal Locking?
Abstract

Cited by 46 (10 self)

Present and permanent address: Physik-Department der TU München. Exploiting local stability, we show what neuronal characteristics are essential to ensure that coherent oscillations are asymptotically stable in a spatially homogeneous network of spiking neurons. Under standard conditions, a necessary condition (and, in the limit of a large number of interacting neighbors, also a sufficient one) is that the postsynaptic potential is increasing in time as the neurons fire. If the postsynaptic potential is decreasing, oscillations are bound to be unstable. This is a kind of locking theorem, and it boils down to a subtle interplay of axonal delays, postsynaptic potentials, and refractory behavior. The theorem also allows for mixtures of excitatory and inhibitory interactions. On the basis of the locking theorem, we present a simple geometric method to verify the existence and local stability of a coherent oscillation.
The Number of Synaptic Inputs and the Synchrony of Large Sparse Neuronal Networks
, 1999
Abstract

Cited by 35 (1 self)

The prevalence of coherent oscillations in various frequency ranges in the central nervous system raises the question of the mechanisms that synchronize large populations of neurons. We study synchronization in models of large networks of spiking neurons with random sparse connectivity. Synchrony occurs only when the average number of synapses, M, that a cell receives is larger than a critical value, M_c. Below M_c, the system is in an asynchronous state. In the limit of weak coupling, assuming identical neurons, we reduce the model to a system of phase oscillators coupled via an effective interaction, Γ. In this framework, we develop an approximate theory for sparse networks of identical neurons to estimate M_c analytically from the Fourier coefficients of Γ. Our approach relies on the assumption that the dynamics of a neuron depend mainly on the number of cells that are presynaptic to it. We apply this theory to compute M_c for a model of inhibitory networks of integrate-and-fire (I&F) neurons as a function of the intrinsic neuronal properties (e.g., the refractory period T_r), the synaptic time constants, and the strength of the external stimulus, I_ext. The number M_c is found to be nonmonotonic in the strength of I_ext. For T_r = 0, we estimate the minimum value of M_c over all the parameters of the model to be 363.8. Above M_c, the neurons tend to fire in (1) smeared one-cluster states at high firing rates and (2) smeared two-or-more-cluster states at low firing rates. Refractoriness decreases M_c at intermediate and high firing rates. These results are compared against numerical simulations. We show numerically that systems with different sizes, N, behave in the same way provided the connectivity, M, is such that 1/M_eff = 1/...
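The phase-oscillator reduction described in this abstract can be sketched directly: identical oscillators, each driven by exactly M random presynaptic phases through an effective interaction Γ. Here Γ(φ) = -sin(φ), a pure first harmonic, is a placeholder assumption rather than the Γ derived in the paper, so this toy has no finite M_c; it only illustrates the framework and the order-parameter measure of synchrony:

```python
import numpy as np

# Sparse network of identical phase oscillators: each unit averages a
# placeholder interaction Gamma(phi) = -sin(phi) over exactly M random
# presynaptic partners; R measures global synchrony.
rng = np.random.default_rng(3)
N = 500

def order_parameter(theta):
    """Kuramoto order parameter R in [0, 1]; R = 1 is full synchrony."""
    return np.abs(np.exp(1j * theta).mean())

def simulate(M, K=1.0, dt=0.01, steps=4000):
    # each oscillator receives exactly M random presynaptic inputs
    pre = np.array([rng.choice(N, size=M, replace=False) for _ in range(N)])
    theta = rng.uniform(0.0, 2.0 * np.pi, N)
    for _ in range(steps):
        # interaction averaged over the M inputs of each oscillator
        coupling = np.sin(theta[pre] - theta[:, None]).mean(axis=1)
        theta += dt * (1.0 + K * coupling)   # intrinsic frequency = 1
    return order_parameter(theta)

R = simulate(M=50)
print(f"order parameter with M = 50: R = {R:.3f}")
```

With only a first harmonic and no heterogeneity, this toy synchronizes even at modest M; the finite critical connectivity M_c of the paper emerges from the full Fourier structure of Γ, which is exactly why the theory estimates M_c from those coefficients.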