Results 1–10 of 89
Synaptic Basis of Cortical Persistent Activity: the Importance of NMDA Receptors to Working Memory
J. Neurosci., 1999
Cited by 103 (15 self)
Abstract: In this paper I present a network model of spiking neurons in which synapses are endowed with realistic gating kinetics, based on experimentally measured dynamical properties of cortical synapses. I will focus on how delay-period activity could be generated by neuronally plausible mechanisms; the issue of memory field formation will be addressed in a separate study. A main problem to be investigated is that of "rate control" for a persistent state: if robust persistent activity necessitates strong recurrent excitatory connections, how can the network be prevented from runaway excitation in spite of the powerful positive feedback, so that neuronal firing rates are low and comparable to those of PFC cells (10–50 Hz)? Moreover, a persistent state may be destabilized by the network dynamics. For example, fast recurrent excitation followed by a slower negative feedback may lead to network instability and a collapse of the persistent state. It is shown that persistent states at low firing rates are usually stable only in the presence of sufficiently slow excitatory synapses of the NMDA type. Functional implications of these results for the role of ...
Received April 14, 1999; revised Aug. 12, 1999; accepted Aug. 12, 1999
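The abstract's central point, that slow NMDA-type kinetics sustain recurrent drive between spikes while fast AMPA-type kinetics do not, can be illustrated with a minimal sketch. The first-order gating model, the saturating jump of 0.5 per spike, and the 2 ms vs. 100 ms time constants are illustrative assumptions, not the paper's actual synapse model:

```python
def mean_gating(spike_times, tau, t_end=500.0, dt=0.1):
    """Integrate ds/dt = -s/tau with s -> s + 0.5*(1 - s) at each
    presynaptic spike; return the time-averaged gating variable,
    a proxy for how much recurrent drive the synapse sustains."""
    s, s_sum, n_steps, i = 0.0, 0.0, 0, 0
    spikes = sorted(spike_times)
    t = 0.0
    while t < t_end:
        while i < len(spikes) and spikes[i] <= t:
            s += 0.5 * (1.0 - s)   # saturating jump on each spike
            i += 1
        s += dt * (-s / tau)       # exponential decay between spikes
        s_sum += s
        n_steps += 1
        t += dt
    return s_sum / n_steps

train = [50.0 * k for k in range(10)]   # 20 Hz presynaptic train (times in ms)
ampa = mean_gating(train, tau=2.0)      # fast, AMPA-like decay (~2 ms)
nmda = mean_gating(train, tau=100.0)    # slow, NMDA-like decay (~100 ms)
print(ampa, nmda)
```

With the slow time constant the gating variable stays elevated between spikes, so the same presynaptic rate delivers far more sustained drive.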
Interpreting neuronal population activity by reconstruction: unified framework with application to hippocampal place cells
J. Neurophysiol., 1998
Cited by 75 (6 self)
Abstract: ... such as the orientation of a line in the visual field or the location of the body in space are coded as activity levels in populations of neurons. Reconstruction or decoding is an inverse problem in which the physical variables are estimated from observed neural activity. Reconstruction is useful, first, in quantifying how much information about the physical variables is present in the population and, second, in providing insight into how the brain might use distributed representations in solving related computational problems such as visual object recognition and spatial navigation. Two classes of reconstruction methods, namely probabilistic or Bayesian methods and basis function methods, are discussed. They include important existing methods ... Two main goals for reconstruction are approached in this paper. The first goal is technical and is exemplified by the population vector method applied to motor cortical activities during various reaching tasks (Georgopoulos et al. 1986, 1989; Schwartz 1994), the template matching method applied to disparity-selective cells in the visual cortex (Lehky and Sejnowski 1990), and hippocampal place cells during rapid learning of place fields in a novel environment (Wilson and McNaughton 1993). In these examples, reconstruction extracts information from noisy neuronal population activity and transforms it to a ...
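The population vector method named in the abstract can be sketched in a few lines: the decoded direction is the rate-weighted vector sum of each cell's preferred direction. The cosine tuning curve, noise level, and cell count below are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def population_vector(preferred_dirs, rates):
    """Decode a 2-D direction as the rate-weighted vector sum of each
    cell's preferred direction (population vector method)."""
    vx = np.sum(rates * np.cos(preferred_dirs))
    vy = np.sum(rates * np.sin(preferred_dirs))
    return np.arctan2(vy, vx)

rng = np.random.default_rng(0)
prefs = np.linspace(0.0, 2.0 * np.pi, 64, endpoint=False)
true_dir = 1.0  # radians
# cosine-tuned firing rates plus noise (illustrative tuning model)
rates = np.clip(10.0 + 8.0 * np.cos(prefs - true_dir)
                + rng.normal(0.0, 1.0, 64), 0.0, None)
estimate = population_vector(prefs, rates)
print(estimate)
```

Even with noisy rates the estimate stays close to the encoded direction, which is the sense in which reconstruction "extracts information from noisy neuronal population activity."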
Collective Behavior of Networks with Linear (VLSI) Integrate-and-Fire Neurons
, 1999
Cited by 60 (19 self)
Abstract: The integrate-and-fire (IF) neuron has become popular as a simplified neural element in modeling the dynamics of large-scale networks of spiking neurons. A simple version of an IF neuron integrates the input current as an RC circuit (with a leakage current proportional to the depolarization) and emits a spike when the depolarization crosses a threshold. We will refer to it as the RC neuron. Networks of neurons schematized in this way exhibit a wide variety of characteristics observed in single- and multiple-neuron recordings in cortex in vivo. With biologically plausible time constants and synaptic efficacies, they can maintain spontaneous activity, and when the network is subjected to Hebbian learning (subsets of cells are repeatedly activated by the external stimuli), it shows many stable states of activation, each corresponding to a different attractor of the network dynamics, in coexistence with spontaneous activity (Amit & Brunel, 1997a). These s...
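The RC neuron described in the abstract has a one-line update rule. The sketch below uses forward-Euler integration with illustrative parameter values (membrane time constant, threshold, reset), not anything from the paper:

```python
def lif_spike_times(i_ext, tau_m=20.0, v_th=1.0, v_reset=0.0,
                    t_end=200.0, dt=0.1):
    """Forward-Euler RC ("leaky integrate-and-fire") neuron:
    tau_m * dV/dt = -V + I_ext; spike and reset when V crosses v_th."""
    v, t, spikes = 0.0, 0.0, []
    while t < t_end:
        v += dt * (-v + i_ext) / tau_m   # RC integration with leak
        if v >= v_th:
            spikes.append(t)             # emit a spike ...
            v = v_reset                  # ... and reset the depolarization
        t += dt
    return spikes

print(len(lif_spike_times(i_ext=1.5)))  # suprathreshold input: regular firing
print(len(lif_spike_times(i_ext=0.8)))  # subthreshold input: no spikes
```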
Simulation of networks of spiking neurons: A review of tools and strategies
Journal of Computational Neuroscience, 2007
Cited by 54 (23 self)
Abstract: We review different aspects of the simulation of spiking neural networks. We start by reviewing the different types of simulation strategies and algorithms that are currently implemented. We next review the precision of those simulation strategies, in particular in cases where plasticity depends on the exact timing of the spikes. We overview the different simulators and simulation environments presently available (restricted to those that are freely available, open source, and documented). For each simulation tool, its advantages and pitfalls are reviewed, with the aim of allowing the reader to identify which simulator is appropriate for a given task. Finally, we provide a series of benchmark simulations of different types of networks of spiking neurons, including Hodgkin–Huxley-type and integrate-and-fire models, interacting through current-based or conductance-based synapses, using clock-driven or event-driven integration strategies. The same set of models is implemented on the different simulators, and the codes are made available. The ultimate goal of this review is to provide a resource to facilitate identifying the appropriate integration strategy and simulation tool to use for a given ...
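The clock-driven vs. event-driven distinction the review covers can be illustrated on a single leaky integrate-and-fire neuron with constant input, for which the event-driven update has a closed form. All parameter values here are illustrative, and this is only a sketch of the two strategies, not any simulator's implementation:

```python
import math

def clock_driven(i_ext=1.5, tau=20.0, v_th=1.0, t_end=200.0, dt=0.1):
    """Clock-driven strategy: advance the membrane equation on a fixed grid."""
    v, t, spikes = 0.0, 0.0, []
    while t < t_end:
        v += dt * (-v + i_ext) / tau
        if v >= v_th:
            spikes.append(t)
            v = 0.0
        t += dt
    return spikes

def event_driven(i_ext=1.5, tau=20.0, v_th=1.0, t_end=200.0):
    """Event-driven strategy: for constant input the next threshold crossing
    has a closed form, so we jump directly from spike to spike."""
    if i_ext <= v_th:
        return []
    isi = tau * math.log(i_ext / (i_ext - v_th))  # exact inter-spike interval
    spikes, t = [], isi
    while t < t_end:
        spikes.append(t)
        t += isi
    return spikes

print(len(clock_driven()), len(event_driven()))
```

The grid-based integrator reproduces the exact event-driven spike count up to discretization error in the threshold-crossing times, which is the precision issue the review examines for timing-dependent plasticity.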
Lower Bounds for the Computational Power of Networks of Spiking Neurons
Neural Computation, 1995
Cited by 53 (11 self)
Abstract: We investigate the computational power of a formal model for networks of spiking neurons. It is shown that simple operations on phase differences between spike trains provide a very powerful computational tool that can in principle be used to carry out highly complex computations on a small network of spiking neurons. We construct networks of spiking neurons that simulate arbitrary threshold circuits, Turing machines, and a certain type of random access machine with real-valued inputs. We also show that relatively weak basic assumptions about the response and threshold functions of the spiking neurons are sufficient in order to employ them for such computations.
1 Introduction and Basic Definitions
There exists substantial evidence that timing phenomena such as temporal differences between spikes and frequencies of oscillating subsystems are integral parts of various information-processing mechanisms in biological neural systems (for a survey and references see, e.g., Kandel et al., ...)
Spike-Driven Synaptic Plasticity: Theory, Simulation, VLSI Implementation
Fusi, Annunziato, Badoni, Salamon, and Amit
Neural Computation 12, 2227–2258, 2000
Cited by 49 (13 self)
Abstract: ... The tests of the electronic device cover the range from spontaneous activity (3–4 Hz) to stimulus-driven rates (50 Hz). Low transition probabilities can be maintained in all ranges, even though the intrinsic time constants of the device are short (≈ 100 ms). Synaptic transitions are triggered by elevated presynaptic rates: for low presynaptic rates, there are essentially no transitions. The synaptic device can preserve its memory for years in the absence of stimulation. Stochasticity of learning is a result of the variability of interspike intervals; noise is a feature of the distributed dynamics of the network. The fact that the synapse is binary on long timescales solves the stability problem of synaptic efficacies in the absence of stimulation. Yet stochastic learning theory ...
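The rate-gated stochastic transitions described in the abstract can be caricatured as follows. The Poisson spike-count model, the per-spike transition probability, and the window length are illustrative assumptions, not the device's actual mechanism:

```python
import numpy as np

def transition_fraction(pre_rate, n_trials=20000, t_window=0.1,
                        p_jump=0.02, seed=0):
    """Binary synapse caricature: during a stimulation window each
    presynaptic spike triggers a candidate state transition with small
    probability p_jump. Variability of the spike count makes learning
    stochastic; returns the fraction of trials with a transition."""
    rng = np.random.default_rng(seed)
    counts = rng.poisson(pre_rate * t_window, size=n_trials)  # spikes/window
    p_switch = 1.0 - (1.0 - p_jump) ** counts
    return float((rng.random(n_trials) < p_switch).mean())

low = transition_fraction(pre_rate=4.0)    # spontaneous-range rate
high = transition_fraction(pre_rate=50.0)  # stimulus-driven rate
print(low, high)
```

At spontaneous rates transitions are essentially absent, while elevated presynaptic rates raise the transition probability by an order of magnitude, mirroring the rate-triggered behavior the abstract describes.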
Noise in Integrate-and-Fire Neurons: From Stochastic Input to Escape Rates
To appear in Neural Computation, 1999
Cited by 41 (6 self)
Abstract: We analyze the effect of noise in integrate-and-fire neurons driven by time-dependent input, and compare the diffusion approximation for the membrane potential to escape noise. It is shown that for time-dependent subthreshold input, diffusive noise can be replaced by escape noise with a hazard function that has a Gaussian dependence upon the distance between the (noise-free) membrane voltage and threshold. The approximation is improved if we add to the hazard function a probability current proportional to the derivative of the voltage. Stochastic resonance in response to periodic input occurs in both noise models and exhibits similar characteristics.
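A hazard function with the Gaussian dependence described in the abstract can be written down directly; the prefactor and width below are illustrative parameters, not the paper's fitted values:

```python
import math

def gaussian_escape_hazard(u, theta=1.0, sigma=0.2, rate0=1.0):
    """Escape rate with a Gaussian dependence on the distance between the
    noise-free membrane voltage u and the threshold theta."""
    return rate0 * math.exp(-((u - theta) ** 2) / (2.0 * sigma ** 2))

# escape rate grows sharply as the voltage approaches threshold
for u in (0.2, 0.6, 0.9, 1.0):
    print(u, gaussian_escape_hazard(u))
```

The escape rate is negligible far below threshold and peaks when the noise-free trajectory reaches it, which is what lets this one-dimensional hazard stand in for the full diffusion model under subthreshold drive.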
Extracting Oscillations: Neuronal Coincidence Detection with Noisy Periodic Spike Input
, 1998
Cited by 19 (6 self)
Abstract: How does a neuron vary its mean output firing rate if the input changes from random to oscillatory, coherent but noisy, activity? What are the critical parameters of the neuronal dynamics and input statistics? To answer these questions, we investigate the coincidence-detection properties of an integrate-and-fire neuron. We derive an expression indicating how coincidence detection depends on neuronal parameters. Specifically, we show how coincidence detection depends on the shape of the postsynaptic response function, the number of synapses, and the input statistics, and we demonstrate that there is an optimal threshold. Our considerations can be used to predict from neuronal parameters whether and to what extent a neuron can act as a coincidence detector and thus can convert a temporal code into a rate code.
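The coincidence-detection effect can be demonstrated by driving a leaky integrate-and-fire neuron with the same number of input spikes arranged either asynchronously or in near-synchronous volleys. All parameters below (synaptic weight, jitter, rates, time constant) are illustrative, not the paper's:

```python
import numpy as np

def output_spike_count(input_times, tau_m=5.0, w=0.03, v_th=1.0,
                       t_end=1000.0, dt=0.1):
    """Leaky integrate-and-fire neuron driven by delta-pulse synapses;
    returns the number of output spikes over [0, t_end] ms."""
    times = np.sort(np.asarray(input_times))
    v, t, i, count = 0.0, 0.0, 0, 0
    while t < t_end:
        v += dt * (-v / tau_m)                       # membrane leak
        while i < len(times) and times[i] <= t:
            v += w                                   # delta-pulse input
            i += 1
        if v >= v_th:
            count += 1
            v = 0.0
        t += dt
    return count

rng = np.random.default_rng(1)
n_syn, rate = 50, 20.0   # 50 synapses at 20 Hz each over 1 s
# asynchronous input: spikes spread uniformly over the second
async_input = rng.uniform(0.0, 1000.0, size=int(n_syn * rate))
# synchronous input: the same number of spikes packed into jittered volleys
volleys = np.arange(0.0, 1000.0, 1000.0 / rate)
sync_input = np.repeat(volleys, n_syn) + rng.normal(0.0, 0.5,
                                                    size=n_syn * len(volleys))
print(output_spike_count(async_input), output_spike_count(sync_input))
```

With identical mean input rate, only the synchronized volleys drive the neuron above threshold, so the output rate reports input coherence rather than input rate, converting a temporal code into a rate code.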
On the Complexity of Learning for a Spiking Neuron
, 1997
Wolfgang Maass and Michael Schmitt
Cited by 16 (7 self)
Abstract: Spiking neurons are models for the computational units in biological neural systems where information is considered to be encoded mainly in the temporal patterns of their activity. They provide a way of analyzing neural computation that is not captured by traditional neuron models such as sigmoidal and threshold gates (or "Perceptrons"). We introduce a simple model of a spiking neuron that, in addition to the weights that model the plasticity of synaptic strength, also has variable transmission delays between neurons as programmable parameters. For coding of input and output values, two modes are taken into account: binary coding for the Boolean domain and analog coding for the real-valued domain. We investigate the complexity of learning for a single spiking neuron within the framework of PAC-learnability. With regard to sample complexity, we prove that the VC-dimension is Θ(n log n) and, hence, strictly larger than that of a threshold ...
Spontaneous dynamics of asymmetric random recurrent spiking neural networks
Neural Computation, 2006
Cited by 16 (2 self)
Abstract: In this paper we study the effect of a unique initial stimulation on random recurrent networks of leaky integrate-and-fire neurons. Given a stochastic connectivity, this so-called spontaneous mode exhibits various nontrivial dynamics. This study is based on a mathematical formalism that allows us to examine the variability of the ensuing dynamics according to the parameters of the weight distribution. Under an independence hypothesis (e.g., in the case of very large networks), we are able to compute the average number of neurons that fire at a given time, i.e., the spiking activity. In accordance with numerical simulations, we prove that this spiking activity reaches a steady state. We characterize this steady state and explore the transients.
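The steady-state spiking activity described in the abstract can be caricatured with a discrete-time random recurrent network. The update rule, leak, bias, and threshold below are illustrative assumptions, not the authors' formalism:

```python
import numpy as np

def spiking_activity(n=200, steps=300, seed=0):
    """Discrete-time random recurrent network: each unit leaks, sums random
    recurrent weights from the units that fired on the previous step, fires
    above threshold, and resets. Returns the fraction firing per step."""
    rng = np.random.default_rng(seed)
    w = rng.normal(0.0, 1.0 / np.sqrt(n), size=(n, n))  # stochastic connectivity
    v = np.zeros(n)
    fired = rng.random(n) < 0.5            # unique initial stimulation
    activity = []
    for _ in range(steps):
        v = 0.9 * v + w @ fired.astype(float) + 0.1  # leak + recurrence + bias
        fired = v > 0.2
        v = np.where(fired, 0.0, v)        # reset the units that fired
        activity.append(fired.mean())
    return np.array(activity)

act = spiking_activity()
print(act[:5], float(act[-50:].mean()))
```

After the initial transient, the fraction of units firing per step fluctuates around a stable level rather than dying out or saturating, a toy version of the steady-state spiking activity the paper characterizes analytically.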