Results 1–10 of 61
Simulation of networks of spiking neurons: A review of tools and strategies
 Journal of Computational Neuroscience
, 2007
Abstract

Cited by 108 (29 self)
We review different aspects of the simulation of spiking neural networks. We start by reviewing the different types of simulation strategies and algorithms that are currently implemented. We next review the precision of those simulation strategies, in particular in cases where plasticity depends on the exact timing of the spikes. We overview different simulators and simulation environments presently available (restricted to those freely available, open source and documented). For each simulation tool, its advantages and pitfalls are reviewed, with an aim to allow the reader to identify which simulator is appropriate for a given task. Finally, we provide a series of benchmark simulations of different types of networks of spiking neurons, including Hodgkin-Huxley-type and integrate-and-fire models, interacting with current-based or conductance-based synapses, using clock-driven or event-driven integration strategies. The same set of models is implemented on the different simulators, and the codes are made available. The ultimate goal of this review is to provide a resource to facilitate identifying the appropriate integration strategy and simulation tool to use for a given task.
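The clock-driven versus event-driven distinction this review draws can be illustrated on a single leaky integrate-and-fire neuron. The sketch below is an illustrative toy, not code from the paper: it compares fixed-step Euler integration against the closed-form first-passage time that this simple model admits (in units where the resting potential is 0 and the membrane equation is tau dv/dt = -v + I).

```python
import math

def clock_driven_spike_time(i_ext, v_th=1.0, tau=20.0, dt=0.1):
    """Euler integration of tau dv/dt = -v + i_ext from v = 0;
    returns the first threshold crossing, binned to the time step."""
    v, t = 0.0, 0.0
    while v < v_th:
        v += dt / tau * (-v + i_ext)
        t += dt
    return t

def event_driven_spike_time(i_ext, v_th=1.0, tau=20.0):
    """Closed-form first-passage time of the same LIF neuron (exact, no grid):
    v(t) = i_ext (1 - exp(-t/tau)) crosses v_th at t = tau ln(i_ext/(i_ext - v_th))."""
    assert i_ext > v_th, "suprathreshold drive required"
    return tau * math.log(i_ext / (i_ext - v_th))

t_clock = clock_driven_spike_time(2.0)
t_exact = event_driven_spike_time(2.0)
print(t_clock, t_exact)
```

The gap between the two times is the time-step binning error that the review's precision analysis is concerned with; it shrinks with dt in the clock-driven case and is absent in the event-driven one.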
Dynamically reconfigurable silicon array of spiking neurons with conductance-based synapses
 IEEE Transactions on Neural Networks
, 2007
Abstract

Cited by 33 (11 self)
Abstract—A mixed-signal very large scale integration (VLSI) chip for large-scale emulation of spiking neural networks is presented. The chip contains 2400 silicon neurons with fully programmable and reconfigurable synaptic connectivity. Each neuron implements a discrete-time model of a single-compartment cell. The model allows for analog membrane dynamics and an arbitrary number of synaptic connections, each with tunable conductance and reversal potential. The array of silicon neurons functions as an address–event (AE) transceiver, with incoming and outgoing spikes communicated over an asynchronous event-driven digital bus. Address encoding and conflict resolution of spiking events are implemented via a randomized arbitration scheme that ensures balanced servicing of event requests across the array. Routing of events is implemented externally using dynamically programmable random-access memory that stores a postsynaptic address, the conductance, and the reversal potential of each synaptic connection. Here, we describe the silicon neuron circuits, present experimental data characterizing the 3 mm × 3 mm chip fabricated in 0.5-μm complementary metal–oxide–semiconductor (CMOS) technology, and demonstrate its utility by configuring the hardware to emulate a model of attractor dynamics and waves of neural activity during sleep in rat hippocampus. Index Terms—Address–event representation (AER), dynamically reconfigurable network, membrane conductance, mixed-signal very large scale integration (VLSI), neural emulator, neurotransmitter quantal release, switched capacitor.
Exact simulation of integrate-and-fire models with synaptic conductances
 Neural Comp
, 2006
Abstract

Cited by 28 (1 self)
Computational neuroscience relies heavily on the simulation of large networks of neuron models. There are essentially two simulation strategies: 1) using an approximation method (e.g. Runge-Kutta) with spike times binned to the time step; 2) calculating spike times exactly in an event-driven fashion. In large networks, the computation time of the best algorithm for either strategy scales linearly with the number of synapses, but each strategy has its own assets and constraints: approximation methods can be applied to any model but are inexact; exact simulation avoids numerical artefacts but is limited to simple models. Previous work has focused on improving the accuracy of approximation methods. In this paper we extend the range of models that can be simulated exactly to a more realistic model, namely an integrate-and-fire model with exponential synaptic conductances.
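Exact event-driven simulation hinges on a closed-form state update between synaptic events. The conductance-based case this paper solves requires special functions; the sketch below instead uses the simpler current-based exponential synapse (an illustrative stand-in, not the paper's model), where the inter-event update is already elementary.

```python
import math

def advance(v0, i0, dt, tau_m=20.0, tau_s=5.0):
    """Exact update of the linear system
        tau_m dv/dt = -v + i(t),   tau_s di/dt = -i
    over an inter-event interval dt (current-based exponential synapse).
    Derivation: the particular solution is a * exp(-t/tau_s) with
    a = i0 * tau_s / (tau_s - tau_m)."""
    a = i0 * tau_s / (tau_s - tau_m)
    v = (v0 - a) * math.exp(-dt / tau_m) + a * math.exp(-dt / tau_s)
    i = i0 * math.exp(-dt / tau_s)
    return v, i

v1, i1 = advance(0.0, 1.0, 2.0)       # one 2 ms step
vh, ih = advance(0.0, 1.0, 1.0)       # two 1 ms steps ...
v2, i2 = advance(vh, ih, 1.0)
print(v1, v2)
```

Because the update is exact, advancing by dt in one step or in two half-steps gives the same state (up to rounding), which is exactly why event-driven simulation is free of time-discretization error.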
The Cat is Out of the Bag: Cortical Simulations with 10⁹ Neurons, 10¹³ Synapses
Abstract

Cited by 28 (4 self)
In the quest for cognitive computing, we have built a massively parallel cortical simulator, C2, that incorporates a number of innovations in computation, memory, and communication. Using C2 on LLNL's Dawn Blue Gene/P supercomputer with 147,456 CPUs and 144 TB of main memory, we report two cortical simulations – at unprecedented scale – that effectively saturate the entire memory capacity and refresh it at least every simulated second. The first simulation consists of 1.6 billion neurons and 8.87 trillion synapses with experimentally measured gray-matter thalamocortical connectivity. The second simulation has 900 million neurons and 9 trillion synapses with probabilistic connectivity. We demonstrate nearly perfect weak scaling and attractive strong scaling. The simulations, which incorporate phenomenological spiking neurons, individual learning synapses, axonal delays, and dynamic synaptic channels, exceed the scale of the cat cortex, marking the dawn of a new era in the scale of cortical simulations.
Correlations and population dynamics in cortical networks
 Neural Comput
, 2008
Abstract

Cited by 26 (13 self)
The function of cortical networks depends on the collective interplay between neurons and neuronal populations, which is reflected in the correlation of signals that can be recorded at different levels. To correctly interpret these observations it is important to understand the origin of neuronal correlations. Here we study how cells in large recurrent networks of excitatory and inhibitory neurons interact and how the associated correlations affect stationary states of idle network activity. We demonstrate that the structure of the connectivity matrix of such networks induces considerable correlations between synaptic currents as well as between subthreshold membrane potentials, provided Dale's principle is respected. If, in contrast, synaptic weights are randomly distributed, input correlations can vanish, even for densely connected networks. Although correlations are strongly attenuated when proceeding from membrane potentials to action potentials (spikes), the resulting weak correlations in the spike output can cause substantial fluctuations in the population activity, even in highly diluted networks. We show that simple mean-field models that take the structure of the coupling matrix into account can adequately describe the power spectra of the population activity. The consequences of Dale's principle on correlations and rate fluctuations are discussed in the light of recent experimental findings.
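The Dale's-principle effect described in this abstract can be reproduced in a few lines: when every presynaptic cell has a fixed sign, two neurons' shared inputs add covariance of one sign, whereas per-synapse random signs make the mean input covariance vanish. The sketch below is a toy numpy illustration with assumed parameters (500 cells, 20% connectivity, 80/20 excitation/inhibition balanced in the mean), not the authors' network model.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 500, 0.2                       # neurons, connection probability (toy values)

def mean_input_corr(dale):
    """Mean pairwise correlation of summed synaptic input when presynaptic
    signs are fixed per cell (Dale) vs. drawn independently per synapse."""
    mask = rng.random((n, n)) < p
    if dale:
        # each presynaptic cell is excitatory (+1) or inhibitory (-4) for all targets;
        # 0.8 * 1 + 0.2 * (-4) = 0, so the mean weight is balanced in both conditions
        sign = np.where(rng.random(n) < 0.8, 1.0, -4.0)
        w = mask * sign[None, :]
    else:
        w = mask * np.where(rng.random((n, n)) < 0.8, 1.0, -4.0)
    x = rng.standard_normal((n, 5000))     # shared presynaptic fluctuations
    c = np.corrcoef(w @ x)                 # correlations of the summed inputs
    return c[np.triu_indices(n, 1)].mean()

corr_dale = mean_input_corr(dale=True)
corr_mixed = mean_input_corr(dale=False)
print(corr_dale, corr_mixed)
```

With Dale's principle, every shared presynaptic cell contributes sign² > 0 to the input covariance of a neuron pair, so the mean correlation is robustly positive; with independent random signs the contributions cancel on average, matching the abstract's claim.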
Anatomy of a Cortical Simulator
Abstract

Cited by 13 (1 self)
Insights into the brain's high-level computational principles will lead to novel cognitive systems, computing architectures, programming paradigms, and numerous practical applications. An important step towards this end is the study of large networks of cortical spiking neurons. We have built a cortical simulator, C2, incorporating several algorithmic enhancements to optimize the simulation scale and time, through: computationally efficient simulation of neurons in a clock-driven and synapses in an event-driven fashion; memory-efficient representation of simulation state; and communication-efficient message exchanges. Using phenomenological, single-compartment models of spiking neurons and synapses with spike-timing-dependent plasticity, we represented a rat-scale cortical model (55 million neurons, 442 billion synapses) in 8 TB memory of a 32,768-processor BlueGene/L. With 1 millisecond resolution for neuronal dynamics and 1–20 milliseconds axonal delays, C2 can simulate 1 second of model time in 9 seconds per Hertz of average neuronal firing rate. In summary, by combining state-of-the-art hardware with innovative algorithms and software design, we simultaneously achieved unprecedented time-to-solution on an unprecedented problem size.
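The hybrid scheme described here (clock-driven neuron updates, event-driven synaptic delivery through axonal delays) can be sketched minimally as a priority queue of in-flight spikes drained on each clock tick. Class and parameter names below are hypothetical illustrations, not C2's actual interfaces.

```python
import heapq

class HybridNet:
    """Toy hybrid simulator: membranes advance on a fixed clock (Euler),
    while synaptic events wait in a delay-sorted queue until delivery.
    Names and parameters are illustrative, not C2's."""

    def __init__(self, n, tau=20.0, v_th=1.0, dt=0.1, delay=1.5):
        self.v = [0.0] * n
        self.events = []                  # heap of (delivery_time, target, weight)
        self.t = 0.0
        self.tau, self.v_th, self.dt, self.delay = tau, v_th, dt, delay

    def send(self, target, weight):
        """Schedule a spike for delivery one axonal delay from now."""
        heapq.heappush(self.events, (self.t + self.delay, target, weight))

    def step(self, i_ext):
        """Advance one clock tick; returns the indices of neurons that fired."""
        self.t += self.dt
        # event-driven part: apply every synaptic event that is due by now
        while self.events and self.events[0][0] <= self.t:
            _, tgt, w = heapq.heappop(self.events)
            self.v[tgt] += w
        # clock-driven part: Euler update of every membrane, then threshold check
        spikes = []
        for k in range(len(self.v)):
            self.v[k] += self.dt / self.tau * (-self.v[k] + i_ext[k])
            if self.v[k] >= self.v_th:
                self.v[k] = 0.0
                spikes.append(k)
        return spikes

net = HybridNet(2)
net.send(0, 2.0)                          # suprathreshold spike in flight to neuron 0
fired = [net.step([0.0, 0.0]) for _ in range(20)]
print(fired)
```

The point of the split is cost: membranes must be touched every tick anyway, but a synapse only does work when a spike actually traverses it, which is what makes event-driven synapses cheap at low firing rates.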
Introducing numerical bounds to improve event-based neural network simulation, 2009, http://hal.inria.fr/inria00382534/en/, RR-6924, Rapport de recherche
To which extent is the "neural code" a metric
 In Neurocomp
, 2008
Abstract

Cited by 6 (6 self)
Here we propose a review of the different choices to structure spike trains using deterministic metrics. Temporal constraints observed in biological or computational spike trains are first taken into account. The relation with existing neural codes (rate coding, rank coding, phase coding, …) is then discussed. To which extent the "neural code" contained in spike trains is related to a metric appears to be a key point; a generalization of the Victor-Purpura metric family is proposed for temporally constrained causal spike trains.
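The Victor-Purpura family mentioned above has a simple edit-distance formulation: deleting or inserting a spike costs 1, and shifting a spike by Δt costs q·Δt, so the parameter q sets the temporal precision at which two trains are compared. Below is a minimal dynamic-programming sketch of the standard metric (not the paper's generalization):

```python
def victor_purpura(s1, s2, q):
    """Victor-Purpura distance between two sorted spike trains: the minimal
    total cost of transforming s1 into s2, where inserting or deleting a
    spike costs 1 and moving a spike by dt costs q * dt."""
    n, m = len(s1), len(s2)
    d = [[0.0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        d[i][0] = float(i)                 # delete all remaining spikes of s1
    for j in range(1, m + 1):
        d[0][j] = float(j)                 # insert all remaining spikes of s2
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d[i][j] = min(d[i - 1][j] + 1,                               # delete
                          d[i][j - 1] + 1,                               # insert
                          d[i - 1][j - 1] + q * abs(s1[i - 1] - s2[j - 1]))  # shift
    return d[n][m]

print(victor_purpura([0.0], [0.5], 1.0))
```

At q = 0 shifts are free and the distance reduces to the difference in spike counts (a pure rate comparison); as q grows, matching spikes must coincide ever more precisely, interpolating toward a coincidence-based code.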
Optimal control of transient dynamics in balanced networks supports generation of complex movements
Efficient simulation of large-scale spiking neural networks using CUDA graphics processors
 in Proceedings of the 2009 international joint conference on Neural Networks, ser. IJCNN’09. Piscataway
, 2009
Abstract

Cited by 6 (0 self)
Abstract—Neural network simulators that take into account the spiking behavior of neurons are useful for studying brain mechanisms and for engineering applications. Spiking Neural Networks (SNNs) have traditionally been simulated on large-scale clusters, supercomputers, or on dedicated hardware architectures. Alternatively, Graphics Processing Units (GPUs) can provide a low-cost, programmable, and high-performance computing platform for simulation of SNNs. In this paper we demonstrate an efficient, Izhikevich-neuron-based large-scale SNN simulator that runs on a single GPU. The GPU-SNN model (running on an NVIDIA GTX 280 with 1 GB of memory) is up to 26 times faster than a CPU version for the simulation of 100K neurons with 50 million synaptic connections, firing at an average rate of 7 Hz. For simulation of 100K neurons with 10 million synaptic connections, the GPU-SNN model is only 1.5 times slower than real-time. Further, we present a collection of new techniques related to parallelism extraction, mapping of irregular communication, and compact network representation for effective simulation of SNNs on GPUs. The fidelity of the simulation results was validated against CPU simulations using firing rate, synaptic weight distribution, and inter-spike interval analysis. We intend to make our simulator available to the modeling community so that researchers will have easy access to large-scale SNN simulations.
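Part of what makes the Izhikevich model GPU-friendly is its compact, per-neuron-independent update rule: two state variables and a threshold reset, trivially vectorized. A sketch of that update (illustrative regular-spiking parameters; not the authors' CUDA kernel), here written with numpy arrays standing in for GPU threads:

```python
import numpy as np

def izhikevich_step(v, u, i_ext, a=0.02, b=0.2, c=-65.0, d=8.0, dt=0.25):
    """One Euler step of the Izhikevich model:
        dv/dt = 0.04 v^2 + 5 v + 140 - u + I,   du/dt = a (b v - u),
    with reset v <- c, u <- u + d after a spike (v >= 30 mV).
    Parameters a, b, c, d here are the regular-spiking defaults."""
    fired = v >= 30.0
    v = np.where(fired, c, v)              # reset membranes that spiked last step
    u = np.where(fired, u + d, u)
    v = v + dt * (0.04 * v * v + 5.0 * v + 140.0 - u + i_ext)
    u = u + dt * a * (b * v - u)
    return v, u, fired

# drive three identical neurons with a constant input I = 10 (model units)
v = np.full(3, -65.0)
u = 0.2 * v
spike_count = np.zeros(3)
for _ in range(4000):                      # 1 s of model time at dt = 0.25 ms
    v, u, fired = izhikevich_step(v, u, 10.0)
    spike_count += fired
print(spike_count)
```

Because every neuron's update reads only its own (v, u) and its input current, the loop body maps directly onto one GPU thread per neuron; the irregular part, which the paper's techniques address, is delivering spikes across the synaptic connectivity.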