Results 1–10 of 67
On dynamics of integrate-and-fire neural networks with adaptive conductances
Frontiers in Neuroscience, 2008
Abstract

Cited by 21 (11 self)
We present a mathematical analysis of networks of Integrate-and-Fire neurons with conductance-based synapses. Taking into account the realistic fact that the spike time is only known within some finite precision, we propose a model where spikes are effective at times that are multiples of a characteristic time scale δ, where δ can be arbitrarily small (in particular, well beyond the numerical precision). We give a complete mathematical characterization of the model dynamics and obtain the following results. The asymptotic dynamics is composed of finitely many stable periodic orbits, whose number and period can be arbitrarily large and can diverge in a region of the synaptic weights space, traditionally called the “edge of chaos”, a notion mathematically well defined in the present paper. Furthermore, except at the edge of chaos, there is a one-to-one correspondence between the membrane potential trajectories and the raster plot. This shows that the neural code is entirely “in the spikes” in this case. As a key tool, we introduce an order parameter, easy to compute numerically and closely related to a natural notion of entropy, providing a relevant characterization of the computational capabilities of the network. This allows us to compare the computational capabilities of leaky Integrate-and-Fire models and conductance-based models. The present study considers networks with constant input and without time-dependent plasticity, but the framework has been designed for both extensions.
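A minimal sketch of the key idea above — spikes that are only effective at times that are multiples of a grid step δ — might look like the following. This is an illustrative caricature, not the paper's exact model; the network, the leak factor, and all parameter values are hypothetical.

```python
# Illustrative sketch (hypothetical parameters): a leaky integrate-and-fire
# network on a time grid of step delta, so spikes emitted during one grid
# step only become effective at the next multiple of delta.

def simulate_network(weights, inputs, delta=0.1, leak=0.9, theta=1.0, steps=200):
    """Return the raster plot: raster[t][i] == 1 iff neuron i spikes at t*delta."""
    n = len(weights)
    v = [0.0] * n            # membrane potentials
    spiked = [0] * n         # spikes from the previous grid step
    raster = []
    for _ in range(steps):
        # leak + constant input + synaptic input from the last grid step
        new_v = [leak * v[i] + delta * inputs[i]
                 + sum(weights[i][j] * spiked[j] for j in range(n))
                 for i in range(n)]
        spiked = [1 if u >= theta else 0 for u in new_v]
        # reset to zero after firing
        v = [0.0 if s else u for u, s in zip(new_v, spiked)]
        raster.append(spiked)
    return raster
```

With constant input and fixed weights, such a quantized network is a finite-state map, which is why its asymptotic dynamics decomposes into periodic orbits as the abstract states.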
Interoperability of Neuroscience Modeling Software: Current Status and Future Directions, 2007
Abstract

Cited by 13 (6 self)
Neuroscience increasingly uses computational models to assist in the exploration and interpretation of complex phenomena. As a result, considerable effort is in ...
A modified cable formalism for modeling neuronal membranes at high frequencies
Biophys. J., 2008
Abstract

Cited by 10 (8 self)
Intracellular recordings of cortical neurons in vivo display intense subthreshold membrane potential (Vm) activity. The power spectral density of the Vm displays a power-law structure at high frequencies (>50 Hz) with a slope of about −2.5. This type of frequency scaling cannot be accounted for by traditional models, as both single-compartment models and models based on reconstructed cell morphologies display frequency scaling with a slope close to −4. This slope is due to the fact that the membrane resistance is short-circuited by the capacitance at high frequencies, a situation which may not be realistic. Here, we integrate non-ideal capacitors into cable equations to reflect the fact that the capacitance cannot be charged instantaneously. We show that the resulting non-ideal cable model can be solved analytically using Fourier transforms. Numerical simulations using a ball-and-stick model yield membrane potential activity with frequency scaling similar to that seen in the experiments. We also discuss the consequences of using non-ideal capacitors for other cellular properties, such as the transmission of high frequencies, which is boosted in non-ideal cables, and voltage attenuation in dendrites. These results suggest that cable equations based on non-ideal capacitors should be used to capture the behavior of neuronal membranes at high frequencies.
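The short-circuiting mechanism described above can be illustrated with a toy single-compartment impedance calculation. This is a sketch under simple assumptions, not the paper's cable model: the non-ideal capacitor is represented here as an ideal capacitor with a small series resistance (one common way to model a capacitor that cannot be charged instantaneously), and all parameter values are hypothetical.

```python
import math

R = 100e6      # membrane resistance (ohm), hypothetical
C = 100e-12    # membrane capacitance (farad), hypothetical
r = 1e6        # hypothetical series resistance of the non-ideal capacitor

def z_ideal(f):
    # ideal case: R in parallel with a pure capacitance; at high f the
    # capacitor short-circuits R and |Z| falls off as 1/f
    zc = 1 / (2j * math.pi * f * C)
    return abs(R * zc / (R + zc))

def z_nonideal(f):
    # capacitive branch with series resistance r: |Z| levels off at high f
    zc = r + 1 / (2j * math.pi * f * C)
    return abs(R * zc / (R + zc))

def loglog_slope(z, f1=1e3, f2=1e4):
    """Log-log slope of |Z(f)| between two (high) frequencies."""
    return (math.log10(z(f2)) - math.log10(z(f1))) / math.log10(f2 / f1)
```

For white-noise current input, the Vm power spectrum scales as |Z(f)|², so the PSD slope is twice the impedance slope computed here; the flattening of the non-ideal curve is the qualitative effect the abstract describes.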
A Master Equation Formalism for Macroscopic Modeling of Asynchronous Irregular Activity States, 2009
Abstract

Cited by 7 (1 self)
Many efforts have been devoted to modeling asynchronous irregular (AI) activity states, which resemble the complex activity states seen in the cerebral cortex of awake animals. Most models have considered balanced networks of excitatory and inhibitory spiking neurons in which AI states are sustained through recurrent sparse connectivity, with or without external input. In this letter we propose a mesoscopic description of such AI states. Using a master equation formalism, we derive a second-order mean-field set of ordinary differential equations describing the temporal evolution of randomly connected balanced networks. This formalism takes finite-size effects into account and is applicable to any neuron model as long as its transfer function can be characterized. We compare the predictions of this approach with numerical simulations for different network configurations and parameter spaces. Considering the randomly connected network as a unit, this approach could be used to build large-scale networks of such connected units, with an aim to model activity states constrained by macroscopic measurements, such as voltage-sensitive dye imaging.
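To give a flavor of the mean-field level of description, the following is a deliberately simplified first-order caricature: coupled excitatory/inhibitory population rate equations τ·dν/dt = −ν + F(input) with a sigmoidal transfer function F. The letter's master-equation formalism additionally tracks second-order moments (covariances) to capture finite-size effects, which this sketch omits; all parameter values are hypothetical.

```python
import math

def F(x, nu_max=100.0):
    """Hypothetical sigmoidal transfer function, in spikes/s."""
    return nu_max / (1.0 + math.exp(-x))

def integrate_rates(w_ee=1.2, w_ei=2.0, w_ie=1.5, w_ii=0.5,
                    ext=1.0, tau=0.005, dt=0.0001, steps=2000):
    """Euler-integrate the first-order rate equations; return (nu_e, nu_i)."""
    nu_e = nu_i = 0.0
    for _ in range(steps):
        # tau * dnu/dt = -nu + F(net recurrent + external input)
        de = (-nu_e + F(w_ee * nu_e - w_ei * nu_i + ext)) / tau
        di = (-nu_i + F(w_ie * nu_e - w_ii * nu_i)) / tau
        nu_e += dt * de
        nu_i += dt * di
    return nu_e, nu_i
```

Because each Euler step is a convex combination of the current rate and the bounded transfer function, the rates stay within [0, nu_max], mirroring the saturation any physiological transfer function imposes.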
Introducing numerical bounds to improve event-based neural network simulation, 2009, http://hal.inria.fr/inria00382534/en/, RR6924, research report
Compass: A scalable simulator for an architecture for Cognitive Computing
Abstract

Cited by 4 (3 self)
Inspired by the function, power, and volume of the organic brain, we are developing TrueNorth, a novel modular, non-von Neumann, ultra-low-power, compact architecture. TrueNorth consists of a scalable network of neurosynaptic cores, with each core containing neurons, dendrites, synapses, and axons. To set sail for TrueNorth, we developed Compass, a multi-threaded, massively parallel functional simulator and a parallel compiler that maps a network of long-distance pathways in the macaque monkey brain to TrueNorth. We demonstrate near-perfect weak scaling on a 16-rack IBM® Blue Gene®/Q (262,144 CPUs, 256 TB memory), achieving an unprecedented scale of 256 million neurosynaptic cores containing 65 billion neurons and 16 trillion synapses, running only 388× slower than real time with an average spiking rate of 8.1 Hz. By using emerging PGAS communication primitives, we also demonstrate 2× better real-time performance over MPI primitives on a 4-rack Blue Gene/P (16,384 CPUs, 16 TB memory).
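The reported scale can be sanity-checked arithmetically, assuming the published TrueNorth core size of 256 neurons and a 256 × 256 synaptic crossbar per core:

```python
# Back-of-the-envelope check of the Compass scale numbers, assuming
# 256 neurons and a 256 x 256 crossbar (65,536 synapses) per core.
cores = 256 * 10**6
neurons = cores * 256            # 65,536,000,000  (~65 billion)
synapses = cores * 256 * 256     # 16,777,216,000,000  (~16 trillion)

# Simulation load per CPU on the 16-rack Blue Gene/Q
cores_per_cpu = cores / 262144   # ~977 simulated cores per CPU
```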
A view of Neural Networks as dynamical systems
 in "International Journal of Bifurcation and Chaos", 2009, http://lanl.arxiv.org/abs/0901.2203
Abstract

Cited by 2 (1 self)
We present some recent investigations resulting from the modelling of neural networks as dynamical systems, dealing with the following questions, addressed in the context of specific models: (i) characterizing the collective dynamics; (ii) statistical analysis of spike trains; (iii) the interplay between dynamics and network structure; (iv) the effects of synaptic plasticity.
Computing with Spiking Neuron Networks
Abstract

Cited by 2 (0 self)
Spiking Neuron Networks (SNNs) are often referred to as the 3rd generation of neural networks. Highly inspired by natural computing in the brain and recent advances in neuroscience, they derive their strength and interest from accurate modeling of synaptic interactions between neurons, taking into account the timing of spike firing. SNNs exceed the computational power of neural networks made of threshold or sigmoidal units. Based on dynamic event-driven processing, they open up new horizons for developing models with an exponential capacity for memorizing and a strong ability for fast adaptation. Today, the main challenge is to discover efficient learning rules that might take advantage of the specific features of SNNs while keeping the nice properties (general-purpose, easy-to-use, available simulators, etc.) of traditional connectionist models. This chapter relates the history of the “spiking neuron” in Section 1 and summarizes the most currently-in-use models of neurons and synaptic plasticity in Section 2. The computational power of SNNs is addressed in Section 3, and the problem of learning in networks of spiking neurons is tackled in Section 4, with insights into the tracks currently explored for solving it. Finally, Section 5 discusses application domains and implementation issues, and proposes several simulation frameworks.
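Among the synaptic plasticity models such surveys cover, the pair-based spike-timing-dependent plasticity (STDP) window is the most common; as a minimal sketch (parameter values hypothetical), the weight change as a function of the pre/post spike-time difference can be written as:

```python
import math

def stdp(dt, a_plus=0.01, a_minus=0.012, tau_plus=0.020, tau_minus=0.020):
    """Pair-based STDP window: weight change for dt = t_post - t_pre (seconds).

    Pre-before-post (dt > 0) potentiates; post-before-pre (dt < 0) depresses,
    each with an exponentially decaying dependence on |dt|.
    """
    if dt > 0:
        return a_plus * math.exp(-dt / tau_plus)      # potentiation
    elif dt < 0:
        return -a_minus * math.exp(dt / tau_minus)    # depression
    return 0.0
```

The exponential decay means only near-coincident spike pairs change the weight appreciably, which is precisely the sensitivity to spike timing that distinguishes SNN learning rules from rate-based ones.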
Are neuronal networks that vicious? Or only their models?
Neural Computation, 2007
Abstract

Cited by 2 (1 self)
We present a mathematical analysis of a network of integrate-and-fire neurons, taking into account the realistic fact that the spike time is only known within some finite precision. This leads us to propose a model where spikes are effective at times that are multiples of a characteristic time scale δ, where δ can be mathematically arbitrarily small. We give a complete mathematical characterization of the model dynamics for conductance-based integrate-and-fire models and obtain the following results. The asymptotic dynamics is composed of finitely many periodic orbits, whose number and period can be arbitrarily large and diverge in a region of the parameter space, traditionally called the “edge of chaos”, a notion mathematically well defined in the present paper. Furthermore, except at the edge of chaos, there is a one-to-one correspondence between the membrane potential trajectories and the spike raster plot. This shows that the neural code is entirely “in the spikes” in this case. As a key tool, we introduce an order parameter, easy to compute numerically and closely related to a natural notion of entropy, providing a relevant characterization of the computational capabilities of the network. This allows us to compare the computational capabilities of leaky integrate-and-fire models and conductance-based models. The present study considers networks with constant input and without time-dependent plasticity, but the framework has been designed for both extensions.