Results 1–10 of 18
Schwalger et al.: How noisy adaptation of neurons shapes interspike interval histograms and correlations. PLoS Comput Biol 2010, 6(12):e1001026
Abstract

Cited by 13 (5 self)
Channel noise is the dominant intrinsic noise source of neurons, causing variability in the timing of action potentials and interspike intervals (ISI). Slow adaptation currents are observed in many cells and strongly shape response properties of neurons. These currents are mediated by finite populations of ionic channels and may thus carry a substantial noise component. Here we study the effect of such adaptation noise on the ISI statistics of an integrate-and-fire model neuron by means of analytical techniques and extensive numerical simulations. We contrast this stochastic adaptation with the commonly studied case of a fast fluctuating current noise and a deterministic adaptation current (corresponding to an infinite population of adaptation channels). We derive analytical approximations for the ISI density and ISI serial correlation coefficient for both cases. For fast fluctuations and deterministic adaptation, the ISI density is well approximated by an inverse Gaussian (IG) and the ISI correlations are negative. In marked contrast, for stochastic adaptation, the density is more peaked and has a heavier tail than an IG density, and the serial correlations are positive. A numerical study of the mixed case, where both fast fluctuations and adaptation channel noise are present, reveals a smooth transition between the analytically tractable limiting cases. Our conclusions are furthermore supported by numerical simulations of a biophysically more realistic Hodgkin-Huxley type model. Our results could be used to infer the dominant source of noise in neurons from their ISI statistics.
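The contrast this abstract draws (deterministic adaptation with fast noise vs. stochastic adaptation) can be explored with a minimal simulation. The sketch below implements only the deterministic-adaptation case; all parameters (`mu`, `D`, `delta`, `tau_a`) are illustrative assumptions, not values from the paper:

```python
import numpy as np

def simulate_adapting_lif(n_isi=200, dt=1e-3, mu=1.5, D=0.05,
                          delta=0.5, tau_a=2.0, seed=0):
    """Euler-Maruyama simulation of a leaky integrate-and-fire neuron
    (membrane time constant = 1) with fast white current noise and a
    *deterministic* spike-triggered adaptation variable a:
        dv/dt = mu - v - a + sqrt(2 D) xi(t),   da/dt = -a / tau_a,
    with a -> a + delta at each spike.  Returns the interspike intervals."""
    rng = np.random.default_rng(seed)
    sq = np.sqrt(2.0 * D * dt)
    v = a = 0.0
    t = t_last = 0.0
    isis = []
    while len(isis) < n_isi:
        v += (mu - v - a) * dt + sq * rng.standard_normal()
        a -= a / tau_a * dt
        t += dt
        if v >= 1.0:              # threshold crossing: emit a spike
            isis.append(t - t_last)
            t_last, v = t, 0.0    # reset the membrane potential
            a += delta            # increment the adaptation variable
    return np.asarray(isis)

isis = simulate_adapting_lif()
cv = isis.std() / isis.mean()                   # ISI variability
d = isis - isis.mean()
rho1 = np.mean(d[:-1] * d[1:]) / isis.var()     # lag-1 serial correlation
```

Per the abstract, this configuration is the one expected to produce roughly inverse-Gaussian ISI densities and negative serial correlations; replacing `a` with a noisy finite-channel-population variable would flip the correlations to positive. The sketch only sets up the statistics one would compare.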
Multiplicatively interacting point processes and applications to neural modeling
 arXiv
, 2010
Abstract

Cited by 6 (1 self)
Abstract. We introduce a nonlinear modification of the classical Hawkes process, which allows inhibitory couplings between units without restrictions. The resulting system of interacting point processes provides a useful mathematical model for recurrent networks of spiking neurons with exponential transfer functions. The expected rates of all neurons in the network are approximated by a first-order differential system. We study the stability of the solutions of this system, and use the new formalism to implement a winner-takes-all network that operates robustly over a wide range of parameters. Finally, we discuss relations with the generalised linear model that is widely used for the analysis of spike trains.
Spike train statistics and dynamics with synaptic input from any renewal process: A population density approach
 Neural Comput
, 2009
Abstract

Cited by 3 (0 self)
In the probability density function (PDF) approach to neural network modeling, a common simplifying assumption is that the arrival times of elementary postsynaptic events are governed by a Poisson process. This assumption ignores temporal correlations in the input that sometimes have important physiological consequences. We extend PDF methods to models with synaptic event times governed by any modulated renewal process. We focus on the integrate-and-fire neuron with instantaneous synaptic kinetics and a random elementary excitatory postsynaptic potential (EPSP), A. Between presynaptic events, the membrane voltage, v, decays exponentially toward rest, while s, the time since the last synaptic input event, evolves with unit velocity. When a synaptic event arrives, v jumps by A, and s is reset to zero. If v crosses the threshold voltage, an action potential occurs, and v is reset to v_reset. The probability per unit time of a synaptic event at time t, given the elapsed time s since the last event, h(s, t), depends on specifics of the renewal process.
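The single-neuron dynamics described here (exponential decay between events, jump by A, threshold reset) lend themselves to an exact event-driven simulation. The sketch below uses a gamma renewal process for the input, which is an arbitrary illustrative choice of renewal density, with made-up parameters:

```python
import numpy as np

def sim_renewal_input_if(n_events=20000, shape=4.0, rate=40.0,
                         tau_m=0.1, A=0.25, v_th=1.0, v_reset=0.0, seed=2):
    """Event-driven simulation of the model described above: between
    synaptic events v decays exponentially toward rest (0) with time
    constant tau_m; each event makes v jump by A; crossing v_th emits a
    spike and resets v to v_reset.  Input event times form a gamma
    renewal process with the given shape and mean rate."""
    rng = np.random.default_rng(seed)
    # gamma interval density with mean 1/rate (CV = 1/sqrt(shape))
    intervals = rng.gamma(shape, 1.0 / (shape * rate), size=n_events)
    v, t, spike_times = 0.0, 0.0, []
    for ds in intervals:
        t += ds
        v *= np.exp(-ds / tau_m)   # exponential decay between events
        v += A                     # EPSP jump at the event
        if v >= v_th:
            spike_times.append(t)
            v = v_reset
    return np.array(spike_times)

spikes = sim_renewal_input_if()
```

Because nothing happens between input events, stepping event-to-event is exact; a time-stepped integrator is unnecessary for this model.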
Dynamics of sensory processing in the dual olfactory pathway of the honeybee
 Apidologie
, 2012
Abstract

Cited by 3 (1 self)
Abstract – Insects identify and evaluate behaviorally relevant odorants in complex natural scenes where odor concentrations and mixture composition can change rapidly. This requires fast and reliable information processing in the olfactory system. Here, we review recent experimental findings and theoretical hypotheses on olfactory processing in the honeybee with a focus on its temporal dynamics. Specifically we address odor response characteristics of antennal lobe interneurons and projection neurons, local processing of elemental odors and odor blends, the functional role of the dual olfactory pathway in the honeybee, population coding in uniglomerular projection neurons, and a novel model for sparse and reliable coding in projection neurons and mushroom body Kenyon cells. It is concluded that the olfactory system of the honeybee implements a fast and reliable coding scheme optimized for processing dynamic input within the behaviorally relevant temporal range.
Applying the Multivariate Time-Rescaling Theorem to Neural Population Models
, 2011
Abstract

Cited by 2 (1 self)
Statistical models of neural activity are integral to modern neuroscience. Recently, interest has grown in modeling the spiking activity of populations of simultaneously recorded neurons to study the effects of correlations and functional connectivity on neural information processing. However, any statistical model must be validated by an appropriate goodness-of-fit test. Kolmogorov-Smirnov tests based on the time-rescaling theorem have proven to be useful for evaluating point-process-based statistical models of single-neuron spike trains. Here we discuss the extension of the time-rescaling theorem to the multivariate (neural population) case. We show that even in the presence of strong correlations between spike trains, models that neglect couplings between neurons can be erroneously passed by the univariate time-rescaling test. We present the multivariate version of the time-rescaling theorem and provide a practical step-by-step procedure for applying it to testing the sufficiency of neural population models. Using several simple analytically tractable models and more complex simulated and real data sets, we demonstrate that important features of the population activity can be detected only using the multivariate extension of the test.
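The univariate building block of the test can be sketched in a few lines: rescale interspike intervals by the model's cumulative intensity and compare the transformed intervals against the uniform distribution. The function name and the Poisson sanity check below are illustrative, not taken from the paper:

```python
import numpy as np

def ks_time_rescaling(spike_times, cum_intensity):
    """Univariate time-rescaling goodness-of-fit check.  cum_intensity(t)
    must return Lambda(t), the integral of the model's rate up to time t.
    If the model is correct, the rescaled intervals z are unit-rate
    exponential, so u = 1 - exp(-z) is Uniform(0,1); return the maximum
    deviation of the sorted u from a uniform quantile grid (a KS-type
    distance)."""
    L = np.asarray([cum_intensity(t) for t in spike_times])
    z = np.diff(np.concatenate(([0.0], L)))   # rescaled intervals
    u = np.sort(1.0 - np.exp(-z))
    n = u.size
    grid = (np.arange(n) + 0.5) / n           # mid-point uniform quantiles
    return np.max(np.abs(u - grid))

# sanity check: a homogeneous Poisson train tested against its true rate
rng = np.random.default_rng(3)
rate = 5.0
spikes = np.cumsum(rng.exponential(1.0 / rate, size=2000))
d = ks_time_rescaling(spikes, lambda t: rate * t)
# d is typically well below the 5% KS bound ~1.36/sqrt(n) for a correct model
```

The multivariate extension the paper advocates applies the same rescaling to each neuron's conditional intensity given the whole population history; the point of the paper is that the per-neuron test above can pass models that ignore couplings.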
Kv7 channels regulate pairwise spiking covariability in health and disease
, 2014
Abstract
Low-threshold M currents are mediated by the Kv7 family of potassium channels. Kv7 channels are important regulators of spiking activity, having a direct influence on the firing rate, spike time variability, and filter properties of neurons. How Kv7 channels affect the joint spiking activity of populations of neurons is an important and open area of study. Using a combination of computational simulations and analytic calculations, we show that the activation of Kv7 conductances reduces the covariability between spike trains of pairs of neurons driven by common inputs. This reduction is beyond that explained by the lowering of firing rates and involves an active cancellation of common fluctuations in the membrane potentials of the cell pair. Our theory shows that the excess covariance reduction is due to a Kv7-induced shift from low-pass to band-pass filtering of the single neuron spike train response. Dysfunction of Kv7 conductances is related to a number of neurological diseases characterized by both
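The quantity at stake, spike-count covariability of a common-input pair with and without a slow after-spike current, can be estimated with a toy simulation. The adaptation variable below is a generic stand-in for an M-type current, not a biophysical Kv7 model, and every parameter is an assumption:

```python
import numpy as np

def pair_count_corr(c=0.5, delta_a=0.0, tau_a=0.1, t_end=50.0,
                    dt=5e-4, mu=1.3, D=0.02, win=0.1, seed=4):
    """Two leaky integrate-and-fire neurons (time constant 1) share a
    fraction c of their input noise variance; a spike-triggered
    adaptation variable with increment delta_a serves as a crude proxy
    for a slow M-type (Kv7-like) current.  Returns the Pearson
    correlation of the two neurons' spike counts in windows of `win`."""
    rng = np.random.default_rng(seed)
    steps = int(t_end / dt)
    counts = np.zeros((int(t_end / win), 2))
    sq = np.sqrt(2.0 * D * dt)
    v = np.zeros(2)
    a = np.zeros(2)
    for i in range(steps):
        common = rng.standard_normal()          # shared fluctuation
        private = rng.standard_normal(2)        # independent fluctuations
        noise = sq * (np.sqrt(c) * common + np.sqrt(1.0 - c) * private)
        v += (mu - v - a) * dt + noise
        a -= a / tau_a * dt
        fired = v >= 1.0
        v[fired] = 0.0
        a[fired] += delta_a
        w = min(int(i * dt / win), counts.shape[0] - 1)
        counts[w] += fired
    return np.corrcoef(counts[:, 0], counts[:, 1])[0, 1]

rho_no_adapt = pair_count_corr(delta_a=0.0)
rho_adapt = pair_count_corr(delta_a=0.5)
```

Comparing `rho_adapt` against `rho_no_adapt` at matched firing rates is the kind of experiment the abstract describes; this sketch makes no claim about the size of the effect for these arbitrary parameters.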
The state of MIIND
Abstract
MIIND (Multiple Interacting Instantiations of Neural Dynamics) is a highly modular multilevel C++ framework that aims to shorten the development time for models in Cognitive Neuroscience (CNS). It offers reusable code modules (libraries of classes and functions) aimed at solving problems that occur repeatedly in modelling, but tries not to impose a specific modelling philosophy or methodology. At the lowest level, it offers support for the implementation of sparse networks. For example, the library SparseImplementationLib supports sparse random networks and the library LayerMappingLib can be used for sparse regular networks of filter-like operators. The library DynamicLib, which builds on top of SparseImplementationLib, offers a generic framework for simulating network processes. Presently, several specific network process implementations are provided in MIIND: Wilson-Cowan and Ornstein-Uhlenbeck type, and population density techniques for leaky integrate-and-fire neurons driven by Poisson input. A design principle of MIIND is to support detailing: the refinement of an originally simple model into a form
Self-sustained asynchronous irregular states and Up–Down states in thalamic, cortical and thalamocortical networks of nonlinear integrate-and-fire neurons
 Journal of Computational Neuroscience
Abstract
Self-sustained asynchronous irregular states and Up–Down states in thalamic, cortical and thalamocortical networks of nonlinear integrate-and-fire neurons
Synaptic “noise”: Experiments, computational consequences and methods to analyze experimental data
, 2008
Abstract
In the cerebral cortex of awake animals, neurons are subject to a tremendous fluctuating activity mostly of synaptic origin and termed “synaptic noise”. Synaptic noise is the dominant source of membrane potential fluctuations in neurons and can have a strong influence on their integrative properties. We review here the experimental measurements of synaptic noise, and its modeling by conductance-based stochastic processes. We next review the consequences of synaptic noise on neuronal integrative properties, as predicted by computational models and investigated experimentally using dynamic-clamp. We also review analysis methods, such as spike-triggered average or conductance analysis, which are derived from the modeling of synaptic noise by stochastic processes. These different approaches aim at understanding the integrative properties of neocortical neurons in the intact brain.
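Modeling synaptic noise by conductance-based stochastic processes typically means driving a passive membrane with Ornstein-Uhlenbeck excitatory and inhibitory conductances (the "point-conductance" picture). A minimal sketch, with illustrative parameter values (uS, ms, mV, nF) that are assumptions of this example, not fitted values:

```python
import numpy as np

def ou_conductance(n, dt, g0, sigma, tau, rng):
    """Exact discrete update of an Ornstein-Uhlenbeck process, used as a
    stochastic synaptic conductance; negative excursions are clipped to
    zero afterwards (a common, slightly lossy approximation)."""
    g = np.empty(n)
    g[0] = g0
    rho = np.exp(-dt / tau)
    amp = sigma * np.sqrt(1.0 - rho * rho)
    for i in range(1, n):
        g[i] = g0 + rho * (g[i - 1] - g0) + amp * rng.standard_normal()
    return np.clip(g, 0.0, None)

rng = np.random.default_rng(5)
dt, n = 0.05, 20000                       # 0.05 ms steps, 1 s total
ge = ou_conductance(n, dt, 0.012, 0.003, 2.7, rng)    # excitatory
gi = ou_conductance(n, dt, 0.057, 0.0066, 10.5, rng)  # inhibitory
# passive membrane driven by the two fluctuating conductances
C, gL, EL, Ee, Ei = 0.345, 0.045, -80.0, 0.0, -75.0
v = np.empty(n)
v[0] = -65.0
for i in range(1, n):
    I = gL*(EL - v[i-1]) + ge[i-1]*(Ee - v[i-1]) + gi[i-1]*(Ei - v[i-1])
    v[i] = v[i-1] + dt * I / C            # forward-Euler membrane update
```

The membrane settles near the conductance-weighted mean reversal potential and fluctuates around it; spike-triggered-average and conductance-analysis methods of the kind reviewed here operate on exactly such `v`, `ge`, `gi` traces.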