Results 1–10 of 109
Generalized Integrate-and-Fire Models of Neuronal Activity Approximate Spike Trains of a . . .
Cited by 84 (16 self)
Abstract:
We demonstrate that single-variable integrate-and-fire models can quantitatively capture the dynamics of a physiologically detailed model for fast-spiking cortical neurons. Through a systematic set of approximations, we reduce the conductance-based model to two variants of integrate-and-fire models. In the first variant (nonlinear integrate-and-fire model), parameters depend on the instantaneous membrane potential, whereas in the second variant, they depend on the time elapsed since the last spike (Spike Response Model). The direct reduction links features of the simple models to biophysical features of the full conductance-based model. To quantitatively ...
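The single-variable reduction described above can be sketched as a forward-Euler integration of a leaky integrate-and-fire neuron. This is a minimal illustration of the model class, not the paper's fitted reduction; all parameter values below are illustrative placeholders.

```python
import numpy as np

def simulate_lif(current, dt=0.1, tau=10.0, v_rest=-65.0,
                 v_thresh=-50.0, v_reset=-70.0):
    """Forward-Euler integration of a leaky integrate-and-fire neuron.

    current : input drive in mV (already scaled by membrane resistance)
    returns : list of spike times in ms
    """
    v = v_rest
    spike_times = []
    for k, i_k in enumerate(current):
        # membrane equation: tau * dv/dt = -(v - v_rest) + I(t)
        v += (dt / tau) * (-(v - v_rest) + i_k)
        if v >= v_thresh:              # threshold crossing emits a spike
            spike_times.append(k * dt)
            v = v_reset                # instantaneous reset
    return spike_times
```

A constant suprathreshold drive produces regular firing, while a subthreshold drive produces none, which is the basic behavior the reduced models must reproduce.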
Maximum likelihood estimation of a stochastic integrateandfire neural encoding model
, 2004
Cited by 83 (24 self)
Abstract:
We examine a cascade encoding model for neural response in which a linear filtering stage is followed by a noisy, leaky, integrate-and-fire spike generation mechanism. This model provides a biophysically more realistic alternative to models based on Poisson (memoryless) spike generation, and can effectively reproduce a variety of spiking behaviors seen in vivo. We describe the maximum likelihood estimator for the model parameters, given only extracellular spike train responses (not intracellular voltage data). Specifically, we prove that the log likelihood function is concave and thus has an essentially unique global maximum that can be found using gradient ascent techniques. We develop an efficient algorithm for computing the maximum likelihood solution, demonstrate the effectiveness of the resulting estimator with numerical simulations, and discuss a method of testing the model's validity using time-rescaling and density evolution techniques.
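The key point of the abstract — a concave log-likelihood makes gradient ascent find the global maximum — can be illustrated on a simpler spiking model than the paper's stochastic integrate-and-fire neuron. The sketch below swaps in a Poisson GLM with exponential nonlinearity, whose log-likelihood is also concave; it is an illustration of the fitting strategy, not the paper's estimator.

```python
import numpy as np

def fit_glm_gradient_ascent(X, y, lr=0.05, n_iter=2000):
    """Gradient ascent on a concave spiking log-likelihood.

    Poisson GLM with rate = exp(X @ w); the log-likelihood
    sum(y * (X @ w) - exp(X @ w)) is concave in w, so the ascent
    converges to the unique global maximum.
    X : (T, d) design matrix, y : (T,) spike counts
    """
    w = np.zeros(X.shape[1])
    for _ in range(n_iter):
        rate = np.exp(X @ w)
        grad = X.T @ (y - rate) / len(y)   # average log-likelihood gradient
        w += lr * grad
    return w
```

On simulated data the ascent recovers the generating weights, mirroring the paper's demonstration that the estimator works from spike trains alone.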
Characterization of Neural Responses with Stochastic Stimuli
 TO APPEAR IN: THE NEW COGNITIVE NEUROSCIENCES, 3RD EDITION EDITOR: M. GAZZANIGA
, 2004
Cited by 72 (27 self)
Abstract:
... whose response properties are not at least partially known in advance. This chapter provides an overview of some recently developed characterization methods. In general, the ingredients of the problem are: (a) the selection of a set of experimental stimuli; (b) selection of a model of response; (c) a procedure for fitting (estimation) of the model. We discuss solutions of this problem that combine stochastic stimuli with models based on an initial linear filtering stage that serves to reduce the dimensionality of the stimulus space. We begin by describing classical reverse correlation in this context, and then discuss several recent generalizations that increase the power and flexibility of this basic method. Thanks to Brian Lau, Dario Ringach, Nicole Rust, and Brian Wandell for helpful comments on the manuscript. This work was funded by the Howard Hughes Medical Institute, and the Sloan-Swartz Center for Theoretical Visual Neuroscience at New York University.
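Classical reverse correlation, the starting point of the chapter, amounts to computing a spike-triggered average of the stimulus. A minimal sketch, assuming a one-dimensional stimulus and a binary spike train:

```python
import numpy as np

def spike_triggered_average(stimulus, spikes, n_lags=20):
    """Classical reverse correlation: average the n_lags stimulus
    samples preceding each spike to estimate the linear filter.

    stimulus : (T,) stimulus trace
    spikes   : (T,) binary spike indicator
    """
    sta = np.zeros(n_lags)
    n_spikes = 0
    for t in np.flatnonzero(spikes):
        if t >= n_lags:                     # need a full window of history
            sta += stimulus[t - n_lags:t]
            n_spikes += 1
    return sta / max(n_spikes, 1)           # average, not sum
```

For a white-noise (stochastic) stimulus, the recovered average is proportional to the cell's linear filter, which is exactly why stochastic stimuli make this characterization possible.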
Prediction and Decoding of Retinal Ganglion Cell Responses with a Probabilistic Spiking Model
, 2005
Cited by 66 (21 self)
Abstract:
... generation. We show that the stimulus selectivity, reliability, and timing precision of primate retinal ganglion cell (RGC) light responses can be reproduced accurately with a simple model consisting of a leaky integrate-and-fire spike generator driven by a linearly filtered stimulus, a post-spike current, and a Gaussian noise current. We fit model parameters for individual RGCs by maximizing the likelihood of observed spike responses to a stochastic visual stimulus. Although compact, the fitted model predicts the detailed time structure of responses to novel stimuli, accurately capturing the interaction between the spiking history and sensory stimulus selectivity. The model also accounts for the variability in responses to repeated stimuli, even when fit to data from a single (nonrepeating) stimulus sequence. Finally, the model can be used to derive an explicit, maximum-likelihood decoding rule for neural spike trains, thus providing a tool for assessing the limitations that spiking variability imposes on sensory performance.
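The model structure described above — a leaky integrator driven by a filtered stimulus plus a post-spike current — can be sketched in a few lines. Everything here (the discretization, the filter shapes, the parameter values) is an illustrative stand-in, not the fitted RGC model.

```python
import numpy as np

def simulate_spiking_model(stim, k, h, v_thresh=1.0, noise_sd=0.0, seed=0):
    """Leaky IF generator driven by a linearly filtered stimulus,
    a post-spike current, and optional Gaussian noise.

    k : stimulus filter, h : post-spike current waveform
    """
    rng = np.random.default_rng(seed)
    drive = np.convolve(stim, k)[:len(stim)]      # linear filtering stage
    i_post = np.zeros(len(stim) + len(h))         # scheduled post-spike current
    v, spike_times = 0.0, []
    for t in range(len(stim)):
        v += 0.5 * (-v + drive[t] + i_post[t]) \
             + noise_sd * rng.standard_normal()
        if v >= v_thresh:
            spike_times.append(t)
            i_post[t + 1:t + 1 + len(h)] += h     # e.g. after-hyperpolarization
            v = 0.0                               # reset
    return spike_times
```

A negative post-spike current reproduces the history dependence the abstract emphasizes: each spike transiently suppresses the next, spacing spikes beyond what the stimulus alone would predict.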
Fast and slow contrast adaptation in retinal circuitry
 Neuron
, 2002
Cited by 58 (1 self)
Abstract:
The visual system adapts to the magnitude of intensity fluctuations, and this process begins in the retina. Following the switch from a low-contrast environment to one of high contrast, ganglion cell sensitivity declines in two distinct phases: a fast change occurs in ~0.1 s, and a slow decrease over ~10 s. To examine where these modulations arise, we recorded intracellularly from every major cell type in the salamander retina. Certain bipolar and amacrine cells, and all ganglion cells, adapted to contrast. Generally, these neurons showed both fast and slow adaptation. Fast effects of a contrast increase included accelerated kinetics, decreased sensitivity, and a depolarization of the baseline membrane potential. Slow adaptation did not affect kinetics, but produced a gradual hyperpolarization. This hyperpolarization can account for slow adaptation in the spiking output of ganglion cells.
Sequential optimal design of neurophysiology experiments
, 2008
Cited by 42 (8 self)
Abstract:
Adaptively optimizing experiments has the potential to significantly reduce the number of trials needed to build parametric statistical models of neural systems. However, application of adaptive methods to neurophysiology has been limited by severe computational challenges. Since most neurons are high-dimensional systems, optimizing neurophysiology experiments requires computing high-dimensional integrations and optimizations in real time. Here we present a fast algorithm for choosing the most informative stimulus by maximizing the mutual information between the data and the unknown parameters of a generalized linear model (GLM) which we want to fit to the neuron's activity. We rely on important log-concavity and asymptotic normality properties of the posterior to facilitate the required computations. Our algorithm requires only low-rank matrix manipulations and a two-dimensional search to choose the optimal stimulus. The average running time of these operations scales quadratically with the dimensionality of the GLM, making real-time adaptive experimental design feasible even for high-dimensional stimulus and parameter spaces. For example, we ...
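Under the Gaussian (asymptotic-normality) approximation the abstract relies on, the expected information gain of a candidate stimulus x about the GLM weights grows with the quadratic form x^T C x, where C is the posterior covariance. The sketch below scores candidates by that quantity; it is a crude stand-in for the paper's low-rank machinery, not its algorithm.

```python
import numpy as np

def most_informative_stimulus(candidates, posterior_cov):
    """Pick the candidate stimulus maximizing x^T C x, a
    Gaussian-approximation proxy for expected information gain.

    candidates    : (n, d) array, one candidate stimulus per row
    posterior_cov : (d, d) posterior covariance over GLM weights
    returns       : index of the chosen candidate
    """
    # batched quadratic forms x_i^T C x_i for every row x_i
    scores = np.einsum('ij,jk,ik->i', candidates, posterior_cov, candidates)
    return int(np.argmax(scores))
```

Intuitively, the rule probes the weight directions the posterior is still most uncertain about, which is why adaptive designs can need far fewer trials than random stimulation.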
Modular Stability Tools for Distributed Computation and Control
 TO BE PUBLISHED IN INT. J. ADAPTIVE CONTROL AND SIGNAL PROCESSING, 17(6)
, 2002
Cited by 35 (24 self)
Abstract:
Much recent functional modelling of the central nervous system, beyond traditional “neural net” approaches, focuses on its distributed computational architecture. This paper discusses extensions of our recent work aimed at understanding this architecture from an overall nonlinear stability and convergence point of view, and at constructing artificial devices exploiting similar modularity. Applications to synchronisation and to schooling are also described. The development makes extensive use of nonlinear contraction theory.
Virtual retina: A biological retina model and simulator, with contrast gain control
 Journal of Computational Neuroscience
"... research report ..."
Variability and Information in a Neural Code of the Cat Lateral . . .
 JOURNAL OF NEUROPHYSIOLOGY 86, 2789–2806 (2001)
, 2001
Cited by 25 (0 self)
Abstract:
A central theme in neural coding concerns the role of response variability and noise in determining the information transmission of neurons. This issue was investigated in single cells of the lateral geniculate nucleus of barbiturate-anesthetized cats by quantifying the degree of precision in and the information transmission properties of individual spike train responses to full-field, binary (bright or dark), flashing stimuli. We found that neuronal responses could be highly reproducible in their spike timing (about 1–2 ms standard deviation) and spike count (about 0.3 ratio of variance/mean, compared to 1.0 expected for a Poisson process). This degree of precision only became apparent when an adequate length of the stimulus sequence was specified to determine the neural response, emphasizing that the variables relevant to a cell's response must be controlled in order to observe the cell's intrinsic response precision. Responses could carry as much as 3.5 bits/spike of information about the stimulus, a rate that was within a factor of two of the limit the spike train can transmit. Moreover, there appeared to be little sign of redundancy in coding: on average, longer response sequences carried at least as much information about the stimulus as would be obtained by adding together the information carried by shorter response sequences considered independently. There also was no direct evidence found for synergy between response sequences. These results could largely, but not entirely, be explained by a simple model of the response in which one filters the stimulus by the cell's impulse response kernel, thresholds the result at a fairly high level, and incorporates a post-spike refractory period.
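The simple model sketched at the end of the abstract — filter, high threshold, absolute refractory period — can be written directly. Parameter values below are illustrative, not the fitted ones.

```python
import numpy as np

def filter_threshold_model(stim, kernel, theta, refractory):
    """Filter the stimulus with the cell's impulse-response kernel,
    threshold the result at theta, and impose an absolute
    refractory period of `refractory` time steps.
    """
    drive = np.convolve(stim, kernel)[:len(stim)]  # causal filtering
    spike_times, last = [], -refractory - 1
    for t, d in enumerate(drive):
        if d > theta and t - last > refractory:
            spike_times.append(t)
            last = t
    return spike_times
```

Because both the filter output and the refractory clock are deterministic, this model predicts the tight spike-timing reproducibility and sub-Poisson count variance the recordings show, which is why it explains the data "largely, but not entirely."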