Results 1–10 of 50
The effect of correlated variability on the accuracy of a population code
 Neural Computation
, 1999
Abstract

Cited by 137 (2 self)
We study the impact of correlated neuronal firing rate variability on the accuracy with which an encoded quantity can be extracted from a population of neurons. Contrary to a widespread belief, correlations in the variabilities of neuronal firing rates do not, in general, limit the increase in coding accuracy provided by using large populations of encoding neurons. Furthermore, in some cases, but not all, correlations improve the accuracy of a population code.
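The "widespread belief" the abstract pushes against can be seen in a quick simulation: when noise has a uniform positive pairwise correlation c, the variance of a naive population average saturates at c·σ² instead of shrinking as 1/N. A minimal sketch (the correlation value and the shared-plus-private noise construction are illustrative assumptions, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
sigma2, c, trials = 1.0, 0.2, 5000  # single-neuron variance, pairwise correlation, samples (assumed)

def avg_noise_variance(n):
    """Variance of the population-averaged noise for n neurons whose
    variabilities share a common component (uniform correlation c)."""
    shared = rng.standard_normal((trials, 1))
    private = rng.standard_normal((trials, n))
    noise = np.sqrt(sigma2) * (np.sqrt(c) * shared + np.sqrt(1.0 - c) * private)
    return noise.mean(axis=1).var()

for n in (10, 100, 1000):
    # theory: sigma2 * ((1 - c) / n + c), which saturates at c * sigma2 = 0.2
    print(n, round(avg_noise_variance(n), 3))
```

The paper's point is that this saturation applies to simple averaging, not to optimal decoding in general.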
Chaotic Balanced State in a Model of Cortical Circuits
 NEURAL COMPUT
, 1998
Abstract

Cited by 83 (1 self)
The nature and origin of the temporal irregularity in the electrical activity of cortical neurons in vivo are still not well understood. We consider the hypothesis that this irregularity is due to a balance of excitatory and inhibitory currents into the cortical cells. We study a network model with excitatory and inhibitory populations of simple binary units. The internal feedback is mediated by relatively large synaptic strengths, so that the magnitude of the total excitatory as well as inhibitory feedback is much larger than the neuronal threshold. The connectivity is random and sparse. The mean number of connections per unit is large but small compared to the total number of cells in the network. The network also receives a large, temporally regular input from external sources. An analytical solution of the mean-field theory of this model, which is exact in the limit of large network size, is presented. This theory reveals a new cooperative stationary state of large networks, which we term a balanced state. In this state, a balance between the excitatory and inhibitory inputs emerges dynamically for a wide range of parameters, resulting in a net input whose temporal fluctuations are of the same order as its mean. The internal synaptic inputs act as a strong negative feedback, which linearizes the population responses to the external drive despite the strong nonlinearity of the individual cells. This feedback also greatly stabilizes the system's state and enables it to track a time-dependent input on time scales much shorter than the time constant of a single cell. The spatiotemporal statistics of the balanced state are calculated. It is shown that the autocorrelations decay on a short time scale, yielding an approximate Poissonian temporal s...
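The balance mechanism can be sketched with a stripped-down, one-population, inhibition-dominated binary network (a simplification of the paper's two-population model; all parameter values below are assumed for illustration). Individual input components scale as √K, yet the net input settles to order 1 near threshold:

```python
import numpy as np

rng = np.random.default_rng(0)
N, K, theta = 2000, 200, 1.0     # network size, mean connections per unit, threshold
m0 = 0.3                         # external drive rate (assumed)

# sparse random connectivity: each inhibitory connection present with prob K/N
C = (rng.random((N, N)) < K / N).astype(float)
J_ext = 2.0 * np.sqrt(K) * m0    # external excitation, O(sqrt(K)) >> theta
J_inh = -2.0 / np.sqrt(K)        # recurrent inhibition per connection

s = (rng.random(N) < 0.5).astype(float)    # binary unit states
for _ in range(20 * N):                     # random asynchronous updates
    i = rng.integers(N)
    u_i = J_ext + J_inh * (C[i] @ s)
    s[i] = 1.0 if u_i > theta else 0.0

mean_rate = s.mean()
mean_input = J_ext + J_inh * (C @ s).mean()
print(round(mean_rate, 2), round(mean_input, 2))  # net input is O(1), not O(sqrt(K))
```

Because the cancellation is between two large terms, the residual fluctuations are of the same order as the residual mean, producing the irregular firing the abstract describes.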
Probabilistic Interpretation of Population Codes
, 1998
Abstract

Cited by 77 (14 self)
We present a general encoding-decoding framework for interpreting the activity of a population of units. A standard population code interpretation method, the Poisson model, starts from a description of how a single value of an underlying quantity can generate the activities of each unit in the population. In casting it in the encoding-decoding framework, we find that this model is too restrictive to describe fully the activities of units in population codes in higher processing areas, such as the medial temporal area. Under a more powerful model, the population activity can convey information not only about a single value of some quantity but also about its whole distribution, including its variance, and perhaps even the certainty the system has in the actual presence in the world of the entity generating this quantity. We propose a novel method for forming such probabilistic interpretations of population codes and compare it to the existing method.
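The "standard Poisson model" used as the baseline here can be sketched in a few lines: independent Poisson spike counts with bell-shaped tuning curves, decoded by maximizing the Poisson log-likelihood over a grid of candidate stimulus values. The tuning parameters and grid resolution below are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
prefs = np.linspace(-np.pi, np.pi, 32, endpoint=False)  # preferred values (assumed)
gain, width = 20.0, 0.5                                 # tuning parameters (assumed)

def tuning(s):
    # circular-Gaussian tuning curves: mean spike count of each unit given s
    return gain * np.exp((np.cos(s - prefs) - 1.0) / width)

def ml_decode(counts):
    # maximum-likelihood readout under the independent-Poisson model
    grid = np.linspace(-np.pi, np.pi, 721)
    f = tuning(grid[:, None])                 # (grid points, units)
    loglik = counts @ np.log(f).T - f.sum(axis=1)
    return grid[np.argmax(loglik)]

s_true = 0.7
counts = rng.poisson(tuning(s_true))          # one noisy population response
print(round(ml_decode(counts), 2))
```

The paper's criticism is precisely that this model forces the readout to be a single value, whereas the population pattern could encode a whole distribution over s.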
Statistically Efficient Estimation Using Population Coding
, 1998
Abstract

Cited by 57 (9 self)
Coarse codes are widely used throughout the brain to encode sensory and motor variables. Methods designed to interpret these codes, such as population vector analysis, are either inefficient (the variance of the estimate is much larger than the smallest possible variance) or biologically implausible, like maximum likelihood. Moreover, these methods attempt to compute a scalar or vector estimate of the encoded variable. Neurons are faced with a similar estimation problem. They must read out the responses of the presynaptic neurons, but, by contrast, they typically encode the variable with a further population code rather than as a scalar. We show how a nonlinear recurrent network can be used to perform estimation in a near-optimal way while keeping the estimate in a coarse code format. This work suggests that lateral connections in the cortex may be involved in cleaning up uncorrelated noise among neurons representing similar variables.
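The population vector analysis criticized above amounts to summing each unit's preferred-direction vector weighted by its spike count and reading off the angle. A minimal sketch, with assumed (narrow, non-cosine) tuning where this simple readout is known to be statistically inefficient:

```python
import numpy as np

rng = np.random.default_rng(2)
prefs = np.linspace(-np.pi, np.pi, 64, endpoint=False)  # preferred directions (assumed)
gain, width = 10.0, 0.3                                 # narrow tuning (assumed)

def rates(s):
    # mean Poisson spike counts given stimulus direction s
    return gain * np.exp((np.cos(s - prefs) - 1.0) / width)

def pop_vector(counts):
    # population vector: angle of the count-weighted sum of preferred directions
    return np.angle(np.sum(counts * np.exp(1j * prefs)))

s_true = 0.5
ests = [pop_vector(rng.poisson(rates(s_true))) for _ in range(2000)]
print(round(np.mean(ests), 3), round(np.var(ests), 4))
```

For symmetric tuning the population vector is unbiased, but its variance generally exceeds the Cramér-Rao bound; the paper's recurrent network aims to approach that bound while keeping the answer in a coarse-code format rather than as this single scalar.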
Dynamics of Membrane Excitability Determine Interspike Interval Variability: A Link Between Spike Generation Mechanisms and Cortical Spike Train Statistics
, 1998
Abstract

Cited by 37 (4 self)
We propose a biophysical mechanism for the high interspike interval variability observed in cortical spike trains. The key lies in the nonlinear dynamics of cortical spike generation, which are consistent with type I membranes where saddle-node dynamics underlie excitability (Rinzel & Ermentrout, 1989). We present a canonical model for type I membranes, the θ-neuron. The θ-neuron is a phase model whose dynamics reflect salient features of type I membranes. This model generates spike trains with coefficient of variation (CV) above 0.6 when brought to firing by noisy inputs. This happens because the timing of spikes for a type I excitable cell is exquisitely sensitive to the amplitude of the suprathreshold stimulus pulses. A noisy input current, giving random amplitude "kicks" to the cell, evokes highly irregular firing across a wide range of firing rates; an intrinsically oscillating cell gives regular spike trains. We corroborate the results with simulations of the Morris-Lecar (ML) neural model with random synaptic inputs: type I ML yields high CVs. When this model is modified to have type II dynamics (periodicity arises via a Hopf bifurcation), however, it gives regular spike trains (CV below 0.3). Our results suggest that the high CV values such as those observed in cortical spike trains are an intrinsic characteristic of type I membranes driven to firing by "random" inputs. In contrast, neural oscillators or neurons exhibiting type II excitability should produce regular spike trains.
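The θ-neuron is compact enough to simulate directly: dθ/dt = (1 − cos θ) + (1 + cos θ)·I(t), with a spike registered when the phase crosses π. The sketch below drives it with a just-subthreshold mean current plus white noise (the drive and noise values are assumptions chosen to put the model in the excitable regime) and measures the ISI coefficient of variation:

```python
import numpy as np

rng = np.random.default_rng(3)
dt, T = 1e-3, 400.0          # time step and duration (arbitrary units, assumed)
I0, sigma = -0.05, 0.5       # just-subthreshold mean drive plus white noise (assumed)

phase, spikes = -np.pi, []
for step in range(int(T / dt)):
    I = I0 + sigma * rng.standard_normal() / np.sqrt(dt)
    # canonical type I (theta-neuron) dynamics; spike when phase crosses pi
    phase += dt * ((1.0 - np.cos(phase)) + (1.0 + np.cos(phase)) * I)
    if phase > np.pi:
        spikes.append(step * dt)
        phase -= 2.0 * np.pi

isi = np.diff(spikes)
cv = isi.std() / isi.mean()
print(len(spikes), round(cv, 2))
```

Because firing here is noise-driven escape across a saddle-node, the intervals are highly irregular, in line with the high CVs the abstract reports for type I membranes.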
Ion Channel Stochasticity May Be Critical in Determining the Reliability and Precision of Spike Timing
, 1998
Abstract

Cited by 30 (5 self)
This memory is embedded in the distribution of channel states in the spike initiation site. The nature and resolution of this memory depend on the size of the channel pool and on the kinetics and number of states of the channels. We hypothesize that the number of channels in the spike initiation zone may be optimized in some sense to give the reliability and accuracy discussed above, together with a short-term memory of the neuron's activity. In this context, it is interesting to mention the work of Marder, Abbott, Turrigiano, Liu, and Golowasch (1996) and Abbott et al. (1996), which demonstrates activity-dependent long-term changes in the properties of intrinsic membrane currents.
Coding of time-varying signals in spike trains of linear and half-wave rectifying neurons. Network Comput Neural Syst 7:61–85
, 1996
Abstract

Cited by 20 (2 self)
The encoding of time-varying stimuli in linear and half-wave rectifying neurons is studied. The information carried in single spike trains is assessed by reconstructing part of the stimulus using mean square estimation methods. For the class of models considered here, the mean square error in the reconstructions and estimates of the rate of information transmission are computed analytically. The optimal encoding of stimuli having statistical properties of natural images predicts a change in the temporal filtering characteristics with mean firing rate. This change relates to those observed experimentally at the early stages of visual processing. The transmission of information by model neurons is shown to be fundamentally limited to a maximum of 1.13 bit/spike, and it is conjectured that nonlinear processing is necessary to explain the higher rates that have been observed experimentally in certain preparations. In spite of the fact that single neurons might not transmit information efficiently, a substantial part of a time-varying stimulus can be recovered from single spike trains. In particular, our results demonstrate that a small number of 'noisy' neurons can carry precise temporal information in their spike trains.
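The mean square estimation idea can be illustrated with a single half-wave rectifying unit: a Gaussian stimulus is rectified, turned into Poisson counts, and partially recovered by the mean-square-optimal linear readout. This is a drastically simplified, single-bin sketch (no temporal filtering, and the Poisson gain is an assumed value), not the paper's analysis:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 20000
s = rng.standard_normal(n)            # unit-variance stimulus samples (assumed)
rate = np.clip(s, 0.0, None)          # half-wave rectified encoding
r = rng.poisson(5.0 * rate)           # noisy spike counts, assumed gain of 5

# mean-square-optimal linear readout of s from r: w = Cov(s, r) / Var(r)
w = np.cov(s, r)[0, 1] / r.var()
s_hat = s.mean() + w * (r - r.mean())
mse = np.mean((s - s_hat) ** 2)
print(round(mse, 3))                  # below the prior variance of 1.0
```

Even this crude decoder recovers a substantial fraction of the stimulus variance, consistent with the abstract's point that noisy spike trains still carry usable timing information.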
Error-Backpropagation in Temporally Encoded Networks of Spiking Neurons
 Neurocomputing
, 2000
Abstract

Cited by 20 (1 self)
For a network of spiking neurons that encodes information in the timing of individual spikes, we derive a supervised learning rule, SpikeProp, akin to traditional error-backpropagation, and show how to overcome the discontinuities introduced by thresholding. With this algorithm, we demonstrate how networks of spiking neurons with biologically reasonable action potentials can perform complex nonlinear classification in fast temporal coding just as well as rate-coded networks. We perform experiments for the classical XOR problem, when posed in a temporal setting, as well as for a number of other benchmark datasets. Comparing the (implicit) number of spiking neurons required for the encoding of the interpolated XOR problem, it is demonstrated that temporal coding requires significantly fewer neurons than instantaneous rate coding. 2000 Mathematics Subject Classification: 82C32, 68T05, 68T10, 68T30, 92B20. 1998 ACM Computing Classification System: C.1.3, F.1.1, I.2.6, I.5.1. Keywords...
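The way SpikeProp-style rules "overcome the discontinuities introduced by thresholding" is to linearize the membrane potential around the threshold crossing, giving dt_s/dw = −(∂u/∂w)/(∂u/∂t) at the spike time t_s. A one-synapse sketch checking that linearization against a finite difference (the PSP kernel and parameter values are assumptions for illustration):

```python
import numpy as np

tau, theta = 2.0, 1.0            # PSP time constant and threshold (assumed)

def eps(t):
    # alpha-like postsynaptic potential kernel, peaking at 1 when t = tau
    return np.where(t > 0, (t / tau) * np.exp(1.0 - t / tau), 0.0)

def spike_time(w, dt=1e-5, T=10.0):
    t = np.arange(0.0, T, dt)
    u = w * eps(t)               # membrane potential from one input spike at t = 0
    return t[np.argmax(u >= theta)]

w = 1.5
t_s = spike_time(w)
# linearize around the threshold crossing: dt_s/dw = -(du/dw) / (du/dt) at t_s
du_dw = eps(t_s)
du_dt = w * (eps(t_s + 1e-6) - eps(t_s - 1e-6)) / 2e-6
grad = -du_dw / du_dt
fd = (spike_time(w + 1e-3) - spike_time(w - 1e-3)) / 2e-3  # finite-difference check
print(round(grad, 3), round(fd, 3))
```

The gradient is negative, as expected: a stronger weight makes the potential cross threshold earlier.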
Integrate-and-Fire Neurons Driven by Correlated Stochastic Input
, 2002
Abstract

Cited by 17 (4 self)
Neurons are sensitive to correlations among synaptic inputs. However, models that explicitly include correlations are hard to solve analytically, so their influence on a neuron's response has been difficult to ascertain. To gain some intuition on this problem, we studied the firing times of two simple integrate-and-fire model neurons driven by a correlated binary variable that represents the total input current. Analytic expressions were obtained for the average firing rate and coefficient of variation (a measure of spike-train variability) as functions of the mean, variance, and correlation time of the stochastic input. The results of computer simulations were in excellent agreement with these expressions. In these models, an increase in correlation time in general produces an increase in both the average firing rate and the variability of the output spike trains. However, the magnitude of the changes depends differentially on the relative values of the input mean and variance: the increase in firing rate is higher when the variance is large relative to the mean, whereas the increase in variability is higher when the variance is relatively small. In addition, the firing rate always tends to a finite limit value as the correlation time increases toward infinity, whereas the coefficient of variation typically diverges. These results suggest that temporal correlations may play a major role in determining the variability as well as the intensity of neuronal spike trains.
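The qualitative effect is easy to reproduce: drive a simple integrate-and-fire unit with a two-state (telegraph) input whose correlation time is varied. The sketch below uses a non-leaky integrator with a reflecting barrier at zero; all parameter values are assumptions for illustration, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(5)

def simulate(tau_corr, T=4000.0, dt=0.1, mu=0.03, sigma=0.1, vth=1.0):
    """Non-leaky IF neuron (reflecting barrier at 0) driven by a binary
    telegraph input with correlation time tau_corr; returns (rate, CV)."""
    p_flip = dt / (2.0 * tau_corr)        # state-switch probability per step
    x, v, spikes = 1.0, 0.0, []
    for i in range(int(T / dt)):
        if rng.random() < p_flip:
            x = -x                        # telegraph input flips between +1 and -1
        v = max(v + dt * (mu + sigma * x), 0.0)
        if v >= vth:
            spikes.append(i * dt)
            v = 0.0
    isi = np.diff(spikes)
    return len(spikes) / T, float(isi.std() / isi.mean())

for tc in (1.0, 10.0, 100.0):
    rate, cv = simulate(tc)
    print(tc, round(rate, 3), round(cv, 2))
```

Longer correlation times produce long "up" episodes of sustained firing separated by long silences, so both the firing rate and the CV grow with the correlation time, as the abstract describes.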