Results 1–10 of 12
Intrinsic Stabilization of Output Rates by Spike-Based Hebbian Learning
 Neural Computation
, 2001
Abstract

Cited by 40 (12 self)
We study analytically a model of long-term synaptic plasticity where synaptic changes are triggered by presynaptic spikes, postsynaptic spikes, and the time differences between presynaptic and postsynaptic spikes. The changes due to correlated input and output spikes are quantified by means of a learning window. We show that plasticity can lead to an intrinsic stabilization of the mean firing rate of the postsynaptic neuron. Subtractive normalization of the synaptic weights (summed over all presynaptic inputs converging on a postsynaptic neuron) follows if, in addition, the mean input rates and the mean input correlations are identical at all synapses. If the integral over the learning window is positive, firing-rate stabilization requires a non-Hebbian component, whereas such a component is not needed if the integral of the learning window is negative. A negative integral corresponds to 'anti-Hebbian' learning in a model with slowly varying firing rates. For spike-based learning, a strict distinction between Hebbian and 'anti-Hebbian' rules is questionable since learning is driven by correlations on the time scale of the learning window. The correlations between presynaptic and postsynaptic firing are evaluated for a piecewise-linear Poisson model and for a noisy spiking neuron model with refractoriness. While a negative integral over the learning window leads to intrinsic rate stabilization, the positive part of the learning window picks up spatial and temporal correlations in the input.
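The learning-window mechanism this abstract describes can be sketched concretely. The snippet below is a minimal illustration: the biphasic exponential shape and the amplitudes `a_plus`/`a_minus` and time constants are my own assumptions, not the paper's fitted values. It builds the window and checks that its integral is negative, the regime the abstract links to intrinsic firing-rate stabilization.

```python
import numpy as np

def stdp_window(s, a_plus=1.0, a_minus=-1.1, tau_plus=10.0, tau_minus=20.0):
    """Biphasic learning window W(s), with s = t_post - t_pre in ms.

    Pre-before-post pairings (s >= 0) potentiate; post-before-pre
    pairings depress. Amplitudes and time constants are illustrative
    assumptions, not values from the paper.
    """
    s = np.asarray(s, dtype=float)
    return np.where(s >= 0,
                    a_plus * np.exp(-s / tau_plus),
                    a_minus * np.exp(s / tau_minus))

# The sign of the window's integral decides the regime:
# analytically, integral = a_plus*tau_plus + a_minus*tau_minus
#                        = 10 - 22 = -12 < 0,
# which is the case the abstract associates with rate stabilization.
s = np.arange(-300.0, 300.0, 0.01)
integral = float(np.sum(stdp_window(s)) * 0.01)
```

A positive choice of the integral (e.g. `a_minus=-0.5`) would instead require the non-Hebbian stabilizing component the abstract mentions.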
Synchronization of the Neural Response to Noisy Periodic Synaptic Input in a Balanced Leaky Integrate-and-Fire Neuron with Reversal Potentials
 Neural Computation
, 1999
Abstract

Cited by 12 (3 self)
Neurons in which the levels of excitation and inhibition are roughly balanced are shown to be very sensitive to the coherence of their synaptic input. The behavior of such balanced neurons with reversal potentials is analyzed both analytically and numerically using the leaky integrate-and-fire neural model. The investigation uses the Gaussian approximation with synaptic inputs modeled as inhomogeneous Poisson processes. The results indicate that for balanced neurons with N synaptic inputs, it is only necessary for O(√N) of the synaptic inputs to have a periodicity in order for their spike outputs to become phase-locked to this periodic signal.
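A minimal simulation can illustrate the balanced regime this abstract analyzes. The sketch below is a crude current-based stand-in (no reversal potentials, and every rate, weight, and input count is assumed for illustration): the mean excitatory drive exactly cancels the mean inhibitory drive, so any output spikes are driven purely by input fluctuations.

```python
import numpy as np

def lif_balanced(T=1.0, dt=1e-4, tau=0.02, v_th=1.0,
                 n_exc=400, n_inh=100, r_exc=10.0, r_inh=40.0,
                 w_exc=0.05, w_inh=-0.05, seed=0):
    """Leaky integrate-and-fire neuron with balanced Poisson input.

    Current-based sketch with illustrative parameters: the mean
    excitatory drive (400 inputs at 10 Hz) cancels the mean inhibitory
    drive (100 inputs at 40 Hz), so spiking is fluctuation-driven.
    Returns the output spike times in seconds.
    """
    rng = np.random.default_rng(seed)
    v, spikes = 0.0, []
    for i in range(int(T / dt)):
        exc = rng.poisson(n_exc * r_exc * dt)   # pooled excitatory arrivals
        inh = rng.poisson(n_inh * r_inh * dt)   # pooled inhibitory arrivals
        v += -v * dt / tau + w_exc * exc + w_inh * inh
        if v >= v_th:
            spikes.append(i * dt)
            v = 0.0   # reset after a spike
    return spikes

spike_times = lif_balanced()
```

Adding a common periodic modulation to a subset of the inputs would be the natural next step to probe the O(√N) phase-locking claim.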
How the Threshold of a Neuron Determines its Capacity for Coincidence Detection
, 1998
Abstract

Cited by 4 (2 self)
Coherent oscillatory activity of a population of neurons is thought to be a vital feature of temporal coding in the brain. We focus on the question of whether a single neuron can transform a spike code into a rate code. More precisely, how does a neuron vary its mean output firing rate if its input changes from random to coherent? We investigate the coincidence-detection properties of an integrate-and-fire neuron as a function of its internal parameters and input statistics. In particular, we show how coincidence detection depends on the membrane time constant and the threshold. Furthermore, we demonstrate that there is an optimal threshold for coincidence detection and that there is a broad range of near-optimal threshold values; fine-tuning is not necessary. Keywords: coincidence detection, voltage threshold, coherent activity, temporal coding, rate coding, integrate-and-fire neuron. Institut für Theoretische Physik, Physik-Department der TU München, D-85747 Garching, Germany...
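The random-versus-coherent comparison at the heart of this abstract can be sketched numerically. In the toy model below (all parameters are illustrative assumptions, not the paper's values), a short membrane time constant makes the neuron respond to coincident input 'volleys' far more strongly than to a rate-matched random input.

```python
import numpy as np

def lif_output_rate(input_times, T=1.0, dt=1e-4, tau=0.005,
                    v_th=1.0, w=0.08):
    """Output rate (Hz) of a current-based LIF neuron driven by a
    pooled input spike train (illustrative parameters)."""
    counts, _ = np.histogram(input_times, bins=int(T / dt), range=(0.0, T))
    v, n_out = 0.0, 0
    for c in counts:
        v += -v * dt / tau + w * c
        if v >= v_th:
            n_out += 1
            v = 0.0
    return n_out / T

rng = np.random.default_rng(1)
T, n_inputs, r_in = 1.0, 50, 20.0        # 50 fibres firing at 20 Hz each

# Random input: a pooled Poisson train at the total rate n_inputs * r_in.
random_in = rng.uniform(0.0, T, size=rng.poisson(n_inputs * r_in * T))

# Coherent input: same mean rate, but spikes cluster in volleys with
# 1 ms jitter around shared event times.
volleys = rng.uniform(0.0, T, size=int(r_in * T))
coherent_in = np.concatenate(
    [t0 + rng.normal(0.0, 0.001, size=n_inputs) for t0 in volleys])

rate_random = lif_output_rate(random_in)
rate_coherent = lif_output_rate(coherent_in)
# With tau = 5 ms the coherent input drives the neuron much harder,
# even though the mean input rate is identical in both conditions.
```

Sweeping `v_th` in this sketch is the natural way to probe the abstract's claim of a broad near-optimal threshold range.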
Intrinsic Stabilization of Output Rates by Spike-Time-Dependent Hebbian Learning
 Neural Computation
, 1999
Abstract

Cited by 3 (0 self)
Over a broad parameter regime, spike-time-dependent learning leads to an intrinsic stabilization of the mean firing rate of the postsynaptic neuron. Subtractive normalization of the synaptic weights (summed over all presynaptic inputs converging on a postsynaptic neuron) follows if, in addition, the mean input rates are identical at all synapses and correlations in the input are translation invariant. In a rate description, stabilization of the postsynaptic firing rate is most easily achieved by a negative correlation term in the learning rule, often called 'anti-Hebbian' learning. For spike-based learning, a strict distinction between Hebbian and 'anti-Hebbian' rules is no longer possible. Specifically, learning is driven by correlations on the time scale of the learning window, which may be positive even though the integral over the learning window is negative. While the negative integral leads to intrinsic rate stabilization, the positive part of the learning window picks up temporal...
What's Different With Spiking Neurons?
Abstract

Cited by 3 (0 self)
In standard neural network models, neurons are described in terms of mean firing rates, viz., an analog signal. Most real neurons, however, communicate by pulses, called action potentials or simply 'spikes'. In this chapter the main differences between spike coding and rate coding are described. The integrate-and-fire model is studied as a simple model of a spiking neuron. Fast transients, synchrony, and coincidence detection are discussed as examples where spike coding is relevant. A description by spikes rather than rates has implications for learning rules. We show the relation of a spike-time-dependent learning rule to standard Hebbian learning. Finally, the learning rule and temporal coding are illustrated using the example of a coincidence-detecting neuron in the barn owl auditory system. Keywords: temporal coding, coincidence detection, spikes, spiking neurons, integrate-and-fire neurons, auditory system, Hebbian learning, spike-time-dependent plasticity. 1. SPIKES AND RATES In mos...
Modelling the neural response to speech: stochastic resonance and coding of vowel-like stimuli
 University
, 2001
Abstract

Cited by 1 (0 self)
Abstract — We study the effect of noise upon the transmission of information about an input signal containing two frequencies in a leaky integrate-and-fire neural model. This work extends the results of earlier studies on stochastic resonance in neural models, and particularly in models of auditory processing. It is found that the temporal coding of two subthreshold formants can be enhanced by the presence of noise. This study provides an approximation to the response to vowel speech stimuli, and the results have a bearing upon the possible effectiveness of incorporating noise in cochlear implant speech-coding strategies. Keywords — neural coding, stochastic resonance, integrate-and-fire neurons, speech.
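The stochastic-resonance effect described here can be demonstrated with a one-frequency toy version (the abstract's stimulus has two formant frequencies; using a single sinusoid and these particular parameters is my simplification): without noise the subthreshold drive produces no spikes at all, while added noise lets the signal through as threshold crossings.

```python
import numpy as np

def lif_spike_count(noise_std, T=2.0, dt=1e-4, tau=0.01,
                    v_th=1.0, i0=0.8, amp=0.15, freq=10.0, seed=2):
    """Spike count of a LIF neuron driven by a subthreshold sinusoid
    plus Gaussian white noise. All parameter values are illustrative
    assumptions; the deterministic drive alone never reaches threshold
    (peak drive 0.95, further attenuated by the membrane filter).
    """
    rng = np.random.default_rng(seed)
    n_steps = int(T / dt)
    t = np.arange(n_steps) * dt
    drive = i0 + amp * np.sin(2.0 * np.pi * freq * t)
    v, count = 0.0, 0
    for k in range(n_steps):
        v += dt / tau * (drive[k] - v) \
             + noise_std * np.sqrt(dt) * rng.standard_normal()
        if v >= v_th:
            count += 1
            v = 0.0
    return count

silent = lif_spike_count(noise_std=0.0)  # subthreshold signal alone
noisy = lif_spike_count(noise_std=1.0)   # noise carries v over threshold
```

In the noisy condition the crossings cluster near the peaks of the sinusoid, which is the sense in which noise enhances the temporal coding of a subthreshold component.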
Quality Of Coincidence Detection And ITD Tuning: A Theoretical Framework
, 1999
Abstract
Introduction: Neurons have two basic operation modes: they can function either as integrators or as coincidence detectors. In the integration mode, the output rate changes as a function of the mean input rate and is independent of the temporal fine structure of the input spike trains. Neurons act as coincidence detectors if their mean output firing rate does depend on the temporal structure of the input spike trains. Coincidence detection implies that the output firing rate is higher if spikes arrive 'coincidently' than under random spike arrival. There is, however, no sharp boundary separating the two basic operation modes. It is known that the membrane time constant [10], the spike threshold, and the number of input synapses [1] are relevant parameters. In this contribution we develop a framework that allows us to assess how these parameters influence the quality of coincidence detection. We consider stochastic arrival of presyn...
An Information-Theoretic Analysis of the Coding of a Periodic Synaptic Input by Integrate-and-Fire Neurons
, 2002
Abstract
An expression for the mutual information between the phase of a periodic stimulus and the timing of the output spikes generated by the stimulus is given in the low output spiking-rate regime. The mutual information is calculated for the leaky integrate-and-fire neuron in the Gaussian approximation. The mutual information is found to be effectively described as a function of the synchronization of the output spikes and their average spiking rate. The results in the subthreshold input regime shed light upon the role of stochastic resonance in such models.
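The synchronization of output spikes to a periodic stimulus can be quantified with the standard vector-strength statistic; treating it as the synchronization variable this abstract refers to is my assumption, and the paper's exact definition may differ.

```python
import numpy as np

def vector_strength(spike_times, freq):
    """Vector strength of a spike train relative to a periodic stimulus
    of frequency `freq` (Hz): 1 means perfect phase locking, values
    near 0 mean no locking. Standard definition: the magnitude of the
    mean unit phasor of the spike phases."""
    phases = 2.0 * np.pi * freq * np.asarray(spike_times, dtype=float)
    return float(np.abs(np.mean(np.exp(1j * phases))))

# One spike per cycle at a fixed phase: perfectly locked.
locked = np.arange(50) / 10.0                 # 10 Hz stimulus, 5 s
# Spike times spread uniformly over the same window: unlocked.
rng = np.random.default_rng(3)
unlocked = rng.uniform(0.0, 5.0, size=50)

vs_locked = vector_strength(locked, 10.0)     # close to 1.0
vs_unlocked = vector_strength(unlocked, 10.0) # near 1/sqrt(50) on average
```

In the low-rate regime the abstract describes, a statistic of this kind together with the mean spiking rate would summarize the output well enough to parameterize the mutual information.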
Fast Propagation of Firing Rates through Layered Networks of
 J. Neurosci
, 2002
Abstract
In this paper, we study information transmission in multilayer architectures in which computation is distributed and activity needs to propagate through many layers. We show that, in the presence of a noisy background current, firing rates propagate rapidly and linearly through a deeply layered network. The noise is essential but does not lead to deterioration of the propagated activity. The efficiency of the rate coding is improved by combining it with a population code. We propose that the resulting signal coding is a realistic framework for sensory computation.
Intrinsic Stabilization of Output Rates by Spike-Based Hebbian Learning
Letter, communicated by Laurence Abbott
Abstract
We study analytically a model of long-term synaptic plasticity where synaptic changes are triggered by presynaptic spikes, postsynaptic spikes, and the time differences between presynaptic and postsynaptic spikes. The changes due to correlated input and output spikes are quantified by means of a learning window. We show that plasticity can lead to an intrinsic stabilization of the mean firing rate of the postsynaptic neuron. Subtractive normalization of the synaptic weights (summed over all presynaptic inputs converging on a postsynaptic neuron) follows if, in addition, the mean input rates and the mean input correlations are identical at all synapses. If the integral over the learning window is positive, firing-rate stabilization requires a non-Hebbian component, whereas such a component is not needed if the integral of the learning window is negative. A negative integral corresponds to anti-Hebbian learning in a model with slowly varying firing rates. For spike-based learning, a strict distinction between Hebbian and anti-Hebbian rules is questionable since learning is driven by correlations on the time scale of the learning window. The correlations between presynaptic and postsynaptic firing are evaluated for a piecewise-linear Poisson model and for a noisy spiking neuron model with refractoriness. While a negative integral over the learning window leads to intrinsic rate stabilization, the positive part of the learning window picks up spatial and temporal correlations in the input.