Results 1–10 of 51
Mathematical Formulations of Hebbian Learning
 Biol Cybern
, 2002
Abstract

Cited by 77 (7 self)
Several formulations of correlation-based Hebbian learning are reviewed. On the presynaptic side, activity is described either by a firing rate or by presynaptic spike arrival. The state of the postsynaptic neuron can be described by its membrane potential, its firing rate, or the timing of backpropagating action potentials (BPAPs). It is shown that all of the above formulations can be derived from the point of view of an expansion. In the absence of BPAPs, it is natural to correlate presynaptic spikes with the postsynaptic membrane potential. Time windows of spike-timing-dependent plasticity arise naturally if the timing of postsynaptic spikes is available at the site of the synapse, as is the case in the presence of BPAPs. With an appropriate choice of parameters, Hebbian synaptic plasticity has intrinsic normalization properties that stabilize postsynaptic firing rates and lead to subtractive weight normalization.
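The subtractive weight normalization mentioned in this abstract can be sketched in a few lines. This is a minimal rate-based illustration, not the paper's exact formulation; the function and parameter names (`hebbian_step`, `eta`) are illustrative:

```python
import numpy as np

def hebbian_step(w, pre_rates, post_rate, eta=0.01):
    """One rate-based Hebbian update: dw_i = eta * pre_i * post,
    then subtract the mean change so the summed weight is conserved
    (subtractive normalization)."""
    dw = eta * pre_rates * post_rate
    dw -= dw.mean()                 # sum(dw) == 0 after this line
    return w + dw

w = np.array([0.5, 0.5, 0.5, 0.5])
pre = np.array([1.0, 2.0, 3.0, 4.0])
w_new = hebbian_step(w, pre, post_rate=1.0)
print(w_new.sum())  # total synaptic weight is conserved (sums to 2.0)
```

Because the mean change is subtracted, strongly driven synapses still grow at the expense of weakly driven ones, but the total input weight onto the neuron stays fixed, which is what stabilizes the postsynaptic rate.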
Generalization in Interactive Networks: The Benefits of Inhibitory Competition and Hebbian Learning
 Neural Computation
, 2001
Abstract

Cited by 45 (6 self)
Computational models in cognitive neuroscience should ideally use biological properties and powerful computational principles to produce behavior consistent with psychological findings. Error-driven backpropagation is computationally powerful and has proven useful for modeling a range of psychological data, but is not biologically plausible. Several approaches to implementing backpropagation in a biologically plausible fashion converge on the idea of using bidirectional activation propagation in interactive networks to convey error signals. This paper demonstrates two main points about these error-driven interactive networks: (a) they generalize poorly due to attractor dynamics that interfere with the network's ability to systematically produce novel combinatorial representations in response to novel inputs; and (b) this generalization problem can be remedied by adding two widely used mechanistic principles, inhibitory competition and Hebbian learning, that can be independent...
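Inhibitory competition, one of the two remedies this abstract names, is often implemented as k-winners-take-all (kWTA). A minimal sketch, with illustrative activations and k, not the paper's implementation:

```python
import numpy as np

def kwta(acts, k):
    """k-winners-take-all: keep the k largest activations, zero the rest.
    A hard stand-in for the inhibitory competition among units."""
    out = np.zeros_like(acts)
    winners = np.argsort(acts)[-k:]   # indices of the k largest values
    out[winners] = acts[winners]
    return out

acts = np.array([0.1, 0.9, 0.4, 0.7, 0.2])
print(kwta(acts, 2))  # only the 0.9 and 0.7 units stay active
```

Forcing only a few units to be active at a time encourages sparse, separable representations, which is the intuition behind the generalization benefit claimed above.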
Intrinsic Stabilization of Output Rates by Spike-Based Hebbian Learning
 Neural Computation
, 2001
Abstract

Cited by 40 (12 self)
We study analytically a model of long-term synaptic plasticity where synaptic changes are triggered by presynaptic spikes, postsynaptic spikes, and the time differences between presynaptic and postsynaptic spikes. The changes due to correlated input and output spikes are quantified by means of a learning window. We show that plasticity can lead to an intrinsic stabilization of the mean firing rate of the postsynaptic neuron. Subtractive normalization of the synaptic weights (summed over all presynaptic inputs converging on a postsynaptic neuron) follows if, in addition, the mean input rates and the mean input correlations are identical at all synapses. If the integral over the learning window is positive, firing-rate stabilization requires a non-Hebbian component, whereas such a component is not needed if the integral of the learning window is negative. A negative integral corresponds to 'anti-Hebbian' learning in a model with slowly varying firing rates. For spike-based learning, a strict distinction between Hebbian and 'anti-Hebbian' rules is questionable since learning is driven by correlations on the time scale of the learning window. The correlations between presynaptic and postsynaptic firing are evaluated for a piecewise-linear Poisson model and for a noisy spiking neuron model with refractoriness. While a negative integral over the learning window leads to intrinsic rate stabilization, the positive part of the learning window picks up spatial and temporal correlations in the input.
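The sign of the learning-window integral, which this abstract makes the key quantity, is easy to check numerically for the standard two-exponential window W(s), s = t_post - t_pre. The amplitudes and time constants below are illustrative, not the paper's:

```python
import numpy as np

def stdp_window(s, A_plus=1.0, A_minus=1.05, tau_plus=20.0, tau_minus=20.0):
    """Two-exponential learning window: potentiation when the presynaptic
    spike precedes the postsynaptic one (s >= 0), depression otherwise."""
    return np.where(s >= 0,
                    A_plus * np.exp(-s / tau_plus),
                    -A_minus * np.exp(s / tau_minus))

# Integrate W over a window much wider than the time constants (in ms).
s = np.linspace(-200.0, 200.0, 100001)
integral = np.trapz(stdp_window(s), s)
# Analytically: A_plus*tau_plus - A_minus*tau_minus = 20 - 21 = -1
print(integral < 0)  # negative integral -> intrinsic rate stabilization
```

With these parameters the depression lobe slightly outweighs the potentiation lobe, so the integral is negative and, per the result above, firing rates stabilize without any extra non-Hebbian term.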
Factor analysis using delta-rule wake-sleep learning
 Neural Computation
, 1997
Abstract

Cited by 24 (3 self)
We describe a linear network that models correlations between real-valued visible variables using one or more real-valued hidden variables, a factor analysis model. This model can be seen as a linear version of the “Helmholtz machine”, and its parameters can be learned using the “wake-sleep” method, in which learning of the primary “generative” model is assisted by a “recognition” model, whose role is to fill in the values of hidden variables based on the values of visible variables. The generative and recognition models are jointly learned in “wake” and “sleep” phases, using just the delta rule. This learning procedure is comparable in simplicity to Oja’s version of Hebbian learning, which produces a somewhat different representation of correlations in terms of principal components. We argue that the simplicity of wake-sleep learning makes factor analysis a plausible alternative to Hebbian learning as a model of activity-dependent cortical plasticity.
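Oja's rule, the Hebbian comparison point named in this abstract, is the piece that is simple enough to sketch here: a single linear unit whose weight vector drifts toward the first principal component while staying near unit length. The data and learning rate are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
# 2-D inputs with much larger variance along the first axis,
# so the first principal component is (approximately) that axis.
X = rng.normal(size=(5000, 2)) * np.array([3.0, 1.0])

w = rng.normal(size=2)
eta = 0.005
for x in X:
    y = w @ x
    # Oja's rule: Hebbian term y*x minus a decay y^2*w that keeps |w| near 1.
    w += eta * y * (x - y * w)

print(f"|w| = {np.linalg.norm(w):.2f}")  # close to 1; w aligned with axis 0
```

The decay term is what distinguishes Oja's rule from plain Hebbian learning: it replaces explicit normalization, so the weights converge to a unit vector along the dominant direction of input correlation, i.e. a principal-component representation rather than the factor-analysis one the paper advocates.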
Receptive Fields and Maps in the Visual Cortex: Models of Ocular Dominance and Orientation Columns
, 1996
Abstract

Cited by 23 (2 self)
The formation of ocular dominance and orientation columns in the mammalian visual cortex is briefly reviewed. Correlation-based models for their development are then discussed, beginning with the models of von der Malsburg. For the case of semilinear models, model behavior is well understood: correlations determine receptive field structure, intracortical interactions determine projective field structure, and the "knitting together" of the two determines the cortical map. This provides a basis for simple but powerful models of ocular dominance and orientation column formation: ocular dominance columns form through a correlation-based competition between left-eye and right-eye inputs, while orientation columns can form through a competition between ON-center and OFF-center inputs. These models account well for receptive field structure, but are not completely adequate to account for the details of cortical map structure. Alternative approaches to map structure, including the...
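The correlation-based competition between left- and right-eye inputs can be reduced to a two-weight caricature: when within-eye correlation exceeds between-eye correlation and the total weight is constrained, any small imbalance grows until one eye dominates. The correlation matrix, learning rate, and clipping bounds below are illustrative:

```python
import numpy as np

# w[0], w[1]: synaptic weights from the left and right eye onto one cell.
C = np.array([[1.0, 0.2],
              [0.2, 1.0]])            # within-eye > between-eye correlation
w = np.array([0.51, 0.49])            # tiny initial bias toward the left eye
eta = 0.05
for _ in range(200):
    dw = eta * C @ w                  # correlation-driven Hebbian growth
    dw -= dw.mean()                   # subtractive constraint on total weight
    w = np.clip(w + dw, 0.0, 1.0)     # weights stay in [0, 1]
print(w)  # the initially favored eye takes over: close to [1, 0]
```

The subtractive constraint turns joint growth into competition, and the clipped fixed point with one eye at the upper bound and the other at zero is the single-cell analogue of an ocular dominance column.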
An Egalitarian Network Model for the Emergence of Simple and Complex Cells in Visual Cortex
 Proc Natl Acad Sci USA 101: 366–371
, 2004
Abstract

Cited by 20 (4 self)
We explain how Simple and Complex cells arise in a large-scale neuronal network model of the primary visual cortex of the macaque. Our model consists of over 16,000 integrate-and-fire, conductance-based point neurons, representing the cells in a small, 1 mm² patch of an input layer of the primary visual cortex. In the model the local connections are isotropic and nonspecific, and convergent input from the lateral geniculate nucleus confers cortical cells with orientation and spatial phase preference. The balance between lateral connections and LGN drive determines whether individual neurons in this recurrent circuit are Simple or Complex. The model reproduces qualitatively the experimentally observed distributions of both extracellular and intracellular measures of Simple and Complex response.
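The building block of such a network, a conductance-based leaky integrate-and-fire point neuron, can be sketched in a few lines. The membrane parameters below are generic textbook values, not the ones used in this model:

```python
def lif_run(g_exc, dt=0.1, T=200.0, V_rest=-70.0, V_th=-54.0,
            V_reset=-60.0, E_exc=0.0, tau_m=20.0):
    """Simulate a leaky integrate-and-fire neuron for T ms with a constant
    excitatory conductance g_exc (relative to leak); return the spike count.
    Membrane: tau_m * dV/dt = (V_rest - V) + g_exc * (E_exc - V)."""
    V = V_rest
    spikes = 0
    for _ in range(int(T / dt)):
        dV = ((V_rest - V) + g_exc * (E_exc - V)) / tau_m
        V += dt * dV                  # forward-Euler integration step
        if V >= V_th:                 # threshold crossing -> spike and reset
            spikes += 1
            V = V_reset
    return spikes

print(lif_run(0.0), lif_run(0.5))  # no drive -> silent; strong drive -> spikes
```

With g_exc = 0 the membrane sits at rest below threshold; with g_exc = 0.5 the effective steady-state voltage is pulled above threshold, so the neuron fires repeatedly. In the full model it is the recurrent and LGN conductances arriving at each such neuron that set it on the Simple or Complex side.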
The Joint Development of Orientation and Ocular Dominance: Role of Constraints
, 1997
Neuronal Regulation: A Mechanism For Synaptic Pruning During Brain Maturation
 Neural Computation
, 1998
Abstract

Cited by 14 (4 self)
Human and animal studies show that mammalian brains undergo massive synaptic pruning during childhood, removing about half of the synapses by puberty. We have previously shown that maintaining network performance while synapses are deleted requires that synapses be properly modified and pruned, removing the weaker synapses. We now show that neuronal regulation, a mechanism recently observed to maintain the average neuronal input field of a postsynaptic neuron, results in a weight-dependent synaptic modification. Under the correct range of the degradation dimension and synaptic upper bound, neuronal regulation removes the weaker synapses and judiciously modifies the remaining synapses. By deriving optimal synaptic modification functions in an excitatory-inhibitory network we prove that neuronal regulation implements near-optimal synaptic modification, and maintains the performance of a network undergoing massive synaptic pruning. These findings support the possibility that ...
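The combination this abstract describes, pruning the weaker synapses while regulation preserves the neuron's total input field, can be caricatured in a few lines. The pruning threshold (the median) and the multiplicative rescaling step are illustrative choices, not the paper's derived modification functions:

```python
import numpy as np

rng = np.random.default_rng(1)
w = rng.exponential(scale=1.0, size=1000)   # synaptic efficacies onto one neuron
target_field = w.sum()                       # input field to be maintained

w[w < np.quantile(w, 0.5)] = 0.0             # prune the weaker half of synapses
w *= target_field / w.sum()                  # regulation: rescale survivors

print((w > 0).sum(), np.isclose(w.sum(), target_field))
```

Half the synapses are removed, yet the summed efficacy is restored by upscaling the survivors, which is the sense in which regulation can maintain performance through massive pruning.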
Development of spatiotemporal receptive fields of simple cells: II. Simulation and analysis
, 1997
Abstract

Cited by 11 (0 self)
In part I of this article a correlation-based model for the developmental process of spatiotemporal receptive fields was introduced. In this model the development is described as an activity-dependent competition between four types of input from the lateral geniculate nucleus onto a cortical cell, viz. non-lagged ON and OFF and lagged ON and OFF inputs. In the present paper simulation results and a first analysis are presented for this model. We study the developmental process both before and after eye-opening and compare the results with experimental data from reverse correlation measurements. The outcome of the developmental process is determined mainly by the spatial and the temporal correlations between the different inputs. In particular, if the mean correlation between non-lagged and lagged inputs is weak, receptive fields with a widely varying degree of direction selectivity emerge. However, spatiotemporal receptive fields may show rotation of their preferred orientation ...