## The Role of Constraints in Hebbian Learning (1994)

Venue: Neural Computation

Citations: 62 (4 self)

### BibTeX

```bibtex
@article{Miller94therole,
  author  = {Kenneth D. Miller and David J. C. MacKay},
  title   = {The Role of Constraints in Hebbian Learning},
  journal = {NEURAL COMPUTATION},
  year    = {1994},
  volume  = {6},
  pages   = {100--126}
}
```

### Abstract

Models of unsupervised correlation-based (Hebbian) synaptic plasticity are typically unstable: either all synapses grow until each reaches the maximum allowed strength, or all synapses decay to zero strength. A common method of avoiding these outcomes is to use a constraint that conserves or limits the total synaptic strength over a cell. We study the dynamical effects of such constraints. Two methods of enforcing a constraint are distinguished, multiplicative and subtractive. For otherwise linear learning rules, multiplicative enforcement of a constraint results in dynamics that converge to the principal eigenvector of the operator determining unconstrained synaptic development. Subtractive enforcement, in contrast, typically leads to a final state in which almost all synaptic strengths reach either the maximum or minimum allowed value. This final state is often dominated by weight configurations other than the principal eigenvector of the unconstrained operator. Multiplica...
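The contrast the abstract draws can be illustrated with a toy simulation. The sketch below is not the paper's model; it is a minimal two-synapse example assuming a linear rule dw/dt = Cw with a hypothetical symmetric correlation operator C, multiplicative enforcement implemented as rescaling to conserve total strength, and subtractive enforcement as removing an equal amount from every synapse followed by clipping to the allowed range:

```python
import numpy as np

# Hypothetical correlation operator driving unconstrained growth dw/dt = C w.
# Its principal eigenvector is (1, 1)/sqrt(2) (eigenvalue 3);
# the other eigenvector is (1, -1)/sqrt(2) (eigenvalue 1).
C = np.array([[2.0, 1.0],
              [1.0, 2.0]])

dt, steps, w_max = 0.01, 4000, 1.0

def evolve(w0, mode):
    """Integrate dw/dt = C w while enforcing conservation of sum(w)."""
    w = w0.copy()
    total = w.sum()  # total synaptic strength to be conserved
    for _ in range(steps):
        w = w + dt * (C @ w)               # unconstrained Hebbian step
        if mode == "multiplicative":
            w *= total / w.sum()           # rescale all weights together
        else:                              # subtractive
            w -= (w.sum() - total) / len(w)  # subtract equally from each
            w = np.clip(w, 0.0, w_max)       # keep weights within bounds
    return w

w0 = np.array([0.3, 0.1])

# Multiplicative: converges to the principal eigenvector direction (1, 1),
# scaled to the conserved total, i.e. approximately (0.2, 0.2).
print(evolve(w0, "multiplicative"))

# Subtractive: the (1, 1) growth component is cancelled by the equal
# subtraction, so the (1, -1) component wins and the weights saturate,
# here approximately (0.4, 0.0).
print(evolve(w0, "subtractive"))
```

The subtractive outcome shows the abstract's point directly: the equal per-synapse subtraction removes exactly the component of growth along (1, 1), so the final state is dominated by a difference-mode configuration rather than the principal eigenvector, with weights pinned at the bounds.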