## Biologically Plausible Error-driven Learning using Local Activation Differences: The Generalized Recirculation Algorithm (1996)

Venue: NEURAL COMPUTATION

Citations: 95 (11 self)

### BibTeX

```bibtex
@ARTICLE{OReilly96biologicallyplausible,
  author  = {Randall C. O'Reilly},
  title   = {Biologically Plausible Error-driven Learning using Local Activation Differences: The Generalized Recirculation Algorithm},
  journal = {Neural Computation},
  year    = {1996},
  volume  = {8},
  number  = {5},
  pages   = {895--938}
}
```

### Abstract

The error backpropagation learning algorithm (BP) is generally considered biologically implausible because it does not use locally available, activation-based variables. A version of BP that can be computed locally using bi-directional activation recirculation (Hinton & McClelland, 1988) instead of backpropagated error derivatives is more biologically plausible. This paper presents a generalized version of the recirculation algorithm (GeneRec), which overcomes several limitations of the earlier algorithm by using a generic recurrent network with sigmoidal units that can learn arbitrary input/output mappings. However, the contrastive Hebbian learning algorithm (CHL, a.k.a. DBM or mean field learning) also uses local variables to perform error-driven learning in a sigmoidal recurrent network. CHL was derived in a stochastic framework (the Boltzmann machine), but has been extended to the deterministic case in various ways, all of which rely on problematic approximations and assumptions, le...
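As a rough sketch of the idea the abstract describes — error-driven learning computed from local activation differences between a "minus" (expectation) phase and a "plus" (outcome) phase — the GeneRec-style weight update can be written as follows. The function and variable names are illustrative, not taken from the paper, and this is a minimal single-layer sketch rather than the full recurrent-network procedure:

```python
import numpy as np

def generec_update(x_minus, y_minus, y_plus, lr=0.1):
    """Sketch of a GeneRec-style update: delta_w[i, j] = lr * x_i^- * (y_j^+ - y_j^-).

    x_minus : sending-unit activations settled in the minus (expectation) phase
    y_minus : receiving-unit activations in the minus phase
    y_plus  : receiving-unit activations in the plus (outcome) phase, i.e. with
              the target clamped on the outputs
    Only locally available activations are used -- no backpropagated derivatives.
    """
    x_minus = np.asarray(x_minus, dtype=float)
    diff = np.asarray(y_plus, dtype=float) - np.asarray(y_minus, dtype=float)
    return lr * np.outer(x_minus, diff)

# Example: one sender active, output should rise from 0.5 toward 1.0
dw = generec_update(x_minus=[1.0, 0.0], y_minus=[0.5], y_plus=[1.0])
# dw is [[0.05], [0.0]]: only the active sender's weight moves
```

Under a symmetry (midpoint) approximation, this update is closely related to the CHL rule `lr * (x_i^+ y_j^+ - x_i^- y_j^-)` mentioned in the abstract, which is the equivalence the paper develops.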