## A General Feed-Forward Algorithm for Gradient Descent in Connectionist Networks (1990)

Citations: 6 (4 self)

### BibTeX

```bibtex
@MISC{Thrun90ageneral,
  author = {Sebastian Thrun and Frank Smieja},
  title  = {A General Feed-Forward Algorithm for Gradient Descent in Connectionist Networks},
  year   = {1990}
}
```

### Abstract

An extended feed-forward algorithm for recurrent connectionist networks is presented. This algorithm, which works locally in time, is derived both for discrete-in-time networks and for continuous networks. Several standard gradient descent algorithms for connectionist networks (e.g. [48], [30], [28], [15], [34]), especially the backpropagation algorithm [36], are mathematically derived as special cases of this general algorithm. The learning algorithm presented in this paper is a superset of gradient descent learning algorithms for multilayer networks, recurrent networks, and time-delay networks that allows arbitrary combinations of their components. In addition, the paper presents feed-forward approximation procedures for initial activations and external input values. The former is used to optimize the starting values of the so-called context nodes; the latter proved very useful for finding spurious input attractors of a trained connectionist network. Finally, we compare tim...
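The "works locally in time" property can be illustrated with a minimal forward-mode gradient computation for a single-unit recurrent network: the weight sensitivity is propagated forward alongside the activation, so no backward pass over the sequence is needed. This is a hedged sketch under assumed choices (a one-unit `tanh` network, squared-error loss at the final step), not the paper's actual formulation.

```python
import numpy as np

def forward_grad(w, xs, y, h0=0.0):
    """Forward-in-time gradient for a 1-unit recurrent net (illustrative).

    State update: h_t = tanh(w * h_{t-1} + x_t).
    The sensitivity s_t = dh_t/dw is carried forward with the state, so the
    gradient is available as soon as the final step is reached ("local in
    time" -- no unrolling backward through the sequence).
    Loss: L = 0.5 * (h_T - y)^2, so dL/dw = (h_T - y) * s_T.
    """
    h, s = h0, 0.0
    for x in xs:
        h_new = np.tanh(w * h + x)
        # chain rule: da/dw = h_{t-1} + w * dh_{t-1}/dw, scaled by tanh'
        s = (1.0 - h_new ** 2) * (h + w * s)
        h = h_new
    loss = 0.5 * (h - y) ** 2
    grad = (h - y) * s
    return loss, grad
```

A quick sanity check is to compare the returned gradient against a central finite difference of the loss in `w`; the two should agree to several decimal places.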