## Backpropagation-Decorrelation: online recurrent learning with O(N) complexity

Citations: 31 (3 self)

### BibTeX

```bibtex
@MISC{Steil_backpropagation-decorrelation:online,
  author = {Jochen J. Steil},
  title  = {Backpropagation-Decorrelation: online recurrent learning with O(N) complexity},
  year   = {}
}
```

### Abstract

We introduce a new learning rule for fully recurrent neural networks, which we call the Backpropagation-Decorrelation (BPDC) rule. It combines two important principles: one-step backpropagation of errors and the use of the temporal memory in the network dynamics by means of decorrelation of activations. The BPDC rule is derived and theoretically justified by regarding learning as a constrained optimization problem, and it applies uniformly in discrete and continuous time. It is very easy to implement and has a minimal complexity of 2N multiplications per time step in the single-output case. Nevertheless, we obtain fast tracking and excellent performance on benchmark problems including the Mackey-Glass time series.
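To illustrate the flavor of such an O(N) online rule, the following is a minimal, hypothetical sketch: a fixed random recurrent network whose single output weight vector is adapted online with a normalized, error-driven update costing roughly 2N multiplications per step (N for the output, N for the weight update). All names, constants, and the toy tracking task are illustrative assumptions, not the paper's exact derivation or benchmark.

```python
import math
import random

N = 20                      # number of recurrent units (illustrative choice)
random.seed(0)

# Fixed random recurrent weights, scaled small for stable dynamics.
W = [[random.uniform(-0.1, 0.1) for _ in range(N)] for _ in range(N)]
w_out = [0.0] * N           # trainable output weights (single-output case)
x = [0.0] * N               # network state

def step(u, target, eta=0.5, eps=1e-2):
    """One simulation + learning step; returns the network output."""
    global x
    # Recurrent state update; the scalar input u is fed to every unit.
    x = [math.tanh(sum(W[i][j] * x[j] for j in range(N)) + u)
         for i in range(N)]
    y = sum(wi * xi for wi, xi in zip(w_out, x))   # output: N multiplications
    err = target - y
    norm = sum(xi * xi for xi in x) + eps          # normalization by activation power
    g = eta * err / norm
    for i in range(N):                             # weight update: N multiplications
        w_out[i] += g * x[i]
    return y

# Online tracking of a simple sine wave (toy stand-in for a benchmark task).
errs = []
for k in range(2000):
    u = math.sin(0.3 * k)
    y = step(u, target=u)
    errs.append(abs(u - y))

early = sum(errs[:100]) / 100
late = sum(errs[-100:]) / 100
print(early, late)   # tracking error shrinks as the output weights adapt
```

The per-step cost is dominated here by the fixed recurrent pass; only the learning update itself is O(N), which is the property the abstract emphasizes.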