## Recurrent Neural Networks With Small Weights Implement Definite Memory Machines (2003)

Venue: Neural Computation

Citations: 21 (5 self)

### BibTeX

@ARTICLE{Hammer03recurrentneural,
  author  = {Barbara Hammer and Peter Tino},
  title   = {Recurrent Neural Networks With Small Weights Implement Definite Memory Machines},
  journal = {Neural Computation},
  year    = {2003},
  volume  = {15},
  pages   = {1897--1929}
}

### Abstract

Recent experimental studies indicate that recurrent neural networks initialized with "small" weights are inherently biased towards definite memory machines (Tino, Cernansky, & Benuskova, 2002a, 2002b). This paper establishes a theoretical counterpart: the transition function of a recurrent network with small weights and a "squashing" activation function is a contraction. We prove that recurrent networks with a contractive transition function can be approximated arbitrarily well on input sequences of unbounded length by a definite memory machine.
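The contraction argument can be illustrated numerically. The sketch below (not the paper's construction; all names and the 0.5 scaling factor are illustrative assumptions) builds a tanh RNN whose recurrent weight matrix has spectral norm below 1; since tanh is 1-Lipschitz, the state transition is a contraction, so two different initial states driven by the same input sequence converge geometrically — the network forgets its far past, which is exactly the definite-memory behaviour the abstract describes.

```python
import numpy as np

# Illustrative sketch: small recurrent weights => contractive transition.
rng = np.random.default_rng(0)
n_hidden, n_input = 8, 3

W = rng.normal(size=(n_hidden, n_hidden))
W *= 0.5 / np.linalg.norm(W, 2)        # rescale so ||W||_2 = 0.5 < 1
V = rng.normal(size=(n_hidden, n_input))

def step(h, x):
    # tanh is 1-Lipschitz, so for a fixed input x:
    #   |step(h1, x) - step(h2, x)| <= ||W||_2 * |h1 - h2|
    return np.tanh(W @ h + V @ x)

h1 = rng.normal(size=n_hidden)
h2 = rng.normal(size=n_hidden)
d0 = np.linalg.norm(h1 - h2)           # initial state distance
for _ in range(20):
    x = rng.normal(size=n_input)       # identical inputs for both runs
    h1, h2 = step(h1, x), step(h2, x)
d = np.linalg.norm(h1 - h2)            # contracts by >= factor 0.5 per step
print(d0, d)
```

After 20 steps the distance is bounded by 0.5^20 · d0, so the current state is determined (up to a vanishing error) by only the recent input history.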