## Globally Convergent Algorithms With Local Learning Rates

Venue: IEEE Tr. Neural Networks

Citations: 5 (3 self)

### BibTeX

```
@ARTICLE{Magoulas_globallyconvergent,
  author  = {George D. Magoulas and Vassilis P. Plagianakos and Michael N. Vrahatis},
  title   = {Globally Convergent Algorithms With Local Learning Rates},
  journal = {IEEE Tr. Neural Networks},
  year    = {},
  volume  = {13},
  pages   = {774--779}
}
```


### Abstract

In this paper, a new generalized theoretical result is presented that underpins the development of globally convergent first-order batch training algorithms which employ local learning rates. This result allows algorithms of this class to be equipped with a strategy that adapts the overall search direction into a descent direction. In this way, a decrease of the batch-error measure is ensured at each training iteration, and the sequence of weight iterates converges to a local minimizer of the batch error function even from remote initial weights. The effectiveness of the theoretical result is illustrated in three application examples by comparing two well-known training algorithms with local learning rates to their globally convergent modifications.
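The core idea of the abstract can be sketched in code: batch gradient descent where each weight has its own learning rate, with a safeguard that replaces the combined step by a descent direction whenever the locally-scaled step fails to be one. This is a minimal illustrative sketch, not the paper's algorithm; the fallback rule (reverting to the plain negative gradient) and the toy quadratic objective are assumptions made for demonstration.

```python
import numpy as np

def train_local_rates(grad_fn, w, etas, iters=100):
    """Batch training with one local learning rate per weight.

    A descent safeguard is applied each iteration: if the locally
    scaled step is not a descent direction, fall back to -grad
    (an illustrative strategy, not the paper's exact condition).
    """
    for _ in range(iters):
        g = grad_fn(w)
        d = -etas * g                  # direction built from local rates
        if np.dot(d, g) >= 0:          # not a descent direction
            d = -g                     # safeguard: guaranteed descent step
        w = w + d
    return w

# Toy batch-error: E(w) = 0.5 * ||w||^2, so grad E(w) = w.
loss = lambda w: 0.5 * np.dot(w, w)
grad = lambda w: w

w0 = np.array([3.0, -2.0])
etas = np.array([0.1, 0.5])            # hypothetical per-weight rates
w_final = train_local_rates(grad, w0, etas)
```

On this quadratic the locally scaled step is always a descent direction, so the error decreases monotonically and the iterates approach the minimizer at the origin, mirroring the convergence behavior the abstract describes.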