## Statistical Ideas for Selecting Network Architectures (1995)

Venue: Invited Presentation, Neural Information Processing Systems 8

Citations: 18 (3 self)

### BibTeX

```bibtex
@INPROCEEDINGS{Ripley95statisticalideas,
  author    = {B. D. Ripley},
  title     = {Statistical Ideas for Selecting Network Architectures},
  booktitle = {Invited Presentation, Neural Information Processing Systems 8},
  year      = {1995},
  pages     = {183--190},
  publisher = {Springer}
}
```

### Abstract

Choosing the architecture of a neural network is one of the most important problems in making neural networks practically useful, but accounts of applications usually sweep these details under the carpet. How many hidden units are needed? Should weight decay be used, and if so how much? What type of output units should be chosen? And so on. We address these issues within the framework of statistical theory for model choice, which provides a number of workable approximate answers. This paper is principally concerned with architecture selection issues for feed-forward neural networks (also known as multi-layer perceptrons). Many of the same issues arise in selecting radial basis function networks, recurrent networks and more widely. These problems occur in a much wider context within statistics, and applied statisticians have been selecting and combining models for decades. Two recent discussions are [4, 5]. References [3, 20, 21, 22] discuss neural networks from a statistical perspecti...
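The abstract's questions — how many hidden units, how much weight decay — are instances of statistical model choice, and cross-validation is one of the workable approximate answers that framework provides. The sketch below is illustrative, not from the paper: it uses closed-form ridge regression on polynomial features as a cheap stand-in for a small network, with polynomial degree playing the role of hidden-unit count and the ridge penalty playing the role of weight decay. The data, grid values, and function names are all assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression task: noisy sine data, the kind of problem a small
# feed-forward network might be asked to fit.
x = rng.uniform(-1, 1, 80)
y = np.sin(3 * x) + 0.1 * rng.standard_normal(80)

def design(x, degree):
    # Polynomial features stand in for model complexity (hidden-unit count).
    return np.vander(x, degree + 1, increasing=True)

def fit_ridge(X, y, decay):
    # Closed-form penalized least squares; 'decay' plays the role of
    # weight decay in network training.
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + decay * np.eye(d), X.T @ y)

def cv_error(x, y, degree, decay, folds=5):
    # k-fold cross-validation: estimate out-of-sample error for one
    # (complexity, decay) setting by averaging held-out squared error.
    idx = np.arange(len(x))
    errs = []
    for k in range(folds):
        test = idx % folds == k
        Xtr, Xte = design(x[~test], degree), design(x[test], degree)
        w = fit_ridge(Xtr, y[~test], decay)
        errs.append(np.mean((Xte @ w - y[test]) ** 2))
    return float(np.mean(errs))

# Select the architecture: grid over complexity and regularization,
# keeping the setting with the smallest cross-validated error.
grid = [(d, lam) for d in (1, 3, 5, 9) for lam in (0.0, 1e-3, 1e-1)]
best = min(grid, key=lambda p: cv_error(x, y, *p))
print("selected (degree, decay):", best)
```

The same loop applies directly to networks: replace `fit_ridge` with network training at a given hidden-unit count and weight-decay value, at correspondingly higher computational cost per grid point.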