## Discovering Neural Nets With Low Kolmogorov Complexity And High Generalization Capability (1997)

Venue: Neural Networks

Citations: 49 (30 self-citations)

### BibTeX

```bibtex
@ARTICLE{Schmidhuber97discoveringneural,
  author  = {Jürgen Schmidhuber},
  title   = {Discovering Neural Nets With Low Kolmogorov Complexity And High Generalization Capability},
  journal = {Neural Networks},
  year    = {1997},
  volume  = {10},
  number  = {5},
  pages   = {857--873}
}
```


### Abstract

Many neural net learning algorithms aim at finding "simple" nets to explain training data. The expectation is: the "simpler" the networks, the better the generalization on test data (Occam's razor). Previous implementations, however, use measures of "simplicity" that lack the power, universality and elegance of those based on Kolmogorov complexity and Solomonoff's algorithmic probability. Likewise, most previous approaches (especially those of the "Bayesian" kind) suffer from the problem of choosing appropriate priors. This paper addresses both issues. It first reviews some basic concepts of algorithmic complexity theory relevant to machine learning, and how the Solomonoff-Levin distribution (or universal prior) deals with the prior problem. The universal prior leads to a probabilistic method for finding "algorithmically simple" problem solutions with high generalization capability. The method is based on Levin complexity (a time-bounded generalization of Kolmogorov complexity) ...
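The search procedure the abstract alludes to is Levin search, which trades off program length against runtime: in phase i, every candidate program p with length l(p) ≤ i is run for 2^(i − l(p)) steps, so the first solution found has near-minimal Levin complexity Kt ≈ l(p) + log₂(runtime). The sketch below is a toy illustration of that scheduling idea, not the paper's actual implementation; the bitstring interpreter `toy_run` is a hypothetical stand-in for a universal machine.

```python
from itertools import product

def levin_search(run, accept, max_phase=20):
    """Toy Levin search. In phase i, every bitstring program p with
    len(p) <= i is executed for 2**(i - len(p)) steps, so short programs
    get exponentially more time than long ones. `run(p, steps)` executes
    p for at most `steps` steps, returning its output or None if
    unfinished; `accept(out)` tests the output. Returns the first
    accepted program."""
    for i in range(1, max_phase + 1):
        for length in range(1, i + 1):
            budget = 2 ** (i - length)  # time allotment 2^(i - l(p))
            for bits in product('01', repeat=length):
                p = ''.join(bits)
                out = run(p, budget)
                if out is not None and accept(out):
                    return p
    return None

# Hypothetical toy interpreter: each executed bit is copied to the
# output tape, one bit per step; the program halts after its last bit.
def toy_run(p, steps):
    return p if steps >= len(p) else None

# Find a program printing '101' under this interpreter.
print(levin_search(toy_run, lambda out: out == '101'))  # -> '101'
```

Because shorter programs receive larger time budgets, the search is biased toward algorithmically simple solutions, which is exactly the bias the paper exploits to obtain networks with high generalization capability.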