## Subspace information criterion for model selection (2001)

### Download Links

- [sugiyama-www.cs.titech.ac.jp]
- [ogawa-www.cs.titech.ac.jp]
- [ftp.cs.titech.ac.jp]
- DBLP

### Other Repositories/Bibliography

Venue: Neural Computation

Citations: 43 (28 self)

### BibTeX

```bibtex
@article{Sugiyama01subspaceinformation,
  author  = {Masashi Sugiyama and Hidemitsu Ogawa},
  title   = {Subspace information criterion for model selection},
  journal = {Neural Computation},
  year    = {2001},
  volume  = {13},
  pages   = {2001}
}
```

### Abstract

The problem of model selection is of considerable importance for achieving high generalization capability in supervised learning. In this paper, we propose a new criterion for model selection called the subspace information criterion (SIC), which is a generalization of Mallows' C_L. It is assumed that the learning target function belongs to a specified functional Hilbert space, and the generalization error is defined as the Hilbert-space squared norm of the difference between the learning result function and the target function. SIC gives an unbiased estimate of the generalization error so defined. SIC assumes the availability of an unbiased estimate of the target function and of the noise covariance matrix, which are generally unknown. A practical method for calculating SIC for least-mean-squares learning is provided under the assumption that the dimension of the Hilbert space is less than the number of training examples. Finally, computer simulations on two examples show that SIC works well even when the number of training examples is small.
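To make the idea in the abstract concrete, the following is a minimal sketch (not the authors' implementation) of SIC-style model selection for least-squares regression. It assumes the noise covariance is known (here `sigma**2 * I`) and, as the required unbiased estimate of the target, uses the least-squares solution in a full model containing all candidate subspaces. For a candidate learning matrix `A` and the unbiased learning matrix `A_u`, SIC takes the form `||(A - A_u) y||^2 - tr((A - A_u) Q (A - A_u)^T) + tr(A Q A^T)`; the candidate models below are nested polynomial subspaces, and the model minimizing SIC is selected. All names and the specific setup are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Training data: noisy samples of a cubic target function
n = 50
x = np.linspace(-1, 1, n)
sigma = 0.2                                  # noise level, assumed known
y = 1 - 2 * x + 0.5 * x**3 + sigma * rng.standard_normal(n)

p_full = 8                                   # dimension of the full model
X = np.vander(x, p_full, increasing=True)    # n x p_full polynomial design matrix

Q = sigma**2 * np.eye(n)                     # assumed-known noise covariance
A_u = np.linalg.pinv(X)                      # unbiased estimator of the parameters
                                             # (p_full x n learning matrix)

def sic(k):
    """SIC score for the subspace model spanned by the first k basis functions."""
    A_k = np.zeros((p_full, n))
    A_k[:k, :] = np.linalg.pinv(X[:, :k])    # embed the submodel's learning
                                             # matrix into the full space
    D = A_k - A_u
    return (np.linalg.norm(D @ y) ** 2       # squared distance to unbiased estimate
            - np.trace(D @ Q @ D.T)          # remove its noise-induced bias
            + np.trace(A_k @ Q @ A_k.T))     # variance term of the submodel

scores = {k: sic(k) for k in range(1, p_full + 1)}
best_k = min(scores, key=scores.get)         # model minimizing the SIC estimate
print(best_k, {k: round(v, 3) for k, v in scores.items()})
```

The score is an unbiased estimate of the (parameter-space) generalization error of each submodel: its expectation equals the squared bias of the submodel plus its noise variance, which is why minimizing it over the nested family is a reasonable selection rule under these assumptions.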