Minimum Description Length Induction, Bayesianism, and Kolmogorov Complexity (1998)

by Paul Vitányi, Ming Li
Venue: IEEE Transactions on Information Theory
Citations: 67 (7 self)

Active Bibliography

Simplicity, Information, Kolmogorov Complexity, and Prediction – Paul Vitányi, Ming Li - 1998
On Prediction by Data Compression – Paul Vitányi, Ming Li - 1997 (cited by 10)
A tutorial introduction to the minimum description length principle – Peter Grünwald (cited by 58)
Applying MDL to Learning Best Model Granularity – Qiong Gao, Ming Li, Paul Vitányi - 1994 (cited by 20)
Introducing the Minimum Description Length Principle – Peter Grünwald (cited by 1)
Open problems in universal induction & intelligence – Marcus Hutter - 2009 (cited by 5)
Schwarz, Wallace, and Rissanen: Intertwining Themes in Theories of Model Selection – Aaron D. Lanterman - 2000 (cited by 1)
Predictability, Complexity, and Learning – William Bialek, Ilya Nemenman, Naftali Tishby - 2001 (cited by 30)
Luckiness and Regret in Minimum Description Length Inference – Steven De Rooij, Peter D. Grünwald - 2009 (cited by 1)
Minimum Message Length and Kolmogorov Complexity – C. S. Wallace, D. L. Dowe - 1999 (cited by 104)
Master Thesis – Lanterman - 91
Information theory and learning: a physical approach – Ilya Mark Nemenman - 2000 (cited by 3)
Potential Properties of Turing Machines – José Hernández-Orallo, David L. Dowe - 2012
On Universal Prediction and Bayesian Confirmation – Marcus Hutter - 2007 (cited by 22)
Computational Machine Learning in Theory and Praxis – Ming Li, Paul Vitányi - 1995 (cited by 4)
Does Algorithmic Probability Solve the Problem of Induction? – Ray Solomonoff - 2001 (cited by 5)
Kolmogorov’s structure functions and model selection – Nikolai Vereshchagin, Paul Vitányi (cited by 32)
The Dimensions of Individual Strings and Sequences – Jack H. Lutz - 2003 (cited by 93)
Computational depth and reducibility – David W. Juedes, James I. Lathrop, Jack H. Lutz - 1994 (cited by 12)