Information-theoretic asymptotics of Bayes methods (1990)

by Bertrand Clarke, Andrew R. Barron
Venue:IEEE Transactions on Information Theory
Citations:107 - 10 self

Documents Related by Co-Citation

8569 Elements of Information Theory – T M Cover, J A Thomas - 1991
47 A strong version of the redundancy-capacity theorem of universal coding – Neri Merhav, Meir Feder - 1995
286 Universal coding, information, prediction, and estimation – J Rissanen - 1984
276 Fisher information and stochastic complexity – J J Rissanen - 1996
248 Stochastic complexity and modeling – J Rissanen - 1986
204 Minimum complexity density estimation – A Barron, T Cover - 1991
25 A source matching approach to finding minimax codes – L D Davisson, A Leon-Garcia - 1980
15 General bounds on the mutual information between a parameter and n conditionally independent observations – David Haussler - 1995
39 Mutual Information, Metric Entropy, and Cumulative Relative Entropy Risk – David Haussler, Manfred Opper - 1996
45 Density estimation by stochastic complexity – J Rissanen, T P Speed, B Yu - 1992
495 Stochastic complexity – J Rissanen - 1987
311 An Information Measure for Classification – C S Wallace, D M Boulton
59 Jeffreys’ prior is asymptotically least favorable under entropy risk – B Clarke, A Barron - 1994
1209 Information theory and an extension of the maximum likelihood principle – H Akaike - 1973
1689 An Introduction to Kolmogorov Complexity and Its Applications – Ming Li, Paul Vitanyi - 1997
91 Universal noiseless coding – L D Davisson - 1973
1151 Modeling by shortest data description – J Rissanen - 1978
11 On the information in a sample about a parameter – I Ibragimov, R Hasminsky - 1973
108 Bounds on the Sample Complexity of Bayesian Learning Using Information Theory and the VC Dimension – David Haussler, Michael Kearns, Robert Schapire - 1994