Information-theoretic asymptotics of Bayes methods (1990)


by Bertrand Clarke, Andrew R. Barron
Venue: IEEE Transactions on Information Theory
Citations: 107 (10 self)

Documents Related by Co-Citation

8609 Elements of information theory – T Cover, J Thomas - 1991
45 A strong version of the redundancy-capacity theorem of universal coding – Neri Merhav, Meir Feder - 1995
286 Universal coding, information, prediction, and estimation – J Rissanen - 1984
249 Stochastic complexity and modeling – Jorma Rissanen - 1986
275 Fisher information and stochastic complexity – J Rissanen - 1996
206 Minimum Complexity Density Estimation – A Barron, T Cover - 1991
25 A source matching approach to finding minimax codes – L D Davisson, A L Garcia - 1980
14 General bounds on the mutual information between a parameter and n conditionally independent observations – David Haussler - 1995
40 Mutual Information, Metric Entropy, and Cumulative Relative Entropy Risk – David Haussler, Manfred Opper - 1996
46 Density estimation by stochastic complexity – Jorma Rissanen, Terry P Speed, Bin Yu - 1992
499 Stochastic Complexity – J Rissanen - 1989
1243 Information theory and an extension of the maximum likelihood principle – H Akaike - 1973
60 Jeffreys’ prior is asymptotically least favorable under entropy risk – B Clarke, A Barron - 1994
313 An Information Measure for Classification – C S Wallace, D M Boulton - 1968
1690 An introduction to Kolmogorov Complexity and its Applications: Preface to the First Edition – Ming Li, Paul Vitanyi - 1997
90 Universal noiseless coding – L D Davisson - 1973
11 On the information in a sample about a parameter – I Ibragimov, R Hasminsky - 1973
109 Bounds on the Sample Complexity of Bayesian Learning Using Information Theory and the VC Dimension – David Haussler, Michael Kearns, Robert Schapire - 1994
1165 Modeling by shortest data description – J Rissanen - 1978