Results 1 - 5 of 5
A Natural Law of Succession
1995
Abstract: Consider the following problem. You are given an alphabet of k distinct symbols and are told that the i-th symbol occurred exactly n_i times in the past. On the basis of this information alone, you must now estimate the conditional probability that the next symbol will be i. In this report, we present a new solution to this fundamental problem in statistics and demonstrate that our solution outperforms standard approaches, both in theory and in practice.
Cited by 35 (3 self)
The minimum average code for finite memoryless monotone sources
in Proc. IEEE Information Theory Workshop, 2002
Abstract: The problem of selecting a code for finite monotone sources with x symbols is considered. The selection criterion is based on minimizing the average redundancy (the Minave criterion) instead of its maximum (the Minimax criterion). The average probability distribution, whose associated Huffman code has the minimum average redundancy, is derived. The entropy of the average distribution (i.e., ...
Cited by 5 (0 self)
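The quantity the Minave criterion minimizes is the average redundancy of a Huffman code, i.e. average codeword length minus source entropy. A minimal sketch of computing it for a given distribution, under that standard definition (function names are illustrative, not from the paper):

```python
import heapq
import math

def huffman_lengths(p):
    """Codeword lengths of a binary Huffman code for distribution p."""
    # Heap items: (subtree probability, tie-break counter, symbol indices).
    heap = [(pi, i, [i]) for i, pi in enumerate(p)]
    heapq.heapify(heap)
    lengths = [0] * len(p)
    counter = len(p)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for i in s1 + s2:          # merging deepens every symbol in both subtrees
            lengths[i] += 1
        heapq.heappush(heap, (p1 + p2, counter, s1 + s2))
        counter += 1
    return lengths

def redundancy(p):
    """Average Huffman codeword length minus entropy, in bits/symbol."""
    L = huffman_lengths(p)
    avg = sum(pi * li for pi, li in zip(p, L))
    H = -sum(pi * math.log2(pi) for pi in p if pi > 0)
    return avg - H

# Dyadic distributions have zero redundancy:
r = redundancy([0.5, 0.25, 0.25])  # -> 0.0
```

The paper's contribution is the distribution that minimizes this quantity on average over the class of monotone sources, rather than its worst case.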
The asymptotic minimax risk for the estimation of constrained binomial and multinomial probabilities
Sankhya, 2004
Abstract: In this paper we present a direct and simple approach to obtaining bounds on the asymptotic minimax risk for the estimation of constrained binomial and multinomial proportions. Quadratic, normalized quadratic, and entropy loss are considered, and it is demonstrated that in all cases linear estimators are asymptotically minimax optimal. For the quadratic loss function the asymptotic minimax risk does not change unless a neighborhood of the point 1/2 is excluded by the restrictions on the parameter space. For the other two loss functions the asymptotic behavior of the minimax risk is not changed by such additional knowledge about the location of the unknown probability. The results are also extended to the problem of minimax estimation of a vector of constrained multinomial probabilities. AMS (2000) subject classification: 62C20.
Cited by 3 (0 self)
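For context on the unconstrained case the paper builds on: the classical minimax estimator of a binomial proportion under quadratic loss is the linear estimator (X + sqrt(n)/2) / (n + sqrt(n)), whose risk is constant in p. A minimal sketch verifying that constant risk numerically (this is the textbook unconstrained result, not the paper's constrained estimators):

```python
import math

def minimax_binomial_estimate(x, n):
    """Classical minimax estimator of a binomial proportion under
    quadratic loss: (x + sqrt(n)/2) / (n + sqrt(n))."""
    return (x + math.sqrt(n) / 2) / (n + math.sqrt(n))

def quadratic_risk(p, n):
    """Exact risk E[(estimate - p)^2], summed over the binomial pmf."""
    risk = 0.0
    for x in range(n + 1):
        pmf = math.comb(n, x) * p**x * (1 - p)**(n - x)
        risk += pmf * (minimax_binomial_estimate(x, n) - p) ** 2
    return risk

# The risk equals n / (4 * (n + sqrt(n))**2) for every p in [0, 1].
```

Restricting the parameter space (the paper's setting) can lower this risk, which is why the location of the constraint relative to 1/2 matters for quadratic loss.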
Bernstein Polynomials and Learning Theory
J, 2004
Abstract: When learning processes depend on samples but not on the order of the information in the sample, the Bernoulli distribution is relevant and Bernstein polynomials enter into the analysis. We derive estimates of the approximation of the entropy function x log x that are sharper than the bounds from Voronovskaja's theorem. In this way we obtain the correct asymptotics for the Kullback-Leibler distance for an encoding problem.
Cited by 1 (1 self)
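The objects the abstract refers to are standard: the n-th Bernstein polynomial of f is B_n(f)(x) = sum_{k=0..n} f(k/n) C(n,k) x^k (1-x)^(n-k), and Voronovskaja's theorem gives B_n(f)(x) - f(x) ~ x(1-x) f''(x) / (2n). A minimal sketch for f(x) = x log x, where f''(x) = 1/x, so the pointwise error is approximately (1-x)/(2n) (function names are illustrative):

```python
import math

def bernstein(f, n, x):
    """n-th Bernstein polynomial of f evaluated at x in [0, 1]."""
    return sum(f(k / n) * math.comb(n, k) * x**k * (1 - x)**(n - k)
               for k in range(n + 1))

def xlogx(x):
    """The entropy-type function x log x, extended continuously by 0 at x = 0."""
    return x * math.log(x) if x > 0 else 0.0

# Voronovskaja predicts B_n(f)(x) - f(x) ~ x(1-x) f''(x) / (2n);
# with f''(x) = 1/x this is (1-x)/(2n), e.g. 0.5/400 = 0.00125 at x = 0.5, n = 200.
err = bernstein(xlogx, 200, 0.5) - xlogx(0.5)
```

Since x log x is convex, B_n(f) lies above f, so the error is positive; the paper's point is that sharper-than-Voronovskaja bounds on this error yield the correct Kullback-Leibler asymptotics.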