Generalised Pinsker inequalities.
In COLT, 2009

Cited by 11 (0 self)
Abstract: We generalise the classical Pinsker inequality, which relates variational divergence to Kullback-Leibler divergence, in two ways: we consider arbitrary f-divergences in place of the KL divergence, and we assume knowledge of a sequence of values of generalised variational divergences. We then develop a best-possible inequality for this doubly generalised situation. Specialising our result to the classical case provides a new and tight explicit bound relating KL to variational divergence (solving a problem posed by Vajda some 40 years ago). The solution relies on exploiting a connection between divergences and the Bayes risk of a learning problem via an integral representation.
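The classical Pinsker inequality that this abstract generalises, D(P‖Q) ≥ V(P, Q)²/2 with V the variational (L1) divergence, can be checked numerically. A minimal sketch in Python; the distributions p and q are arbitrary illustrative values, not taken from the paper:

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(P || Q) in nats, discrete distributions."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def variational_divergence(p, q):
    """Variational (L1) divergence V(P, Q) = sum_i |p_i - q_i|."""
    return sum(abs(pi - qi) for pi, qi in zip(p, q))

# Illustrative distributions (not from the paper).
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]

kl = kl_divergence(p, q)
v = variational_divergence(p, q)
assert kl >= v ** 2 / 2  # classical Pinsker bound
```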
THE USE OF CSISZÁR'S DIVERGENCE TO ASSESS DISSIMILARITIES OF INCOME DISTRIBUTIONS OF EU COUNTRIES
Abstract: Income distributions can be described by measures of central tendency, dispersion, skewness and kurtosis, or by indexes of polarization. In numerous studies, the Gini coefficient and the Lorenz curve have been used to investigate inequality of incomes. Income distributions can also be analysed in comparison to one another. In this article, two measures belonging to Csiszár's divergence class have been used to identify the degree of differentiation of income distributions among the EU countries in 2005 and 2012. Countries similar and dissimilar with respect to the distribution of income have been identified, and the change in divergence of the EU countries' income distributions between 2005 and 2012 has been assessed. The European Union Statistics on Income and Living Conditions (EU-SILC) dataset has been used.
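All measures in Csiszár's class share one form, D_f(P‖Q) = Σᵢ qᵢ f(pᵢ/qᵢ) for a convex generator f with f(1) = 0. A minimal sketch; the generators shown are two classical members of the class, and the distributions are illustrative, not the article's income data:

```python
import math

def f_divergence(p, q, f):
    """Csiszar f-divergence: D_f(P || Q) = sum_i q_i * f(p_i / q_i)."""
    return sum(qi * f(pi / qi) for pi, qi in zip(p, q) if qi > 0)

# Illustrative generators: f convex with f(1) = 0.
f_kl = lambda t: t * math.log(t) if t > 0 else 0.0   # Kullback-Leibler
f_hellinger = lambda t: (math.sqrt(t) - 1.0) ** 2    # squared Hellinger

# Illustrative distributions (not the article's data).
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]

kl_val = f_divergence(p, q, f_kl)
h_val = f_divergence(p, q, f_hellinger)
```

Choosing a different generator f yields a different member of the class while keeping the same non-negativity and identity-of-indiscernibles properties.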
Geometric mean; Arithmetic mean
2005
Abstract. The arithmetic, geometric and harmonic means are the three classical means well known in the literature. Other means, such as the square-root mean, are also known. In this paper, we construct divergence measures based on nonnegative differences among these means, and establish an interesting inequality using properties of Csiszár's f-divergence. An improvement over this inequality is also presented. Comparisons of the new mean divergence measures with classical divergence measures such as the J-divergence [10, 11], the Jensen-Shannon difference divergence measure [3, 15] and the arithmetic-geometric mean divergence measure [17] are also established.
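The construction the abstract describes rests on the classical ordering A ≥ G ≥ H of the means, so each componentwise gap is nonnegative and its sum is a valid divergence. A minimal sketch of the arithmetic-geometric case; the function name and data are illustrative, not the paper's:

```python
import math

def arithmetic_geometric_gap(p, q):
    """Divergence built from the nonnegative arithmetic-minus-geometric gap:
    sum_i [ (p_i + q_i)/2 - sqrt(p_i * q_i) ], zero iff p == q."""
    return sum((pi + qi) / 2.0 - math.sqrt(pi * qi) for pi, qi in zip(p, q))

# Illustrative distributions (not from the paper).
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
gap = arithmetic_geometric_gap(p, q)
```

Since p and q each sum to 1, the value reduces to 1 − Σᵢ √(pᵢqᵢ), i.e. one minus the Bhattacharyya coefficient.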
RELATIVE DIVERGENCE MEASURES AND INFORMATION INEQUALITIES
, 2005
Abstract. Many information and divergence measures exist in the literature on information theory and statistics. The most famous among them are Kullback-Leibler's [17] relative information and Jeffreys' [16] J-divergence; the information radius, or Jensen difference divergence measure, is due to Sibson [23]. Burbea and Rao [3, 4] have also found applications for it in the literature. Taneja [25] studied another kind of divergence measure based on arithmetic and geometric means. These three divergence measures bear close relationships to one another. There are also further measures arising from the J-divergence, JS-divergence and AG-divergence. We call these measures relative divergence measures, or non-symmetric divergence measures. Our aim here is to obtain bounds on symmetric and non-symmetric divergence measures in terms of relative information of type s, using properties of Csiszár's f-divergence.
2005
Many information and divergence measures exist in the literature on information theory and statistics. The most famous among them are Kullback-Leibler [13] relative information and Jeffreys' [12] J-divergence. Sibson's [17] Jensen-Shannon divergence has also found applications in the literature. The author [20] studied a new divergence measure based on arithmetic and geometric means. Measures such as the harmonic mean divergence and triangular discrimination [6] are also known in the literature. Recently, Dragomir et al. [10] also studied a new measure similar to the J-divergence, which we call here the relative J-divergence. Another measure, arising from the Jensen-Shannon divergence, was studied by Lin [15]; here we call it the relative Jensen-Shannon divergence. The relative arithmetic-geometric divergence (Taneja [24]) is also studied here. All these measures can be written as particular cases of the Csiszár f-divergence. By putting some conditions on the probability distributions, the aim here is to obtain bounds among the measures. Key words: J-divergence; Jensen-Shannon divergence; Arithmetic-geometric divergence;
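Of the measures listed, the Jensen-Shannon divergence is the easiest to state concretely: it symmetrises KL through the midpoint distribution m = (p + q)/2 and is bounded above by ln 2. A minimal sketch; the distributions are illustrative, not from the paper:

```python
import math

def kl(p, q):
    """Kullback-Leibler divergence in nats (discrete distributions)."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def js_divergence(p, q):
    """Jensen-Shannon divergence: symmetric in p, q; bounded above by ln 2."""
    m = [(pi + qi) / 2.0 for pi, qi in zip(p, q)]
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

# Illustrative distributions (not from the paper).
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
js = js_divergence(p, q)
```

Unlike plain KL, the value is finite even when p and q have disjoint support, since the midpoint m is positive wherever either distribution is.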
Sequences of Inequalities among Differences of Gini Means and Divergence Measures
A brief overview of information measures on classical, discrete probability distributions
2015
"... Contents: Shannon entropy; Joint entropy; ..."