On the Measurement of Randomness (Uncertainty): A More Informative Entropy
Abstract
1. Introduction
2. Conditions for Valid Difference Comparisons
3. The New Entropy
3.1. Derivation of
The Shannon entropy of a probability distribution $P = (p_1, \ldots, p_n)$ is given by:

$$H(p_1, \ldots, p_n) = -\sum_{i=1}^{n} p_i \log p_i \qquad (1)$$

As a matter of convenience, and as used throughout the rest of the paper, all individual probabilities will be considered ordered such that:

$$p_1 \geq p_2 \geq \cdots \geq p_n \qquad (2)$$

Due to the constraint that $\sum_{i=1}^{n} p_i = 1$, there is no loss of generality or information by focusing on:

$$(p_2, \ldots, p_n) \qquad (3)$$

Instead of considering a weighted mean or sum of $f(p_i)$ for some function $f$ of the individual $p_i$'s, one could consider the sum of the means of all pairs of the $p_i$'s. Since an entropy measure needs to be zero-indifferent (expansible), i.e., unaffected by the addition of events with zero probabilities (e.g., [14] (Chapter 1)), a logical choice of pairwise means would be the geometric means $\sqrt{p_i p_j}$ for all $i, j = 2, \ldots, n$ (since obviously $\sqrt{p_i p_j} = 0$ whenever $p_i = 0$ or $p_j = 0$). Therefore, the measure consisting of the means $\sqrt{p_i p_j}$, including those for $i = j$, can be expressed as:

$$\sum_{i=2}^{n} \sum_{j=2}^{n} \sqrt{p_i p_j} = \left( \sum_{i=2}^{n} \sqrt{p_i} \right)^{2}$$
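For readers who want to experiment with these quantities, the following is a minimal numerical sketch, assuming only the Shannon entropy of Equation (1) and the plain (unscaled) sum of the pairwise geometric means over $i, j = 2, \ldots, n$ described above; the function names are illustrative and not the paper's notation.

```python
import numpy as np

def shannon_entropy(p, base=2):
    """Shannon entropy H of Equation (1); zero probabilities contribute nothing."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)) / np.log(base))

def pairwise_geometric_mean_sum(p):
    """Sum of the pairwise geometric means sqrt(p_i * p_j) over i, j = 2, ..., n
    (probabilities sorted in the decreasing order of Equation (2)), including the
    i = j terms; algebraically this equals (sum_{i>=2} sqrt(p_i))**2."""
    p = np.sort(np.asarray(p, dtype=float))[::-1]  # enforce p_1 >= p_2 >= ... >= p_n
    return float(np.sqrt(p[1:]).sum() ** 2)        # drop p_1, square the summed roots

P = [0.5, 0.3, 0.2]
print(shannon_entropy(P))                      # ~1.485 bits
print(pairwise_geometric_mean_sum(P))          # (sqrt(0.3) + sqrt(0.2))**2 ~ 0.990
print(pairwise_geometric_mean_sum(P + [0.0]))  # unchanged when a zero-probability event is added
```

Appending an event with zero probability leaves both values unchanged, which is the zero-indifference (expansibility) property invoked above.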
3.2. Properties of
Note 1: If a measure (function) M is strictly concave, so that the inequality in Equation (23) holds strictly rather than weakly, then the condition in Equation (11) cannot be met. The H in Equation (1) is one such measure.

Note 2: The extremal values for a measure of randomness or uncertainty are also a logical requirement for valid difference comparisons. As a particular case of the proportional difference comparisons in Equation (2c), and for any integer m < n:
Note 3: For the binary case of n = 2, the maximum value of the measure is 1, which equals H(0.5, 0.5) in Equation (1) if the base-2 logarithm is used. In fact, H(0.5, 0.5) = 1 is an axiom or required property, the normalization axiom, frequently used in information theory to justify the use of the base-2 logarithm in Equation (1) and bits as the unit of measurement [14] (Chapter 1). The binary entropy or:
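For concreteness, the normalization axiom amounts to the following base-2 evaluation of Equation (1) at the uniform binary distribution:

$$H(0.5, 0.5) = -\tfrac{1}{2}\log_2\tfrac{1}{2} - \tfrac{1}{2}\log_2\tfrac{1}{2} = \tfrac{1}{2} + \tfrac{1}{2} = 1\ \text{bit}$$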
4. Generalization of
5. Comparative Analysis
5.1. Why the Preference for
5.2. Comparative Weights on
5.3. Inconsistent Orderings
6. Discussion
7. Statistical Inference about
7.1. Bias
7.2. Confidence Interval Construction
8. Conclusions
Acknowledgments
Conflicts of Interest
References
1. Boltzmann, L. Weitere Studien über das Wärmegleichgewicht unter Gasmolekülen. In Sitzungsberichte der Kaiserlichen Akademie der Wissenschaften, II Abteil; (Vol. 66, Pt. 2); K.-K. Hof- und Staatsdruckerei in Commission bei C. Gerold's Sohn: Wien, Austria, 1872; pp. 275–370. (In German)
2. Shannon, C.E. A mathematical theory of communication. Bell Syst. Tech. J. 1948, 27, 379–423, 623–656.
3. Klir, G.J. Uncertainty and Information: Foundations of a Generalized Information Theory; Wiley: Hoboken, NJ, USA, 2006.
4. Ruelle, D. Chance and Chaos; Princeton University Press: Princeton, NJ, USA, 1991.
5. Han, T.S.; Kobayashi, K. Mathematics of Information and Coding; American Mathematical Society: Providence, RI, USA, 2002.
6. Shannon, C.E.; Weaver, W. The Mathematical Theory of Communication; University of Illinois Press: Urbana, IL, USA, 1949.
7. Arndt, C. Information Measures: Information and Its Description in Science and Engineering; Springer: Berlin/Heidelberg, Germany, 2004.
8. Kapur, J.N. Measures of Information and Their Applications; Wiley: New Delhi, India, 1994.
9. Kvålseth, T.O. Entropy. In International Encyclopedia of Statistical Science; Lovric, M., Ed.; Springer: Berlin/Heidelberg, Germany, 2011; Part 5; pp. 436–439.
10. Rényi, A. On measures of entropy and information. In Proceedings of the Fourth Berkeley Symposium on Mathematical Statistics and Probability, Berkeley, CA, USA, 20 June–30 July 1961; University of California Press: Berkeley, CA, USA, 1961; Volume 1, pp. 547–561.
11. Peitgen, H.-O.; Jürgens, H.; Saupe, D. Chaos and Fractals: New Frontiers of Science, 2nd ed.; Springer: New York, NY, USA, 2004.
12. Tsallis, C. Possible generalization of Boltzmann–Gibbs statistics. J. Stat. Phys. 1988, 52, 479–487.
13. Norwich, K.H. Information, Sensation, and Perception; Academic Press: San Diego, CA, USA, 1993.
14. Aczél, J.; Daróczy, Z. On Measures of Information and Their Characterizations; Academic Press: New York, NY, USA, 1975.
15. Kvålseth, T.O. Entropy evaluation based on value validity. Entropy 2014, 16, 4855–4873.
16. Hand, D.J. Measurement Theory and Practice; Wiley: London, UK, 2004.
17. Kvålseth, T.O. The Lambda distribution and its applications to categorical summary measures. Adv. Appl. Stat. 2011, 24, 83–106.
18. Marshall, A.W.; Olkin, I.; Arnold, B.C. Inequalities: Theory of Majorization and Its Applications, 2nd ed.; Springer: New York, NY, USA, 2011.
19. Bullen, P.S. A Dictionary of Inequalities; Addison Wesley Longman: Essex, UK, 1998.
20. Aczél, J. Lectures on Functional Equations and Their Applications; Academic Press: New York, NY, USA, 1966.
21. Hardy, G.H.; Littlewood, J.E.; Pólya, G. Inequalities; Cambridge University Press: Cambridge, UK, 1934.
22. Ebanks, B. Looking for a few good means. Am. Math. Mon. 2012, 119, 658–669.
23. Morales, D.; Pardo, L.; Vajda, I. Uncertainty of discrete stochastic systems: General theory and statistical inference. IEEE Trans. Syst. Man Cybern. Part A 1996, 26, 681–697.
24. Patil, G.P.; Taillie, C. Diversity as a concept and its measurement. J. Am. Stat. Assoc. 1982, 77, 548–567.
25. Kvålseth, T.O. Coefficients of variation for nominal and ordinal categorical data. Percept. Mot. Skills 1995, 80, 843–847.
26. Kvålseth, T.O. Variation for categorical variables. In International Encyclopedia of Statistical Science; Lovric, M., Ed.; Springer: Berlin/Heidelberg, Germany, 2011; Part 22; pp. 1642–1645.
27. Kvålseth, T.O. Cautionary note about R². Am. Stat. 1985, 39, 279–285.
28. Nawrocki, D.; Carter, W. Industry competitiveness using Herfindahl and entropy concentration indices with firm market capitalization data. Appl. Econ. 2010, 42, 2855–2863.
29. Kullback, S.; Leibler, R.A. On information and sufficiency. Ann. Math. Stat. 1951, 22, 79–86.
30. Lin, J. Divergence measures based on Shannon entropy. IEEE Trans. Inf. Theory 1991, 37, 145–151.
31. Wong, A.K.C.; You, M. Entropy and distance of random graphs with application to structural pattern recognition. IEEE Trans. Pattern Anal. Mach. Intell. 1985, PAMI-7, 599–609.
32. Sagar, R.P.; Laguna, H.G.; Guevara, N.L. Electron pair density information measures in atomic systems. Int. J. Quantum Chem. 2011, 111, 3497–3504.
33. Antolin, J.; Angulo, J.C.; López-Rosa, S. Fisher and Jensen–Shannon divergences: Quantitative comparisons among distributions. Application to position and momentum atomic densities. J. Chem. Phys. 2009, 130, 074110.
34. Endres, D.M.; Schindelin, J.E. A new metric for probability distributions. IEEE Trans. Inf. Theory 2003, 49, 1858–1860.
35. Shannon, C.E. The bandwagon. IRE Trans. Inf. Theory 1956, 2, 3.
36. Wiener, N. What is information theory? IRE Trans. Inf. Theory 1956, 2, 48.
37. Bishop, Y.M.M.; Fienberg, S.E.; Holland, P.W. Discrete Multivariate Analysis: Theory and Practice; MIT Press: Cambridge, MA, USA, 1975.
Data Set | n | D | | | | | 
---|---|---|---|---|---|---|---|
1 | 16 | 2.17 | 1.28 | 3.89 | 0.14 | 0.09 | 0.26 |
2 | 19 | 11.14 | 10.32 | 11.22 | 0.62 | 0.57 | 0.62 |
3 | 3 | 0.96 | 0.82 | 0.97 | 0.48 | 0.41 | 0.49 |
4 | 18 | 4.28 | 3.21 | 5.05 | 0.25 | 0.19 | 0.15 |
5 | 15 | 1.84 | 1.40 | 2.08 | 0.13 | 0.10 | 0.15 |
6 | 15 | 12.65 | 11.42 | 12.57 | 0.90 | 0.82 | 0.90 |
7 | 13 | 10.37 | 9.93 | 10.34 | 0.86 | 0.83 | 0.86 |
8 | 15 | 4.13 | 3.01 | 3.76 | 0.30 | 0.22 | 0.27 |
9 | 7 | 5.38 | 4.83 | 5.25 | 0.90 | 0.81 | 0.88 |
10 | 4 | 0.78 | 0.65 | 0.86 | 0.26 | 0.22 | 0.29 |
11 | 12 | 1.67 | 1.23 | 3.83 | 0.15 | 0.11 | 0.35 |
12 | 17 | 14.71 | 14.56 | 14.71 | 0.92 | 0.91 | 0.92 |
13 | 14 | 4.09 | 3.88 | 4.10 | 0.31 | 0.30 | 0.32 |
14 | 8 | 6.04 | 5.39 | 5.95 | 0.86 | 0.77 | 0.85 |
15 | 17 | 10.01 | 9.03 | 10.13 | 0.63 | 0.56 | 0.63 |
16 | 5 | 0.57 | 0.41 | 0.74 | 0.14 | 0.10 | 0.19 |
17 | 10 | 4.85 | 3.82 | 5.10 | 0.54 | 0.42 | 0.57 |
18 | 5 | 1.00 | 0.73 | 1.27 | 0.25 | 0.18 | 0.32 |
19 | 5 | 2.05 | 1.16 | 2.09 | 0.51 | 0.40 | 0.52 |
20 | 19 | 12.00 | 11.05 | 12.06 | 0.67 | 0.61 | 0.67 |
21 | 19 | 13.78 | 13.64 | 13.78 | 0.77 | 0.76 | 0.77 |
22 | 17 | 6.93 | 6.33 | 7.40 | 0.43 | 0.40 | 0.46 |
23 | 12 | 8.16 | 7.71 | 8.17 | 0.74 | 0.70 | 0.74 |
24 | 20 | 13.56 | 11.98 | 13.83 | 0.71 | 0.63 | 0.73 |
25 | 17 | 3.47 | 3.07 | 3.58 | 0.22 | 0.19 | 0.22 |
26 | 13 | 1.15 | 0.72 | 2.29 | 0.10 | 0.06 | 0.19 |
27 | 14 | 10.69 | 9.66 | 10.65 | 0.82 | 0.74 | 0.82 |
28 | 12 | 1.68 | 1.66 | 1.68 | 0.15 | 0.15 | 0.15 |
29 | 20 | 9.72 | 6.88 | 11.09 | 0.51 | 0.36 | 0.58 |
30 | 9 | 5.23 | 4.20 | 5.36 | 0.65 | 0.53 | 0.67 |

 | | | | 
---|---|---|---|---
3.00 | 0.75 | 1.50 | 0.93 | 0.75
1.00 | 0.25 | 0.78 | 0.48 | 0.25
6.87 | 0.76 | 2.09 | 0.91 | 0.77
8.22 | 0.63 | 2.10 | 0.80 | 0.67
2.32 | 0.77 | 1.28 | 0.92 | 0.74
2.45 | 0.61 | 1.36 | 0.84 | 0.61
1.47 | 0.37 | 1.01 | 0.63 | 0.37
3.45 | 0.86 | 1.56 | 0.97 | 0.84
© 2016 by the author; licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC-BY) license (http://creativecommons.org/licenses/by/4.0/).