Abstract
Information theory is, at its core, an approach to quantifying and mathematically analyzing information. Although it was originally developed to analyze the transmission of signals over communication channels, it applies to information of any kind. Information theory is foundational to modern cryptography, and without a basic working knowledge of it, modern cryptography can be extremely difficult to understand. This chapter provides the reader with the essential concepts of information theory, including an introduction to Claude Shannon's work, a discussion of key topics such as diffusion and Hamming weight, and the basic equations involved.
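As a preview of the kind of basic equations the chapter covers, Shannon entropy, H(X) = -Σ p(x) log2 p(x), measures the average information content of a variable in bits, and Hamming weight counts the 1 bits in a string. The short Python sketch below computes both; the function names and sample values are illustrative, not taken from the chapter.

    import math

    def shannon_entropy(probabilities):
        """Shannon entropy H(X) = -sum(p * log2 p) in bits, skipping zero-probability outcomes."""
        return -sum(p * math.log2(p) for p in probabilities if p > 0)

    def hamming_weight(bits):
        """Hamming weight: the number of 1 bits in a bit string such as '1011'."""
        return bits.count("1")

    # A fair coin carries exactly 1 bit of information per toss ...
    print(shannon_entropy([0.5, 0.5]))  # 1.0
    # ... while a heavily biased coin carries far less.
    print(shannon_entropy([0.9, 0.1]))  # about 0.47
    print(hamming_weight("1011"))       # 3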
Test Your Knowledge
1. The difference in bits between two strings X and Y is called Hamming distance.
2. If you take 1110 XOR 0101, the answer is _______.
3. A change in one bit of plain text leading to changes in multiple bits of cipher text is called _______.
4. The amount of information that a given message or variable contains is referred to as _______.
5. _________ refers to significant differences between plain text, key, and cipher text that make cryptanalysis more difficult.
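The sketch below works through two of the operations tested above, Hamming distance and XOR, on sample bit strings. The function names and inputs are chosen here purely for illustration and do not appear in the chapter.

    def hamming_distance(x, y):
        """Number of bit positions in which two equal-length bit strings differ."""
        if len(x) != len(y):
            raise ValueError("bit strings must be the same length")
        return sum(a != b for a, b in zip(x, y))

    def xor_bits(x, y):
        """Bitwise XOR of two equal-length bit strings, returned as a bit string."""
        return "".join("1" if a != b else "0" for a, b in zip(x, y))

    # '1010' and '0110' differ in their first two positions.
    print(hamming_distance("1010", "0110"))  # 2
    print(xor_bits("1010", "0110"))          # '1100'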
Copyright information
© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG
About this chapter
Cite this chapter
Easttom, C. (2022). Basic Information Theory. In: Modern Cryptography. Springer, Cham. https://doi.org/10.1007/978-3-031-12304-7_3
DOI: https://doi.org/10.1007/978-3-031-12304-7_3
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-12303-0
Online ISBN: 978-3-031-12304-7
eBook Packages: Computer Science (R0)