MOC - Information Theory
Overview
About
This note serves as an index for all notes related to Information Theory - the mathematical study of the quantification, storage, and communication of information, founded by Claude Shannon.
Core Concepts
Information theory bridges multiple domains of science and engineering and provides the mathematical foundation for:
- Digital communication and data compression
- Cryptography and error correction
- Computational limits and quantum information
- Thermodynamic connections (entropy)
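The quantification of information mentioned above is made concrete by Shannon entropy, the average number of bits per symbol needed to encode a source. A minimal sketch (Python; the function name `shannon_entropy` is illustrative, not from any specific library):

```python
import math
from collections import Counter

def shannon_entropy(data: str) -> float:
    """Shannon entropy of a symbol sequence, in bits per symbol."""
    counts = Counter(data)
    n = len(data)
    # H = -sum over symbols of p * log2(p), with p the empirical frequency
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Four equiprobable symbols carry 2 bits per symbol;
# a constant sequence carries 0 bits per symbol.
print(shannon_entropy("abcd"))  # 2.0
print(shannon_entropy("aaaa"))  # 0.0
```

This is also the operational limit for lossless data compression: no code can average fewer bits per symbol than the source entropy.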
Key Figures
- Claude Shannon - Founded the field with his 1948 paper “A Mathematical Theory of Communication”
- John Wheeler - “It from Bit” philosophy connecting information to physics
- Jacob Bekenstein - Information limits in bounded regions
- Rolf Landauer - Thermodynamic cost of computation
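Landauer's thermodynamic cost of computation, noted above, has a concrete form: erasing one bit of information dissipates at least

```latex
E_{\min} = k_B T \ln 2
```

where $k_B$ is Boltzmann's constant and $T$ is the absolute temperature. At room temperature ($T \approx 300\,\mathrm{K}$) this is roughly $2.9 \times 10^{-21}\,\mathrm{J}$ per bit, which is the bridge between the information-theoretic and thermodynamic notions of entropy.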
Related MOCs
Parent/Broader MOCs
- MOC - Physics - Physical foundations of information
- MOC - Mathematics - Mathematical foundations
Sibling MOCs (Same Level)
- MOC - Computer Science - Computational applications
- MOC - Philosophy - Ontological implications (consciousness, “It from Bit”)
Applications
- MOC - Data Science - Applied information concepts
- MOC - Artificial Intelligence - AI/ML theory
Notes
NOTE
Currently, there are individual notes tagged with #Topic/Information Theory.
Appendix
Note created on 2025-12-31 and last modified on 2025-12-31.
(c) No Clocks, LLC | 2025