Shannon Publishes "A Mathematical Theory of Communication"
Claude E. Shannon publishes "A Mathematical Theory of Communication" in the Bell System Technical Journal (Vol. 27, No. 3, pp. 379-423, July 1948; No. 4, pp. 623-656, October 1948), founding the field of information theory and laying the mathematical bedrock for the digital age. Working at Bell Telephone Laboratories in Murray Hill, New Jersey, Shannon introduces a rigorous mathematical framework for quantifying information: he defines entropy, H = -Σ pᵢ log₂ pᵢ, as the measure of uncertainty in a message, and formalizes the "bit" (binary digit) as the fundamental unit of information.

The paper proves two landmark theorems. The source coding theorem establishes the theoretical minimum for lossless data compression. The noisy-channel coding theorem proves that reliable communication is possible over any noisy channel at any rate below its capacity, an existence proof so counterintuitive that many engineers initially refused to believe it. The channel capacity formula C = B·log₂(1 + S/N) becomes the governing equation of telecommunications for the next 75 years.

Shannon's insight that all information, whether text, sound, or images, can be reduced to binary digits and then transmitted, stored, and processed mathematically is the conceptual foundation for digital computing, data compression (Huffman, Lempel-Ziv, MP3, JPEG), error-correcting codes, cryptography, and ultimately machine learning, where Shannon entropy resurfaces as the cross-entropy loss function used to train virtually every modern neural network.

Shannon had discussed the possibility of "thinking machines" over teatime with Alan Turing during Turing's 1943 visit to Bell Labs, and would go on to co-organize the 1956 Dartmouth Workshop that formally launched artificial intelligence as a field. The paper, cited over 150,000 times, is arguably the most consequential scientific publication of the twentieth century after Einstein's relativity papers.
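The two formulas in the entry above can be sketched in a few lines of Python. This is an illustrative sketch, not from Shannon's paper; the function names, the fair/biased coin example, and the 3 kHz telephone-line figures are assumptions chosen for the demonstration.

```python
import math

def entropy(probs):
    """Shannon entropy H = -sum(p_i * log2(p_i)), measured in bits.
    Terms with p_i = 0 contribute nothing, by the convention 0*log 0 = 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def channel_capacity(bandwidth_hz, snr_linear):
    """Shannon-Hartley capacity C = B * log2(1 + S/N), in bits per second.
    snr_linear is the signal-to-noise power ratio S/N (not decibels)."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# A fair coin is maximally uncertain: exactly 1 bit per toss.
print(entropy([0.5, 0.5]))            # 1.0
# A biased coin is more predictable, so each toss carries less information.
print(entropy([0.9, 0.1]))            # ~0.469 bits
# A hypothetical 3 kHz telephone channel at S/N = 1000 (30 dB):
print(channel_capacity(3000, 1000))   # ~29902 bits per second
```

The capacity figure is an upper bound on reliable transmission rate; the theorem guarantees codes approaching it exist without constructing them.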
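The link between Shannon entropy and the cross-entropy loss mentioned above can also be made concrete. This is a minimal sketch under assumed example values (the one-hot label and the two prediction vectors are invented for illustration); real frameworks compute the same quantity from logits for numerical stability.

```python
import math

def cross_entropy(p, q):
    """Cross-entropy H(p, q) = -sum(p_i * log(q_i)), in nats: the expected
    code length for samples from p using a code optimized for q. It is
    minimized when q = p, where it equals the Shannon entropy of p."""
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q) if pi > 0)

label = [1.0, 0.0, 0.0]          # one-hot target distribution p
good_pred = [0.9, 0.05, 0.05]    # confident, correct prediction q
bad_pred = [0.1, 0.45, 0.45]     # prediction that misses the true class

print(cross_entropy(label, good_pred))  # ~0.105: low loss
print(cross_entropy(label, bad_pred))   # ~2.303: high loss
```

Training a classifier by minimizing this loss pushes the predicted distribution q toward the label distribution p, which is how Shannon's 1948 measure ends up inside every modern training loop.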