(New York, American Telephone and Telegraph Company, 1951). 8vo. The complete issue in original blue printed wrappers. Pp. 50-64. [Entire volume: pp. 1-212].
First edition of Shannon's famous article, in which he estimates the entropy of printed English by means of prediction experiments, obtaining bounds of 0.6 to 1.3 bits per letter when 100 letters of preceding text are known.
"A new method of estimating the entropy and redundancy of a language is described. This method exploits the knowledge of the language statistics possessed by those who speak the language, and depends on experimental results in prediction of the next letter when the preceding text is known. Results of experiments in prediction are given, and some properties of an ideal predictor are developed." (From the introduction to the present article).
"Natural languages are highly redundant; the number of intelligible fifty-letter English sentences is many fewer than 26*50, and the number of distinguishable ten-second phone conversations is far smaller than the number of sound signals that could be generated with frequencies up to 20.000 Hz. This immediately suggests a theory for signal compression. If you can recode the alphabet so that common sequences of letters and abbreviated, while infrequent combinations are spelled out in lengthy fashion, you can dramatically reduce the channel capacity needed to send the data." (Sethna, Statistical Mechanics: Entropy, Order Parameters and Complexity, Oxford University Press, 2006, pp. 100).
Order-nr.: 22783