Communication theory is a field of information theory and mathematics that studies both the technical process of transmitting information and the process of human communication.
"The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point."
– Claude Shannon (1916–2001)
The origins of communication theory are linked to the development of information theory in the early 1920s. Limited information-theoretic ideas had been developed at Bell Labs, all implicitly assuming events of equal probability.
Harry Nyquist's 1924 paper, Certain Factors Affecting Telegraph Speed, contains a theoretical section quantifying "intelligence" and the "line speed" at which it can be transmitted by a communication system.
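In modern notation, Nyquist's relation is usually written as W = K log m, where W is the speed of transmission of intelligence, m is the number of distinct signal levels available at each time step, and K is a constant.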
Ralph Hartley's 1928 paper, Transmission of Information, uses the word "information" as a measurable quantity, reflecting the receiver's ability to distinguish one sequence of symbols from any other. The natural unit of information was therefore the decimal digit, much later renamed the hartley in his honour as a unit of information.
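Concretely, Hartley's measure of the information in a message of n symbols, each drawn from an alphabet of s possible symbols, is H = n log₁₀ s; choosing the base-10 logarithm is what makes the decimal digit the natural unit.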
In 1940, Alan Turing used similar ideas as part of the statistical analysis of the breaking of the German Enigma ciphers during the Second World War.
The main landmark event that opened the way to the development of communication theory was the publication of an article by Claude Shannon in the Bell System Technical Journal in July and October 1948 under the title "A Mathematical Theory of Communication". Shannon focused on the problem of how best to encode the information that a sender wants to transmit. He also used tools from probability theory, developed by Norbert Wiener, which were then in the nascent stages of being applied to communication theory. Shannon developed information entropy as a measure of the uncertainty in a message, essentially inventing the field of information theory.
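For a source that emits symbols with probabilities p₁, …, pₙ, Shannon's entropy is H = −Σᵢ pᵢ log₂ pᵢ bits per symbol. A fair coin flip, for example, has entropy of exactly 1 bit, while a biased coin carries less, matching the intuition that a more predictable source conveys less information.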