Information Theory and Coding
In communication systems, information theory, pioneered by C. E. Shannon, deals with the mathematical formulation of information transfer from one place to another. It is concerned with source coding and channel coding. Source coding attempts to minimize the number of bits required to represent the source output at a given level of fidelity. Channel coding, on the other hand, is used so that information can be transmitted through the channel with a specified reliability. Information theory answers two fundamental questions in communication theory: What is the limit of data compression? (Answer: the entropy of the data, H(X), is its compression limit.) What is the ultimate transmission rate of communication? (Answer: the channel capacity C is its rate limit.) Information theory also suggests means by which systems can approach these ultimate limits of communication.
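For reference, these two limits are customarily written in the standard forms below for a discrete source with symbol probabilities $p_X(x_i)$; the mutual information $I(X;Y)$ between channel input and output appearing in the capacity expression is not defined in the text above and is introduced here only to make the statement concrete.
\[
H(X) = -\sum_{i=1}^{M} p_X(x_i)\,\log_2 p_X(x_i) \quad \text{(bits per symbol)},
\qquad
C = \max_{p_X(x)} I(X;Y) \quad \text{(bits per channel use)}.
\]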
Amount of Information
In general, the function of a communication system is to convey information from the transmitter to the receiver; the job of the receiver is therefore to identify which one of the allowable messages was transmitted. The measure of information is related to the uncertainty of events: commonly occurring messages convey little information, whereas rare messages carry more information. This idea is captured by a logarithmic measure of information, first proposed by R. V. L. Hartley.
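Concretely, the logarithmic measure assigns to a symbol $x_i$ of probability $p_X(x_i)$ the self-information given below; this is the standard form in which the measure appears in information-theory texts.
\[
I(x_i) = \log_2 \frac{1}{p_X(x_i)} = -\log_2 p_X(x_i) \quad \text{bits}.
\]
For example, a symbol with probability $1/2$ carries $1$ bit of information, while a rarer symbol with probability $1/8$ carries $3$ bits, in line with the intuition that rare messages convey more information.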
Consider a discrete source whose output is a sequence of symbols $x_i$, $i = 1, 2, \ldots, M$, chosen from a finite set $\{x_i\}_{i=1}^{M}$, the alphabet of the source. The message symbols are emitted from the source with probability distribution $p_X(x_i)$. The discrete message source can therefore be modeled mathematically as a discrete random process, that is, a sequence of random variables taking values from this set with probability distribution $p_X(x_i)$. Now let the source select and transmit a message