Jaume Riba and Gregori Vázquez

February 16, 2012
Contents
0.1  Scope of the course . . . . . . . . . . . . . . . . . . . . . .   3
0.2  Bibliography . . . . . . . . . . . . . . . . . . . . . . . . .    5

1  Capacity                                                            8
   1.1  A Definition of Information . . . . . . . . . . . . . . . .    8
        1.1.1   The discrete memory-less source . . . . . . . . . .    9
        1.1.2   Measure of information  . . . . . . . . . . . . . .    9
        1.1.3   Entropy . . . . . . . . . . . . . . . . . . . . . .   11
        1.1.4   A fundamental inequality  . . . . . . . . . . . . .   13
        1.1.5   Maximizing entropy  . . . . . . . . . . . . . . . .   14
        1.1.6   Joint entropy of two sources of information . . . .   15
        1.1.7   Entropy for random vectors  . . . . . . . . . . . .   16
        1.1.8   Uniquely-decodable codes, prefix-free codes and the
                Kraft-McMillan inequality . . . . . . . . . . . . .   17
        1.1.9   Source coding theorem . . . . . . . . . . . . . . .   21
        1.1.10  Main plot and conclusions  . . . . . . . . . . . . .  32
        1.1.11  Self-evaluation . . . . . . . . . . . . . . . . . .   32
   1.2  Mutual Information and Channel Capacity  . . . . . . . . . .  33
        1.2.1   The discrete memory-less channel  . . . . . . . . .   33
        1.2.2   Mutual information . . . . . . . . . . . . . . . . .  34
        1.2.3   Intermediate concepts: average conditional entropy
                (randomness and equivocation) . . . . . . . . . . .   35
        1.2.4   Complexity + Anticipation = Uncertainty + Action . .  36
        1.2.5   Interpretation as expectation of a random variable .  36
        1.2.6   Mutual information for random vectors . . . . . . .   40
        1.2.7   Capacity  . . . . . . . . . . . . . . . . . . . . .   41
        1.2.8   Channel coding theorem  . . . . . . . . . . . . . .   42
        1.2.9   Self-evaluation . . . . . . . . . . . . . . . . . .   51
   1.3  Continuous Time/Amplitude Channels . . . . . . . . . . . . .  52
        1.3.1   The AWGN channel  . . . . . . . . . . . . . . . . .   53
        1.3.2
        1.3.3
        1.3.4
        1.3.5
        1.3.6
        1.3.7