Elements of Information Theory, Second Edition
Solutions to Problems
Thomas M. Cover
Joy A. Thomas
October 17, 2006
COPYRIGHT 2006
Thomas Cover
Joy Thomas
All rights reserved
Contents
1  Introduction
2  Entropy, Relative Entropy and Mutual Information
3  The Asymptotic Equipartition Property
4  Entropy Rates of a Stochastic Process
5  Data Compression
6  Gambling and Data Compression
7  Channel Capacity
8  Differential Entropy
9  Gaussian Channel
10 Rate Distortion Theory
11 Information Theory and Statistics
12 Maximum Entropy
13 Universal Source Coding
14 Kolmogorov Complexity
15 Network Information Theory
16 Information Theory and Portfolio Theory
17 Inequalities in Information Theory
Preface
Here we have the solutions to all the problems in the second edition of Elements of Information
Theory. First a word about how the problems and solutions were generated.
The problems arose over the many years the authors taught this course. At first, the homework problems and exam problems were generated fresh each week. After a few years of this double duty, the homework problems were rolled forward from previous years and only the exam problems were new. So each year, the midterm and final exam problems became candidates for addition to the body of homework problems that you see in the text. The exam problems are necessarily brief, make a single point, and are reasonably free of time-consuming calculation, so the problems in the text for the most part share these properties.
The solutions to the problems were generated by the teaching assistants and graders for the weekly homework assignments and handed back with the graded homework in the class immediately following the date the assignment was due. Homework was optional and did not enter into the course grade. Nonetheless, most students did the homework. A list of the many students who contributed to the solutions is given below.