Information and Coding Theory - Autumn 2017

TTIC 31200/CMSC 37220

M W 1:30-2:50, TTIC Room 526

Discussion: Th 4-5 pm, TTIC Room 530

This course is an introduction to basic concepts in information theory and error-correcting codes, and to some of their applications in computer science and statistics. We plan to cover the following topics:

• Introduction to entropy and source coding. Some applications of entropy to counting problems.
• Mutual information and KL-divergence. Method of types and hypothesis testing.
• I-projections, maximum entropy, exponential families and applications.
• Introduction to error-correcting codes. Unique and list decoding of Reed-Solomon and Reed-Muller codes.
• Applications of information theory to problems in theoretical computer science.
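As a small taste of the first topic (not course material, just an illustrative sketch): Shannon entropy measures the average number of bits needed to describe a random variable, and Kraft's inequality says a prefix-free binary code with codeword lengths l_1, ..., l_n exists iff the sum of 2^(-l_i) is at most 1. Both are easy to check numerically:

```python
import math

def entropy(p):
    """Shannon entropy H(p) = -sum_i p_i log2 p_i, in bits."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def kraft_sum(lengths):
    """Kraft sum: sum over codeword lengths l of 2^(-l).
    A prefix-free binary code with these lengths exists iff the sum is <= 1."""
    return sum(2.0 ** -l for l in lengths)

# A fair coin carries exactly one bit of entropy.
print(entropy([0.5, 0.5]))    # 1.0
# Lengths 1, 2, 2 (e.g. the codewords 0, 10, 11) satisfy Kraft's inequality with equality.
print(kraft_sum([1, 2, 2]))   # 1.0
```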

The course will have 4-5 homework assignments (60 percent of the grade) and a final exam (40 percent).

There is no textbook for this course. A useful reference is "Elements of Information Theory" by T. M. Cover and J. A. Thomas. Also take a look at the resources section below.

Homeworks and Announcements

Lecture Plan and Notes

• 9/25: Entropy of a random variable. Prefix-free codes and Kraft's inequality.
[Notes]
• 9/27: Conditional and joint entropy. Subadditivity of entropy and combinatorial applications. Fundamental source coding theorem.
[Notes]
• 10/2: Proof of Shearer's lemma. Some more combinatorial and number-theoretic applications of entropy. Mutual information.
[Notes]
• 10/4: NO CLASS
• 10/9: Data processing inequality, KL-divergence, total-variation distance, Pinsker's inequality.
[Notes]
• 10/11: Pinsker's inequality (contd), lower bound for distinguishing coins. Differential entropy and KL-divergence.
[Notes]
• 10/16: Method of types, large deviations and Sanov's theorem.
[Notes]
• 10/18: Hypothesis testing, Neyman-Pearson lemma, Fano's inequality.
[Notes]
• 10/23: Minimax rate bounds and reduction to hypothesis testing, Le Cam's method.
[Notes]
• 10/25: Minimax rate bounds via multiple hypothesis testing, covering and packing numbers.
[Notes]
• 10/30: I-projections and their properties, maximum entropy distributions.
[Notes]
• 11/1: Matrix scaling. Introduction to error-correcting codes, linear codes.
[Notes]
• 11/6: Linear codes (contd), elementary bounds on codes, Reed-Solomon codes and the Berlekamp-Welch decoding algorithm.
[Notes]
• 11/8: (Guest lecture by Prof. Razborov) A short introduction to communication complexity
• 11/13: (Guest lecture by Prof. Razborov) Information complexity
• 11/15: (Guest lecture by Prof. Ohannessian) The Minimum Description Length principle
• 11/20: List decoding of Reed-Solomon codes, Reed-Muller codes
[Notes]
• 11/22: Distance of Reed-Muller codes, local decoding, concatenated codes
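Several lectures above (10/9 and 10/11) revolve around Pinsker's inequality, which bounds total-variation distance by KL-divergence: TV(P, Q) <= sqrt(D(P||Q)/2), with D in nats. A quick numerical sketch (illustrative only, with a distribution pair chosen here as an example) of the quantities involved:

```python
import math

def kl_divergence(p, q):
    """KL divergence D(P || Q) in nats, assuming supp(P) is contained in supp(Q)."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def total_variation(p, q):
    """Total variation distance: half the L1 distance between the distributions."""
    return 0.5 * sum(abs(pi - qi) for pi, qi in zip(p, q))

# Pinsker's inequality: TV(P, Q) <= sqrt(D(P || Q) / 2).
p, q = [0.5, 0.5], [0.75, 0.25]
tv = total_variation(p, q)
bound = math.sqrt(kl_divergence(p, q) / 2)
print(tv, bound)   # 0.25 <= ~0.268, so the inequality holds for this pair
```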

Resources