Information and Coding Theory - Winter 2025


TTIC 31200/CMSC 37220

T Th 11-12:20 (TTIC 530)

Discussion: F 2-3 pm (TTIC 530)

Office hours: Th 12:30-1:30 pm (TTIC 505)

Instructor: Madhur Tulsiani

TA: Kavya Ravichandran

This course is meant to serve as an introduction to some basic concepts in information theory and error-correcting codes, and some of their applications in computer science and statistics. We plan to cover the following topics:

  • Introduction to entropy and source coding. Some applications of entropy to counting problems.
  • Mutual information and KL-divergence. Method of types and hypothesis testing. Minimax rate bounds.
  • I-projections, maximum entropy, exponential families and applications.
  • Introduction to error-correcting codes. Unique and list decoding of Reed-Solomon and Reed-Muller codes.
  • Applications of information theory to problems in theoretical computer science.

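As a small taste of the first topic, the sketch below (illustrative, not course material) computes Shannon entropy in bits and checks Kraft's inequality, which characterizes the codeword lengths achievable by a prefix-free binary code:

```python
from math import log2

def entropy(p):
    """Shannon entropy H(p) in bits, ignoring zero-probability outcomes."""
    return -sum(q * log2(q) for q in p if q > 0)

def kraft_sum(lengths):
    """Kraft's inequality: a prefix-free binary code with codeword
    lengths l_1, ..., l_n exists iff sum_i 2^{-l_i} <= 1."""
    return sum(2.0 ** -l for l in lengths)

# A fair coin has entropy exactly 1 bit.
print(entropy([0.5, 0.5]))        # 1.0

# Lengths of the prefix-free code {0, 10, 110, 111}: the Kraft sum is
# exactly 1, so no codeword can be shortened without breaking the code.
print(kraft_sum([1, 2, 3, 3]))    # 1.0
```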
The course will have four homeworks (60 percent of the grade) and a final (40 percent).


There is no textbook for this course, since no single book covers all of the topics discussed. A very useful reference is "Elements of Information Theory" by T. M. Cover and J. A. Thomas. We will also post links to other materials in the resources section below.



Homeworks and Announcements




Lecture Plan and Notes


  • 1/7: Reminder on convexity. Entropy, prefix-free codes and Kraft's inequality.
    [Slides]   [Notes]
  • 1/9: Conditional and joint entropy. Subadditivity of entropy and combinatorial applications. Shearer's lemma.
    [Slides]   [Notes]
  • 1/14: Mutual information. Data-processing inequality and Fano's inequality.
    [Slides]   [Notes]
  • 1/16: Graph entropy. KL-divergence and some properties.
    [Slides]   [Notes]

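To accompany the 1/16 lecture, here is a small numerical sketch (illustrative only, not from the lecture notes) of KL-divergence and its nonnegativity (Gibbs' inequality):

```python
from math import log2

def kl_divergence(p, q):
    """D(p || q) in bits; assumes q[i] > 0 wherever p[i] > 0."""
    return sum(pi * log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.5]
q = [0.9, 0.1]

# The divergence of a distribution from itself is zero.
print(kl_divergence(p, p))        # 0.0

# Gibbs' inequality: D(p || q) >= 0, with equality iff p = q.
print(kl_divergence(p, q) >= 0)   # True

# KL-divergence is not symmetric in its arguments.
print(kl_divergence(p, q) == kl_divergence(q, p))  # False
```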

Resources