Information and Coding Theory - Winter 2021


TTIC 31200/CMSC 37220

T Th 1:00-2:20 pm (Zoom)

Discussion: F 2-3 pm (Gather)

Office hours: W 1-2 pm (Gather)

Instructor: Madhur Tulsiani

TA: Goutham Rajendran

This course is meant to serve as an introduction to some basic concepts in information theory and error-correcting codes, and some of their applications in computer science and statistics. We plan to cover the following topics:

  • Introduction to entropy and source coding. Some applications of entropy to counting problems.
  • Mutual information and KL-divergence. Method of types and hypothesis testing. Minimax rate bounds.
  • I-projections, maximum entropy, exponential families and applications.
  • Introduction to error-correcting codes. Unique and list decoding of Reed-Solomon and Reed-Muller codes.
  • Applications of information theory to problems in theoretical computer science.

The course will have 4-5 homeworks (60 percent of the grade) and a final exam (40 percent).


There is no textbook for this course. A useful reference is "Elements of Information Theory" by T. M. Cover and J. A. Thomas. Also take a look at the Resources section below.


Access information for lectures, discussions, and office hours is available via the Canvas page for the course.



Homeworks and Announcements




Lecture Plan and Notes


  • 1/12: Entropy of a random variable. Prefix-free codes and Kraft's inequality.
    [Jamboard]   [Notes]
  • 1/14: Conditional and joint entropy. Subadditivity of entropy and combinatorial applications. Fundamental source coding theorem.
    [Jamboard]   [Notes]
  • 1/19: Shearer's lemma and applications. Mutual information and the data processing inequality.
    [Jamboard]   [Notes]
  • 1/21: Fano's inequality. Graph entropy. KL-divergence and Pinsker's inequality.
    [Jamboard]   [Notes]
  • 1/26: Lower bounds for distinguishing biased coins, and for two-armed bandits.
    [Jamboard]   [Notes]
  • 1/28: Differential entropy and KL-divergence.
    [Jamboard]   [Notes]
  • 2/2: Method of types, large deviations and Sanov's theorem.
    [Jamboard]   [Notes]
  • 2/4: Hypothesis testing.
    [Jamboard]   [Notes]
  • 2/9: Minimax rate bounds.
    [Jamboard]   [Notes]
  • 2/11: Sparse mean estimation. I-projections and their properties, maximum entropy distributions.
    [Jamboard]   [Notes]
  • 2/16: Matrix scaling. Channel coding.
    [Jamboard]   [Notes]
  • 2/18: Linear codes. Explicit constructions via entropy polarization.
    [Jamboard]   [Notes]
  • 2/23: Adversarial error model, Reed-Solomon codes.
    [Jamboard]   [Notes]
  • 2/25: List decoding of Reed-Solomon codes. Basics of Reed-Muller codes.
    [Jamboard]   [Notes]
  • 3/2: Local decoding of Reed-Muller codes. Introduction to communication complexity.
    [Jamboard]  
  • 3/4: Communication lower bounds for Disjointness.
    [Jamboard]  
  • 3/9: Algorithmic Lovász Local Lemma via entropy compression.
    [Jamboard]  
  • 3/11: Reed-Muller codes achieve capacity for erasure channels.
    [Jamboard]  


Resources