This course is an introduction to some basic concepts in information theory
and error-correcting codes, and to some of their applications in computer
science and statistics. We plan to cover the following topics:
- Introduction to entropy and source coding. Some applications of entropy to
  counting problems. (A short illustrative sketch of entropy and KL-divergence
  appears after this list.)
- Mutual information and KL-divergence. Method of types and hypothesis
testing. Minimax rate bounds.
- I-projections, maximum entropy, exponential families and applications.
- Introduction to error-correcting codes. Unique and list decoding of
  Reed-Solomon and Reed-Muller codes. (A toy Reed-Solomon sketch appears
  after this list.)
- Applications of information theory to problems in theoretical computer science.
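
As a small preview of the first two topics, here is a minimal Python sketch
(ours, not part of the course materials; the function names are our own)
of the two central quantities, Shannon entropy and KL-divergence, both
measured in bits:

    import math

    def entropy(p):
        """Shannon entropy H(p) = -sum_x p(x) log2 p(x), in bits."""
        return -sum(px * math.log2(px) for px in p if px > 0)

    def kl_divergence(p, q):
        """KL-divergence D(p||q) = sum_x p(x) log2(p(x)/q(x)), in bits."""
        d = 0.0
        for px, qx in zip(p, q):
            if px == 0:
                continue         # 0 * log(0/q) = 0 by convention
            if qx == 0:
                return math.inf  # p puts mass where q has none
            d += px * math.log2(px / qx)
        return d

    print(entropy([0.5, 0.5]))                    # fair coin: 1.0 bit
    print(entropy([0.9, 0.1]))                    # biased coin: ~0.469 bits
    print(kl_divergence([0.5, 0.5], [0.9, 0.1]))  # ~0.737 bits

Note that D(p||q) >= 0 with equality iff p = q (Gibbs' inequality).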
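For the coding-theory part, the following toy sketch (again ours, and
deliberately simplified) shows Reed-Solomon encoding over a prime field
GF(p): a message of k symbols is read as the coefficients of a polynomial of
degree less than k, and the codeword is that polynomial's evaluations at n
distinct points. Any k surviving evaluations recover the message by Lagrange
interpolation, so the sketch handles erasures only; decoding from errors,
uniquely or via lists, is what the course will cover.

    P = 257  # prime modulus; we work over the field GF(257)

    def rs_encode(msg, n):
        """Codeword = evaluations of the polynomial with coefficients
        msg (lowest degree first) at the points 0, 1, ..., n-1."""
        assert len(msg) <= n <= P
        return [sum(c * pow(x, i, P) for i, c in enumerate(msg)) % P
                for x in range(n)]

    def rs_decode_erasures(points):
        """Recover the k coefficients from any k surviving (x, y) pairs
        by Lagrange interpolation over GF(P)."""
        k = len(points)
        coeffs = [0] * k
        for j, (xj, yj) in enumerate(points):
            basis = [1]  # coeffs of prod_{m != j} (x - x_m), low degree first
            denom = 1    # prod_{m != j} (x_j - x_m)
            for m, (xm, _) in enumerate(points):
                if m == j:
                    continue
                # multiply the basis polynomial by (x - xm)
                basis = ([(-xm * basis[0]) % P]
                         + [(basis[i - 1] - xm * basis[i]) % P
                            for i in range(1, len(basis))]
                         + [basis[-1]])
                denom = denom * (xj - xm) % P
            scale = yj * pow(denom, P - 2, P) % P  # division via Fermat inverse
            for i in range(k):
                coeffs[i] = (coeffs[i] + scale * basis[i]) % P
        return coeffs

    msg = [42, 7, 19]                     # k = 3 message symbols
    code = rs_encode(msg, 7)              # n = 7; survives any 4 erasures
    survivors = [(1, code[1]), (4, code[4]), (6, code[6])]
    assert rs_decode_erasures(survivors) == msg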
The course will have 4-5 homework assignments, worth 60 percent of the grade
in total, and a final exam worth the remaining 40 percent.
There is no textbook for this course, since no single book covers all the
topics we will discuss. A very useful reference is "Elements of Information
Theory" by T. M. Cover and J. A. Thomas. We will also post links to other
materials in the resources section below.
Lectures will be streamed via Panopto, and office hours can also be joined
via Zoom. Please see the course's Canvas page for links.