Tuesdays and Thursdays 10:30-11:50 AM in TTIC 530
Instructor: Nati Srebro.
TA: Jialei Wang, Email: jialei@uchicago.edu
Office hours: Fridays 1:40-3:00 PM, TTIC Library
The purpose of this course is to gain a deeper understanding of machine learning by formalizing learning mathematically, studying both statistical and computational aspects of learning, and understanding how these two aspects are inseparable. The course is intended both for students interested in using machine learning methods who would like to understand such methods better so as to use them more effectively, and for students interested in the mathematical aspects of learning or who intend to rigorously study or develop learning algorithms.
We will discuss classic results and recent advances in statistical learning theory (mostly under the agnostic PAC model), touch on computational learning theory, and also explore the relationship with stochastic optimization and online regret analysis. Our emphasis will be on concept development and on obtaining a rigorous quantitative understanding of machine learning. We will also study techniques for analyzing and proving performance guarantees for learning methods.
Week of | Tuesday | Thursday |
---|---|---|
September 27th | What is Learning? | PAC Learning and VC Theory I |
October 3rd | PAC Learning and VC Theory II | MDL and PAC-Bayes |
October 10th | Computational Complexity of Learning | Proper vs Improper Learning |
October 17th | Agnostic Learning | Boosting and Compression Schemes |
October 24th | Real-Valued Loss | Scale-Sensitive Classes |
October 31st | SVMs, L1 Regularization, Boosting | Regularized Learning, Stability |
November 7th | Online Learning | FTRL, OGD |
November 14th | Online Dual Averaging, Mirror Descent | Online to Batch, Stochastic Optimization |
November 21st | Optimistic Rates, KNN | No Class |
November 28th | Neural Networks, Course Summary | No Class |