1 |
March 27 |
- Introduction to Deep Learning
- Course logistics
|
|
|
1 |
March 29 |
|
|
|
2 |
April 3 |
- Feedforward Deep Networks
- Backpropagation Start
- Assignment 1 Out (on Canvas)
|
|
|
2 |
April 5 |
|
|
- Suggested Reading: Chapter 6 of Goodfellow, Bengio and Courville
- Suggested Reading: Chapter 2 of Nielsen
- Suggested Reading: Calculus on Computational Graphs, Chris Olah
- Suggested Reading: Andrej Karpathy's notes (linked below in "Additional Linkage")
|
3 |
April 10 |
|
|
- Suggested Reading: Chapter 7 of Goodfellow, Bengio and Courville
- Suggested Reading: Sparsity and the LASSO, Tibshirani and Wasserman
- Suggested Reading: Regularization and Complexity Control in Neural Networks, Chris Bishop
- Suggested Reading: Bagging Regularizes, Poggio et al.
- Suggested Reading: Dropout: A Simple Way to Prevent Neural Networks from Overfitting, Srivastava et al.
- Suggested Reading: Adversarial Perturbations of Deep Neural Networks, Warde-Farley and Goodfellow
- Further Reading: Training with Noise is Equivalent to Tikhonov Regularization, Chris Bishop
- Further Reading: Rank, Trace-Norm and Max-Norm, Srebro and Shraibman
- Reference: Statistical Learning and Sparsity, Hastie, Tibshirani and Wainwright
|
3 |
April 12 |
- Optimization Methods
- SGD, Momentum method
- Adaptive first order methods
- Batch Normalization
- Initialization Strategies
- Polyak Averaging
- Assignment 1 Due (April 13; 23:59)
|
|
|
4 |
April 17 |
- Convolutional Neural Networks
- Assignment 2 Out (April 16)
- Quiz 2
|
|
|
4 |
April 19 |
- Visualizing Convolutional Networks
- Variants (Locally Connected Networks, Tiled CNNs, Dilated Convolutions)
- Motivations: Neuroscientific
- Motivations: Efficiency, Equivariance, Invariance, Parameter Tying
- Project Proposals Due at midnight
|
|
- Reference: Models of Visual Cortex, Poggio and Serre
- Reference: Stochastic Neighbor Embedding, G. E. Hinton and S. Roweis
- Reference: Visualizing Data using t-SNE, L. van der Maaten and G. E. Hinton
- Reference: How to use t-SNE effectively, Distill
- Reference: Visualizing and Understanding Convolutional Networks, M. D. Zeiler and R. Fergus
- Reference: Striving for Simplicity: The All Convolutional Net, J. T. Springenberg, A. Dosovitskiy, T. Brox, M. Riedmiller
- Reference: Deep Inside Convolutional Networks: Visualising Image Classification Models and Saliency Maps, K. Simonyan, A. Vedaldi, A. Zisserman
- Reference: Understanding Neural Networks Through Deep Visualization, J. Yosinski, J. Clune, A. Nguyen, T. Fuchs, H. Lipson
- Reference: Inceptionism: Going Deeper into Neural Networks, Google Research Blog
- Reference: Multi-Scale Context Aggregation by Dilated Convolutions, F. Yu and V. Koltun
- Classic Paper: Neocognitron: A Hierarchical Neural Network Capable of Visual Pattern Recognition, K. Fukushima
|
5 |
April 24 |
- Ultra Deep Architectures
- Residual Networks
- ResNet in ResNet
- Highway Networks
- Fractal Networks
- Some Analysis of Residual Learning
- DenseNets
- Similarity Learning and Siamese Networks
- Assignment 2 Due (April 27; 23:59)
|
|
|
5 |
April 26 |
- CNNs on Graphs Introduction
- CNNs on Non-Euclidean Domains
- Locally Connected Networks
- Spectral Networks
- Graph Embedding: Basics
- Message Passing Neural Networks
- Quiz 3
|
|
- Suggested Reading: Spectral Networks and Locally Connected Networks on Graphs, J. Bruna, W. Zaremba, A. Szlam, Y. LeCun
- Suggested Reading: Deep Convolutional Networks on Graph-Structured Data, M. Henaff, J. Bruna, Y. LeCun
- Further Reading: Convolutional Neural Networks on Graphs with Fast Localized Spectral Filtering, M. Defferrard, X. Bresson, P. Vandergheynst
- Further Reading: Geometric deep learning: going beyond Euclidean data, M. M. Bronstein, J. Bruna, Y. LeCun, A. Szlam, P. Vandergheynst
- Suggested Reading: Neural Message Passing for Quantum Chemistry, J. Gilmer, S. S. Schoenholz, P. F. Riley, O. Vinyals, G. E. Dahl
- Suggested Reading: Convolutional Networks on Graphs for Learning Molecular Fingerprints, D. Duvenaud, D. Maclaurin, J. Aguilera-Iparraguirre, R. Gómez-Bombarelli, T. Hirzel, A. Aspuru-Guzik, R. P. Adams
- Suggested Reading: Gated Graph Sequence Neural Networks, Y. Li, D. Tarlow, M. Brockschmidt, R. Zemel
|
6 |
May 1 |
- Sequence Learning with NNs
- Recurrent Neural Networks
- Design Patterns for RNNs
- Intro to long-term dependencies and gated architectures
|
|
|
6 |
May 3 |
- Recap of RNNs
- Challenges with Long-Term Dependencies
- Reservoir Computing (basic idea)
- Long Short-Term Memory and GRU
- Attention Models: Introduction using applications in Machine Translation and Caption Generation
|
|
|
7 |
May 8 |
|
|
|
7 |
May 10 |
- Neural Networks with External Memory
- Memory Networks
- Neural Turing Machines
- Differentiable Neural Computers
|
- slides
- slides (flat)
- (slides on Canvas for now)
|
|
8 |
May 15 |
- Mid-Term Solutions Review
- Introduction to Deep Unsupervised Learning
- Linear Projections and Linear Autoencoders
- Sparse Coding
- Autoencoders
- Assignment 3 Due (May 16; 23:59)
|
- slides
- slides (flat)
- (slides on Canvas for now)
|
|
8 |
May 17 |
- Recap of PCA, ICA, Sparse Coding
- Distributed versus Localist Representations
- Autoencoders wrap-up (sans training)
- Quiz 4
|
- slides
- slides (flat)
- (slides on Canvas for now)
|
|
9 |
May 22 |
- Linear Factor Models and Probabilistic PCA as motivation for Deep Factor Models
- Energy Based Methods
- Sigmoid Belief Networks and Deep Belief Networks
- Boltzmann Machines and Restricted Boltzmann Machines set-up
|
|
|
9 |
May 24 |
- Linear Factor Models, SBNs, DBNs, RBMs recap
- Approximate Learning in Restricted Boltzmann Machines
- Contrastive Divergence
- Boltzmann Machines
- Implicit density models: Intro
|
- slides
- slides (flat)
- (slides on Canvas for now)
|
|
10 |
May 29 |
- Memorial Day holiday: No Class
|
|
|
10 |
May 31 |
- Implicit density models
- Generative adversarial networks
- Wasserstein GANs
- Bonus topic: AlphaGo
- Quiz 5
|
- slides
- slides (flat)
- (slides on Canvas for now)
|
|
11-12 |
June 6-10 |
- Project presentations and discussions
- Final project report and code due (June 10; 23:59)
|
|
|