Omar Montasser

omar@ttic.edu

Hi! I am a fourth-year PhD student at the Toyota Technological Institute at Chicago, advised by Professor Nathan Srebro. I am interested in the theory of machine learning. Recently, I have been thinking about questions related to adversarially robust learnability.

Before TTIC, I completed a five-year combined BS/MS program in computer science and engineering at Penn State. During my time there, I had the pleasure of working with Professors Daniel Kifer and Sean Hallgren on problems in machine learning and quantum computational complexity.

Publications

Adversarially Robust Learning with Unknown Perturbation Sets
Omar Montasser, Steve Hanneke, and Nathan Srebro
Preprint, February 2021.

Beyond Perturbations: Learning Guarantees with Arbitrary Adversarial Test Examples
Shafi Goldwasser, Adam Tauman Kalai, Yael Tauman Kalai, and Omar Montasser
NeurIPS, 2020 (Spotlight).

Reducing Adversarially Robust Learning to Non-Robust PAC Learning
Omar Montasser, Steve Hanneke, and Nathan Srebro
NeurIPS, 2020.

Efficiently Learning Adversarially Robust Halfspaces with Noise
Omar Montasser, Surbhi Goel, Ilias Diakonikolas, and Nathan Srebro
ICML, 2020.

Approximate is Good Enough: Probabilistic Variants of Dimensional and Margin Complexity
Pritish Kamath, Omar Montasser, and Nathan Srebro
COLT, 2020.

VC Classes are Adversarially Robustly Learnable, but Only Improperly
Omar Montasser, Steve Hanneke, and Nathan Srebro
COLT, 2019. Best Student Paper Award!

Predicting Demographics of High-Resolution Geographies with Geotagged Tweets
Omar Montasser and Daniel Kifer
AAAI, 2017 (Oral).

Teaching