EECS 496 - Computational Learning Theory
Northwestern University
Winter 2019


This course will introduce some of the central topics in computational learning theory, a field that approaches the question "Can machines learn?" from the perspective of theoretical computer science. We will study well-defined mathematical models of learning in which it is possible to give precise and rigorous analyses of learning problems and algorithms. A major focus of the course will be the computational efficiency of learning in these models. We will develop some provably efficient algorithms and explain why such provably efficient algorithms are unlikely to exist for other models.

We will only cover topics that permit a mathematically rigorous analysis, so mathematical maturity is absolutely necessary. In particular, some familiarity with basic probability (such as linearity of expectation) and with basic linear algebra will be assumed. The emphasis of the course will be on proofs, so if you take this course, you should enjoy proofs and math.

Example topics include inductive inference, query learning, PAC learning and VC theory, Occam's razor, online learning, boosting, support vector machines, bandit algorithms, statistical queries, Rademacher complexity, and neural networks.

This course satisfies the Theory Depth Requirement.

Basic Information

Instructor: Lev Reyzin
Syllabus: TBD
Time and Location: T,Th 9:30-10:50am; location TBD
Required Textbook: Mehryar Mohri, Afshin Rostamizadeh, and Ameet Talwalkar. Foundations of Machine Learning, 2nd edition
Optional Textbook: Shai Shalev-Shwartz and Shai Ben-David. Understanding Machine Learning: From Theory to Algorithms
Office Hours: TBD

Exam Dates

Problem Sets

Lectures and Readings

Note: the lectures will include material not covered in the readings.