Units: 3-0-9 H,G
Class Times: Monday and Wednesday, 1:00 pm - 2:30 pm
Location: 46-3002 (Singleton Auditorium)
Instructors: Tomaso Poggio (TP), Lorenzo Rosasco (LR), Georgios Evangelopoulos (GE)
TAs: Michael Lee, Amauche Emenari, Andres Campero-Nunez
Office Hours: Friday 1:00 pm - 2:00 pm, 46-5156 (Poggio lab lounge) and/or 46-5165 (MIBR Reading Room)
Email Contact: 9.520@mit.edu
Previous Class: FALL 2016, 2015 lecture videos
Registration: Please register for 9.520/6.860 by filling out this registration form.
Mailing list: Registered students will be added to the course mailing list (9520students).
Stellar page: http://stellar.mit.edu/S/course/9/fa17/9.520/

Course description
The course covers the foundations and recent advances in Machine Learning from the point of view of Statistical Learning and Regularization Theory.
Understanding intelligence and how to replicate it in machines is arguably one of the greatest problems in science. Learning, its principles and computational implementations, is at the very core of intelligence. During the last decade, for the first time, we have been able to develop artificial intelligence systems that can solve complex tasks which were until recently the exclusive domain of biological organisms, such as computer vision, speech recognition, and natural language understanding: cameras recognize faces, smartphones understand voice commands, smart speakers/assistants answer questions, and cars can see and avoid obstacles.
The machine learning algorithms at the root of these success stories are trained with labeled examples rather than programmed to solve a task. Among the approaches in modern machine learning, the course focuses on regularization techniques, which provide a theoretical foundation for high-dimensional supervised learning. Besides classic approaches such as Support Vector Machines, the course covers state-of-the-art techniques exploiting sparsity or data geometry (aka manifold learning), a variety of algorithms for supervised learning (batch and online), feature selection, structured prediction, multi-task learning, and principles for designing or learning data representations. Concepts from optimization theory useful for machine learning are covered in some detail (first-order methods, proximal/splitting techniques, ...).
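As a concrete illustration of the regularization viewpoint, here is a minimal MATLAB/Octave sketch of kernel regularized least squares (Tikhonov regularization) with a Gaussian kernel. The toy data and the choices of the kernel width sigma and the regularization parameter lambda are illustrative assumptions, not values prescribed by the course.

    % Kernel regularized least squares (Tikhonov regularization), minimal sketch.
    % Toy data; sigma and lambda below are illustrative assumptions.
    n = 100; d = 2;
    X = randn(n, d);                      % training inputs (n x d)
    y = sign(X(:,1) + 0.3*randn(n, 1));   % noisy binary labels

    sigma = 1.0;                          % Gaussian kernel width (assumption)
    lambda = 0.01;                        % regularization parameter (assumption)

    % Gram matrix: K(i,j) = exp(-||x_i - x_j||^2 / (2*sigma^2))
    sq = sum(X.^2, 2);
    K = exp(-(sq + sq' - 2*(X*X')) / (2*sigma^2));

    % Regularized solution: c solves (K + lambda*n*I) c = y
    c = (K + lambda*n*eye(n)) \ y;

    % Prediction at a test point: f(x) = sum_i c(i) * K(x, x_i)
    x_test = randn(1, d);
    k = exp(-sum((X - x_test).^2, 2) / (2*sigma^2));
    y_pred = sign(k' * c);

Here lambda trades off fitting the data against the smoothness of the solution; in practice it would be chosen by cross-validation.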
The final part of the course will focus on deep learning networks. It will introduce an emerging theory formalizing three key areas for the rigorous characterization of deep learning: approximation theory -- which functions can be represented efficiently?; optimization theory -- how easy is it to minimize the training error?; and generalization properties -- is classical learning theory sufficient for deep learning? It will also outline a theory of hierarchical architectures that aims to explain how to build machines that learn using principles of the cortex, similar to how children learn: from few labeled examples and many more unlabeled ones.
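To make the optimization question above concrete, the following MATLAB/Octave sketch minimizes the training error (empirical risk) of a one-hidden-layer network by plain gradient descent; the architecture, toy data, and step size are assumptions for illustration only.

    % Minimizing the training error of a one-hidden-layer tanh network
    % by gradient descent; toy setup, all choices are assumptions.
    n = 50; d = 2; h = 20;                % samples, input dim, hidden units
    X = randn(n, d);
    y = sin(X(:,1));                      % toy regression target
    W1 = 0.5*randn(h, d);                 % input-to-hidden weights
    w2 = 0.5*randn(h, 1);                 % hidden-to-output weights
    eta = 0.1;                            % step size (assumption)
    for t = 1:2000
      A = tanh(W1 * X');                  % hidden activations (h x n)
      r = (w2' * A)' - y;                 % residuals (n x 1)
      L = mean(r.^2);                     % empirical risk (training error)
      g2 = (2/n) * A * r;                 % gradient w.r.t. w2
      G1 = (2/n) * ((w2 * r') .* (1 - A.^2)) * X;  % gradient w.r.t. W1
      w2 = w2 - eta * g2;
      W1 = W1 - eta * G1;
    end

Whether and why such non-convex minimization reliably reaches low training error is exactly the kind of question the optimization part of the theory addresses.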
The goal of the course is to provide students with the theoretical knowledge and the basic intuitions needed to use and develop effective machine learning solutions to challenging problems.

Prerequisites
We will make extensive use of basic notions of calculus, linear algebra and probability. The essentials are covered in class and in the math camp material. We will introduce a few concepts in functional/convex analysis and optimization. Note that this is an advanced graduate course, and some exposure to introductory machine learning concepts or courses is expected. Students are also expected to have basic familiarity with MATLAB/Octave.

Grading
Requirements for grading are attending lectures/participation (10%), four problem sets (60%), and a final project (30%).
Grading policies, pset and project tentative dates: (slides)
Problem Sets
Problem Set 1, out: Sep. 20, due: Sun., Oct. 01 (Class 08).
Problem Set 2, out: Oct. 04, due: Sun., Oct. 15 (Class 11).
Problem Set 3, out: Oct. 25, due: Sun., Nov. 05 (Class 17).
Problem Set 4, out: Nov. 10, due: Tue., Nov. 21 (Class 22).
Submission instructions: Follow the instructions included with the problem set. Use the LaTeX template for the report (there is a maximum page limit). Submit your report online through Stellar by the due date/time, and hand in a printout in the first class after the due date.
Projects
Guidelines and key dates. Online form for project proposal (complete by Nov. 01).
Reports are 1-page extended abstracts using the NIPS style files.
Projects archive
List of Wikipedia entries, created or edited as part of projects during previous course offerings.
Syllabus
Follow the link for each class to find a detailed description, suggested readings, and class slides. Some of the later classes may be subject to reordering or rescheduling.
Class | Date | Title | Instructor(s)
Reading List
Notes covering the classes will be provided in the form of independent chapters of a book currently in draft format. Additional information will be given through the slides associated with classes (where applicable). The books/papers listed below are useful general reference reading, especially from the theoretical viewpoint. A list of additional suggested readings will also be provided separately for each class.

Book (draft)
- L. Rosasco and T. Poggio, Machine Learning: a Regularization Approach, MIT 9.520 Lecture Notes, manuscript, Dec. 2017 (provided).
Primary References
- S. Shalev-Shwartz and S. Ben-David. Understanding Machine Learning: From Theory to Algorithms. Cambridge University Press, 2014.
- T. Hastie, R. Tibshirani and J. Friedman. The Elements of Statistical Learning. 2nd Ed., Springer, 2009.
- I. Steinwart and A. Christmann. Support Vector Machines. Springer, 2008.
- O. Bousquet, S. Boucheron and G. Lugosi. Introduction to Statistical Learning Theory. Advanced Lectures on Machine Learning, LNCS 3176, pp. 169-207, O. Bousquet, U. von Luxburg and G. Rätsch (Eds.), Springer, 2004.
- N. Cristianini and J. Shawe-Taylor. An Introduction to Support Vector Machines and Other Kernel-based Learning Methods. Cambridge University Press, 2000.
- F. Cucker and S. Smale. On The Mathematical Foundations of Learning. Bulletin of the American Mathematical Society, 2002.
- F. Cucker and D-X. Zhou. Learning theory: an approximation theory viewpoint. Cambridge Monographs on Applied and Computational Mathematics. Cambridge University Press, 2007.
- L. Devroye, L. Gyorfi, and G. Lugosi. A Probabilistic Theory of Pattern Recognition. Springer, 1997.
- T. Evgeniou, M. Pontil and T. Poggio. Regularization Networks and Support Vector Machines. Advances in Computational Mathematics, 2000.
- T. Poggio and S. Smale. The Mathematics of Learning: Dealing with Data. Notices of the AMS, 2003.
- V. N. Vapnik. Statistical Learning Theory. Wiley, 1998.
- V. N. Vapnik. The Nature of Statistical Learning Theory. Springer, 2000.
- S. Villa, L. Rosasco, T. Poggio. On Learnability, Complexity and Stability. Empirical Inference: Festschrift in Honor of Vladimir N. Vapnik, Chapter 7, pp. 59-70, Springer-Verlag, 2013.
- T. Poggio and F. Anselmi. Visual Cortex and Deep Networks: Learning Invariant Representations, Computational Neuroscience Series, MIT Press, 2016.
- T. Poggio, H. Mhaskar, L. Rosasco, B. Miranda, and Q. Liao. Why and When can Deep-but not Shallow-Networks Avoid the Curse of Dimensionality: A Review. International Journal of Automation and Computing, 1-17, 2017.
- T. Poggio and Q. Liao. Theory II: Landscape of the Empirical Risk in Deep Learning. CBMM Memo 66, 2017.
Resources and links
- Machine Learning 2017-2018. University of Genoa, graduate ML course.
- L. Rosasco, Introductory Machine Learning Notes, University of Genoa, ML 2016/2017 lecture notes, Oct. 2016.
Announcements
- [11/10] Problem Set 4 is out. Due date is Tue, Nov. 21, 11:59pm. Check mailing list announcement.
- [10/26] Project proposal is due on Nov. 01 -- complete the online form (with title and abstract). Guidelines posted.
- [10/25] Problem Set 3 is out. Due date is Sun, Nov. 05, 11:59pm. Check mailing list announcement.
- [10/04] Problem Set 2 is out. Due date is Sun, Oct. 15, 11:59pm. Check mailing list announcement.
- [09/25] Room change: Class 07 (Wed. Sep 27) will be in room 46-3189.
- [09/20] Problem Set 1 is out. Due date is Sun, Oct. 01, 11:59pm. Check mailing list announcement.
- [09/20] Lecture videos for the current course offering (Fall 2017) are available online (2017 lecture videos).
- [09/12] Initial draft of course notes/book has been released. Check mail for instructions.
- [09/12] Office hours announced: Friday 1:00-2:00 pm, 46-5156 (Poggio lab lounge) and/or 46-5165 (MIBR Reading Room).
- [09/11] Piazza page is now active. Details sent through the mailing list.
- [09/11] The classroom is changing to 46-3002 (Singleton Auditorium) for the rest of the term, starting with Class 02, except Wed. 09/27 and Wed. 10/11 (which will use 46-3189).
- [09/08] Math camp extra class, optional for those interested: Tue. 09/12, 4:00 pm - 5:30 pm, Singleton Auditorium (46-3002).
- [09/06] Slides for the grading requirements and due dates as discussed in Class 01: link.