|Instructor||Nicolas Macris||Instructor||Ruediger Urbanke|
|Office||INR 134||Office||INR 116|
|Office Hours||By appointment||Office Hours||By appointment|
|Teaching Assistant||Chan Chun Lam (email@example.com)||Office||INR 032|
|Teaching Assistant||Kirill Ivanov|
|Teaching Assistant||Clement Luneau|
|Credits||4 ECTS|
Prerequisites:
- Analysis I, II, III
- Linear Algebra
- Machine learning
- Algorithms (CS-250)
Here is a link to official coursebook information.
Some homework will be graded.
If you do not hand in your final exam, your overall grade will be NA. Otherwise, your grade will be determined by the following weighted average: 10% for the homework and 90% for the final exam. You may discuss the graded homework with other people, but you must write up your own solution and list on the first page the people you discussed it with.
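For concreteness, the weighted average works out as in this small sketch (the function name and the example grades are illustrative, not part of the course policy):

```python
def overall_grade(homework, final_exam):
    # Weighted average from the grading policy above:
    # 10% homework, 90% final exam (both grades on the same scale).
    return 0.1 * homework + 0.9 * final_exam

# For example, homework 5.5 and final exam 5.0 give
# 0.1 * 5.5 + 0.9 * 5.0 = 5.05.
grade = overall_grade(5.5, 5.0)
```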
- PAC learning model (based on Chapters 2-7 in Understanding Machine Learning (UML) by Shalev-Shwartz and Ben David)
- Gradient descent (UML and notes by A. Montanari)
- Graphical models (based on Chapters 1-5 and 9-11 in Bayesian Reasoning and Machine Learning by David Barber and Chap 8 in Pattern Recognition and Machine Learning by Christopher Bishop) PGM-Lect-1.pdf PGM-Lect-2.pdf Notes-Message-Passing.pdf PGM-Lect-3.pdf PGM-Lect-4.pdf
- Tensor decomposition (based on the review Introduction to Tensor Decompositions and their Applications in Machine Learning by Rabanser, Shchur, Günnemann. For more advanced material see also: http://people.csail.mit.edu/moitra/docs/bookex.pdf) Tens-Lect-1.pdf Tens-Lect-2.pdf Tens-Lect-3.pdf
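Since gradient descent is on the syllabus, here is a minimal self-contained sketch on a least-squares objective (the objective, step size, and data are illustrative choices, not taken from the course notes):

```python
import numpy as np

# Minimal gradient descent on the least-squares objective
# f(w) = ||Xw - y||^2 / (2n); step size eta and the data below are
# illustrative assumptions.
def gradient_descent(X, y, eta=0.1, steps=500):
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(steps):
        grad = X.T @ (X @ w - y) / n  # gradient of f at w
        w -= eta * grad
    return w

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true                  # noiseless targets, so w_true is recoverable
w_hat = gradient_descent(X, y)
```

On this well-conditioned quadratic the iterates converge linearly to `w_true`.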
(tentative, subject to changes)
|Date||Lecture||Exercises / Homework||Solutions|
|18/2||Chap 3 and 4 (in UML)||3.1; 3.3; 3.7; 3.8; 4.1; 4.2||Solution 1|
|25/2||Chap 5 (in UML)||idem ++||Solution 2|
|4/3||Chap 6 (in UML)||Graded: 5.1; 6.2; 6.5; 6.8; 6.9; 7.3||Solution 3|
|11/3||Chap 7 (in UML)||idem|| |
|18/3||Remainder of Chap 7 and start of Chap 14 (in UML)||Deadline for handing in graded homework: 19/3 during the exercise session||Solution 4|
|25/3||Remainder of Chap 14 (in UML)||2nd graded homework; deadline 16 April|| |
|1/4||“Lecture notes on two-layer neural networks” by A. Montanari||Hand-out of 1st graded homework (lecture and exercise session)|| |
|8/4||Introduction to probabilistic graphical models (Chap 3 and 4 in D. Barber, Chap 8 in C. Bishop)||2nd graded homework continued (exercise 5); deadline is 16 April|| |
|15/4||Factor graphs, marginalization (Chap 4 and 5 in Barber, Chap 8 in Bishop)|| || |
|29/4||Sampling: ancestral sampling for belief networks and MCMC. Learning graphical models (Barber paragraphs 9.3 and 9.6, mostly 9.6.1)||Exercises 6 continued; use the notes on message passing for problems 8, 9, 10|| |
|6/5||Variational Bayes EM, standard EM. Learning graphical models (Barber 11.2, mostly 11.2.1 and 11.2.2, and 11.5)||3rd graded homework; new deadline: May 28|| |
|13/5||Tensor methods (next three classes based on the review): tensor product, rank, Jennrich’s theorem||4th graded homework; new deadline: June 4, in the mailbox in the IPG corridor (INR) or with the assistants|| |
| ||ALS, multilinear rank, Tucker HOSVD|| || |
| ||Applications: GMM, topic models, multiview models. If time permits: power method, whitening|| || |
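Jennrich’s theorem (13/5 lecture) can be illustrated numerically: for a tensor of rank R whose factors are linearly independent, an eigendecomposition built from two random contractions recovers the components. A minimal sketch (all variable names and the random test data are illustrative assumptions, not course material):

```python
import numpy as np

# Build a rank-R tensor T = sum_r a_r (x) b_r (x) c_r with random
# (hence, almost surely linearly independent) factor columns.
rng = np.random.default_rng(0)
R, d = 3, 5
A = rng.standard_normal((d, R))
B = rng.standard_normal((d, R))
C = rng.standard_normal((d, R))
T = np.einsum('ir,jr,kr->ijk', A, B, C)

# Contract the third mode with two random vectors:
# M1 = A diag(C^T x) B^T  and  M2 = A diag(C^T y) B^T.
x, y = rng.standard_normal(d), rng.standard_normal(d)
M1 = np.einsum('ijk,k->ij', T, x)
M2 = np.einsum('ijk,k->ij', T, y)

# M1 M2^+ = A diag((C^T x) / (C^T y)) A^+, so its nonzero eigenvalues are
# the ratios (C^T x)_r / (C^T y)_r, and the corresponding eigenvectors are
# the columns of A up to scaling and permutation.
eigvals, eigvecs = np.linalg.eig(M1 @ np.linalg.pinv(M2))
```

The simultaneous-diagonalization step is exactly why the decomposition is unique under Jennrich’s linear-independence conditions.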
Textbooks and notes:
- Understanding Machine Learning by Shalev-Shwartz and Ben David
- Bayesian Reasoning and Machine Learning by David Barber (Cambridge)
- Pattern Recognition and Machine Learning by Christopher Bishop (Springer)
- Introduction to Tensor Decompositions and their Applications in Machine Learning (Rabanser, Shchur, Günnemann)
- Probability on Graphs: Random Processes on Graphs and Lattices by Geoffrey Grimmett (Cambridge) [Chap 7]