6 ECTS.
(Taught in English.) The aim of this course is to provide students with the fundamental concepts and tools for developing and analyzing machine learning algorithms. The course introduces the theoretical foundations of machine learning, reviews the most successful algorithms along with their theoretical guarantees, and discusses their application to real-world problems. The covered topics are:
- Introduction to the different paradigms of ML and applications
- Bayes' rule, MLE, MAP
- The fully Bayesian setting
- Computational learning theory
- Empirical risk minimization (ERM)
- Universal consistency
- ERM and ill-posed problems
- Bias-variance tradeoff
- Regularization
- PAC model and sample complexity
- MDL and sample compression bounds
- VC-dimension and Sauer’s lemma
- Rademacher complexity
- Overfitting and regularization
- Structural risk minimization (SRM)
- Online learning
- Model selection, cross validation
- Supervised learning
- k-NN, naive Bayes
- Logistic regression and beyond
- Perceptron
- Kernelized perceptron and SVM
- Kernel methods
- Decision trees and Random Forests
- Multiclass and ranking algorithms
- Unsupervised learning
- Dimensionality reduction: PCA, ICA, Kernel PCA, ISOMAP, LLE
- Density estimation
- EM
- Mixtures of Gaussians
- Spectral clustering
- Ensemble methods: bagging, boosting, gradient boosting
Bibliography and recommended reading
- Mohri, M., Rostamizadeh, A., & Talwalkar, A. (2012). Foundations of machine learning. MIT Press.
- Shalev-Shwartz, S., & Ben-David, S. (2014). Understanding machine learning: From theory to algorithms. Cambridge University Press.
- Vapnik, V. (2013). The nature of statistical learning theory. Springer Science & Business Media.
- Bishop, C. M. (2006). Pattern recognition and machine learning. Springer.
- Friedman, J., Hastie, T., & Tibshirani, R. (2001). The elements of statistical learning. Springer.
- James, G., Witten, D., Hastie, T., & Tibshirani, R. (2013). An introduction to statistical learning. Springer.