The school is intended for computer science researchers and PhD students (and potentially second-year MSc students) who have not necessarily been exposed to Machine Learning material in the past. It can also appeal to students in related areas (such as mathematics, physics, or electrical, mechanical, or biomedical engineering) who are interested in Machine Learning methodologies and their applications.

Below is a brief listing of the notions the audience is expected to be familiar with.

Probability and statistics

Random variables, distributions, quantiles, expectation, mean, variance, Gaussian distribution, conditional probability, Bayes' theorem, joint distributions, covariance, correlation, and independence.
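As a small illustration of the kind of familiarity expected, here is Bayes' theorem applied to a hypothetical diagnostic test; all the probabilities below are made-up numbers chosen for the example.

```python
# Bayes' theorem: P(D | +) = P(+ | D) P(D) / P(+), with hypothetical values.
p_disease = 0.01          # prior P(D)
p_pos_given_d = 0.95      # sensitivity P(+ | D)
p_pos_given_not_d = 0.05  # false-positive rate P(+ | not D)

# Total probability of a positive result.
p_pos = p_pos_given_d * p_disease + p_pos_given_not_d * (1 - p_disease)

# Posterior probability of disease given a positive test.
p_d_given_pos = p_pos_given_d * p_disease / p_pos
print(round(p_d_given_pos, 3))  # 0.161
```

Note how the posterior stays small despite the high sensitivity, because the prior is low; being comfortable with this style of reasoning is what the list above asks for.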

Linear algebra

Systems of linear equations, matrix operations, matrix-vector operations (including inverses), linear dependence and independence, eigenvalues and eigenvectors, linear transformations, positive definite matrices, symmetric matrices, linear models and least-squares problems, orthogonal bases and orthogonal projections, subspaces, bases, and dimensions.
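To make "linear models and least-squares problems" concrete, here is a minimal sketch of fitting a line to assumed toy data using the closed-form solution for one feature (slope = cov(x, y) / var(x)); the data are invented for illustration.

```python
# Least-squares fit of y = a*x + b for a single feature, closed form.
xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]   # exactly y = 2x + 1

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n

# Slope: covariance of x and y divided by variance of x.
a = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
    sum((x - mean_x) ** 2 for x in xs)
b = mean_y - a * mean_x
print(a, b)  # 2.0 1.0
```

The same problem in matrix form is the normal equations, which connects directly to the topics above (inverses, orthogonal projections).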

Calculus

Functions, limits, continuity, derivatives, differentiation rules, applications to approximation and extremum problems, maxima and minima of a function, convex and concave functions, multivariate functions, partial derivatives, gradients.


Dynamic programming, gradient descent

Basic familiarity with dynamic programming (e.g., at the level of the tutorial "Dynamic Programming: From Novice to Advanced") and with gradient-descent optimization.
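Two minimal sketches of the notions named above, using invented toy problems: gradient descent on a simple convex function, and dynamic programming via tabulated Fibonacci numbers.

```python
# Gradient descent on f(x) = (x - 3)**2, whose minimum is at x = 3.
x = 0.0
lr = 0.1
for _ in range(100):
    grad = 2 * (x - 3)   # f'(x)
    x -= lr * grad
print(round(x, 4))  # 3.0 (converges to the minimizer)

# Dynamic programming: build up Fibonacci numbers bottom-up,
# storing subproblem results instead of recomputing them.
def fib(n):
    table = [0, 1]
    for i in range(2, n + 1):
        table.append(table[i - 1] + table[i - 2])
    return table[n]

print(fib(10))  # 55
```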


Algorithms and programming

Basic notions of algorithms, data structures and programming, time and space complexity of an algorithm, polynomial vs. non-polynomial complexity.
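To make the complexity notions concrete, here is a small sketch (on invented data) contrasting O(n) linear search with O(log n) binary search on a sorted list, counting comparisons performed.

```python
# Linear search: worst case examines every element, O(n).
def linear_search(items, target):
    steps = 0
    for i, v in enumerate(items):
        steps += 1
        if v == target:
            return i, steps
    return -1, steps

# Binary search on sorted input: halves the range each step, O(log n).
def binary_search(items, target):
    lo, hi, steps = 0, len(items) - 1, 0
    while lo <= hi:
        steps += 1
        mid = (lo + hi) // 2
        if items[mid] == target:
            return mid, steps
        if items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1, steps

data = list(range(1024))
print(linear_search(data, 1000)[1])  # 1001 comparisons
print(binary_search(data, 1000)[1])  # at most ~10 comparisons
```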