Prerequisites

The school is intended for computer science researchers and PhD students (and potentially 2nd-year MSc students) who have not necessarily been exposed to Machine Learning material in the past. It can also appeal to students in related areas (such as mathematics, physics, electrical, mechanical or biomedical engineering, etc.) who are interested in Machine Learning methodologies and their applications.

Below is a brief list of notions the audience is expected to be familiar with.

Probability and statistics

Random variables, distributions, quantiles, expectation, mean, variance, Gaussian distribution, conditional probability, Bayes' theorem, joint distributions, covariance, correlation, and independence.
dartmouth.edu/~chance/teaching_aids/books_articles/probability_book/amsbook.mac.pdf
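As a quick self-check for these notions, here is a minimal Python sketch applying Bayes' theorem to a diagnostic-test scenario; the probabilities are invented purely for illustration.

    # Bayes' theorem: P(disease | positive) = P(positive | disease) * P(disease) / P(positive).
    # All numbers below are made up for illustration only.
    p_disease = 0.01            # prior P(D)
    p_pos_given_disease = 0.95  # sensitivity P(+ | D)
    p_pos_given_healthy = 0.05  # false-positive rate P(+ | not D)

    # Total probability of a positive test.
    p_pos = p_pos_given_disease * p_disease + p_pos_given_healthy * (1 - p_disease)

    # Posterior probability of disease given a positive test.
    p_disease_given_pos = p_pos_given_disease * p_disease / p_pos
    print(f"P(disease | positive) = {p_disease_given_pos:.3f}")  # about 0.161

If you can predict why the posterior is so much smaller than the sensitivity, you are comfortable with conditional probability at the level assumed here.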

Linear algebra

Systems of linear equations, matrix and matrix-vector operations (including inverses), linear dependence and independence, eigenvalues and eigenvectors, linear transformations, positive definite matrices, symmetric matrices, linear models and least-squares problems, orthogonal bases and orthogonal projections, subspaces, bases, and dimensions.
math.ku.edu/~lerner/LAnotes/LAnotes.pdf
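To make the least-squares item concrete, here is a minimal sketch (assuming NumPy is available; the data are synthetic) that fits a line by orthogonally projecting the observations onto the column space of the design matrix.

    import numpy as np

    # Synthetic data for illustration: y is roughly 2*x + 1 plus noise.
    rng = np.random.default_rng(0)
    x = np.linspace(0, 1, 20)
    y = 2 * x + 1 + 0.1 * rng.standard_normal(20)

    # Design matrix with a column of ones for the intercept.
    A = np.column_stack([x, np.ones_like(x)])

    # Least-squares solution of A @ w ≈ y.
    w, residuals, rank, singular_values = np.linalg.lstsq(A, y, rcond=None)
    print("slope, intercept:", w)   # close to [2, 1]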

Calculus

Functions, limits, continuity, derivatives, differentiation rules, applications to approximation and extremum problems, maxima and minima of a function, convex and concave functions, multivariate functions, partial derivatives, gradients.
math.odu.edu/~jhh/Volume-1.PDF
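A small sketch of the partial-derivative and gradient items: the function and step size below are chosen only for illustration, and the analytic gradient is checked with central finite differences.

    # f(x, y) = x**2 + 3*x*y, so grad f = (2*x + 3*y, 3*x).
    def f(x, y):
        return x**2 + 3 * x * y

    def numerical_gradient(x, y, h=1e-6):
        df_dx = (f(x + h, y) - f(x - h, y)) / (2 * h)   # central difference in x
        df_dy = (f(x, y + h) - f(x, y - h)) / (2 * h)   # central difference in y
        return df_dx, df_dy

    print(numerical_gradient(1.0, 2.0))   # approximately (8.0, 3.0)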

Optimization

Dynamic programming, gradient descent.
Dynamic Programming: From Novice to Advanced
wikipedia.org/wiki/Gradient_descent
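Both notions fit in a few lines of Python; the sketch below is illustrative only (the function, learning rate and iteration count are arbitrary choices, not a recipe).

    from functools import lru_cache

    # Dynamic programming in its simplest form: memoized recursion.
    @lru_cache(maxsize=None)
    def fib(n):
        return n if n < 2 else fib(n - 1) + fib(n - 2)

    print(fib(40))   # 102334155, computed in linear time thanks to memoization

    # Plain gradient descent on the convex function f(w) = (w - 3)**2.
    def grad(w):
        return 2 * (w - 3)   # derivative of (w - 3)**2

    w, learning_rate = 0.0, 0.1
    for _ in range(100):
        w -= learning_rate * grad(w)   # step opposite to the gradient
    print(w)   # converges towards the minimizer w = 3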

Algorithms

Basic notions of algorithms, data structures and programming, time and space complexities of an algorithm, polynomial vs non-polynomial complexities.
hackerearth.com/practice/basic-programming/complexity-analysis/time-and-space-complexity/tutorial/
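As a rough yardstick for the complexity item, here is a short sketch contrasting an O(n) linear scan with an O(log n) binary search on sorted data; the data set is arbitrary and only serves the comparison.

    from bisect import bisect_left

    def linear_search(sorted_items, target):
        # O(n): may inspect every element.
        for item in sorted_items:
            if item == target:
                return True
        return False

    def binary_search(sorted_items, target):
        # O(log n): halves the search interval at each step.
        i = bisect_left(sorted_items, target)
        return i < len(sorted_items) and sorted_items[i] == target

    data = list(range(0, 1_000_000, 2))   # even numbers, already sorted
    print(linear_search(data, 999_998), binary_search(data, 999_998))   # True True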