Model selection for large-scale learning

Credits

3 ECTS, C. 18h

Instructor

Emilie Devijver

Objectives

When estimating parameters in a statistical model, sharp calibration is important to achieve optimal performance. In this course, we will focus on selecting estimators with respect to the data. In particular, we will consider the calibration of tuning parameters (e.g., the regularization parameter in regularized empirical risk minimization, as in Lasso or Ridge estimators) and model selection (where each estimator minimizes the empirical risk on a specified model, such as mixture models with varying numbers of clusters).
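
As an illustration of parameter calibration (a minimal sketch, not part of the course material; scikit-learn and the synthetic data are assumptions made here for concreteness), the regularization parameter of a Lasso estimator can be chosen from the data by cross-validation:

    import numpy as np
    from sklearn.datasets import make_regression
    from sklearn.linear_model import LassoCV

    # Synthetic placeholder data: 200 observations, 50 features, 5 of them informative
    X, y = make_regression(n_samples=200, n_features=50, n_informative=5,
                           noise=1.0, random_state=0)

    # Calibrate the regularization parameter (alpha) over a grid by 5-fold cross-validation
    model = LassoCV(alphas=np.logspace(-3, 1, 50), cv=5).fit(X, y)
    print("selected regularization parameter:", model.alpha_)

The course studies when and why such data-driven calibration rules work, beyond this cross-validation example.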

We will focus on penalized empirical risk criteria, where the penalty may be deterministic (such as BIC or ICL) or estimated from the data (such as the slope heuristic).
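
As a point of reference (a standard formulation, not taken verbatim from the course materials), a penalized criterion selects

    \hat{m} \in \operatorname*{arg\,min}_{m \in \mathcal{M}}
      \Big\{ \tfrac{1}{n} \sum_{i=1}^{n} \gamma\big(\hat{s}_m, Z_i\big) + \operatorname{pen}(m) \Big\},

where \gamma is the loss (contrast), \hat{s}_m is the empirical risk minimizer on model m, and pen(m) is the penalty. For instance, when \gamma is the negative log-likelihood, a BIC-type penalty is pen(m) = D_m \log(n) / (2n), with D_m the dimension of model m, whereas the slope heuristic estimates the multiplicative constant of the penalty from the data.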

Prerequisites: Basic knowledge of probability and statistics

Target skills: Learn

When model selection is needed.

What can be proved theoretically for existing methods.

How those results can help in practice to choose a criterion for a specific statistical problem.

How the theory can serve to define new selection procedures.