Detail of the student project

Topic: Restarting optimization algorithms using their predicted performance
Department: Department of Cybernetics
Supervisor: Ing. Petr Pošík, Ph.D.
Announce as: Master's thesis, Bachelor's thesis, Semester project
Description: In the article below, the researchers optimize the configurations of deep neural networks (DNNs) and discard poorly performing configurations early by extrapolating the DNNs' learning curves. The goal of this project is to assess whether and how a similar principle can be applied to black-box search algorithms (local search, evolutionary algorithms, etc.) by observing and predicting their convergence curves. The study can compare various models for predicting algorithm performance, e.g., a parametric exponential model, or a non-parametric model based on resampling the algorithm's past improvements.
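As a rough illustration of the non-parametric variant mentioned above, the sketch below resamples the per-iteration improvements observed so far on a best-so-far (minimization) convergence curve and simulates possible continuations of the run. All names here (`predict_final_values`, the example `history`, the target value) are hypothetical and only illustrate the idea; they are not part of the referenced article.

```python
import random

def predict_final_values(history, remaining_steps, n_samples=1000, rng=None):
    """Extrapolate a best-so-far minimization curve by resampling past improvements.

    history: best-so-far objective values recorded per iteration (non-increasing).
    Returns n_samples simulated final best values after remaining_steps more iterations.
    """
    rng = rng or random.Random(0)
    # Per-step improvements observed so far (>= 0 for a best-so-far curve).
    improvements = [history[i] - history[i + 1] for i in range(len(history) - 1)]
    finals = []
    for _ in range(n_samples):
        value = history[-1]
        for _ in range(remaining_steps):
            # Assume future improvements look like randomly reshuffled past ones.
            value -= rng.choice(improvements)
        finals.append(value)
    return finals

# Hypothetical usage: restart the algorithm if the predicted final value
# is unlikely to beat some target within the remaining budget.
history = [10.0, 6.0, 4.5, 4.0, 3.8, 3.7]
samples = predict_final_values(history, remaining_steps=20)
p_reach_target = sum(s <= 1.0 for s in samples) / len(samples)
```

A parametric alternative would instead fit, e.g., an exponential decay model to `history` and extrapolate its asymptote; the restart decision logic would stay the same.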
Bibliography: [1] Tobias Domhan, Jost Tobias Springenberg, Frank Hutter: Speeding up Automatic Hyperparameter Optimization of Deep Neural Networks by Extrapolation of Learning Curves. University of Freiburg.
Responsible person: Petr Pošík