
Topic: Restarting optimization algorithms using their predicted performance
Supervisor: Ing. Petr Pošík, Ph.D.
Offered as: Master's thesis, Bachelor's thesis, Voluntary project
Description: In the article below [1], the authors optimize the hyperparameter configurations of deep neural networks and discard poorly performing configurations early by extrapolating the networks' learning curves. The goal of this project is to assess whether and how a similar principle can be applied to black-box search algorithms (local search, evolutionary algorithms, etc.) by observing and predicting their convergence curves. The study can compare various models for algorithm performance prediction, e.g. a parametric exponential model, or a non-parametric model based on resampling the algorithm's past improvements; a minimal illustrative sketch is given after this listing.
Literature: [1] Tobias Domhan, Jost Tobias Springenberg, Frank Hutter: Speeding Up Automatic Hyperparameter Optimization of Deep Neural Networks by Extrapolation of Learning Curves. In: Proceedings of the 24th International Joint Conference on Artificial Intelligence (IJCAI 2015). University of Freiburg.
Posted on: 10.05.2019
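
The parametric exponential model mentioned in the description can be illustrated with a short sketch. The Python code below is only a minimal illustration under assumed choices, not part of the assignment: the model form a - b*exp(-c*t), the use of scipy.optimize.curve_fit, and the names predict_final_value, should_restart, horizon, and target are all assumptions made for this example. It fits an exponential curve to the best-so-far values of a minimization run and flags the run for restart if the extrapolated value is not expected to reach a given target within the remaining budget.

import numpy as np
from scipy.optimize import curve_fit

def exp_model(t, a, b, c):
    # Parametric exponential convergence model: predicted best-so-far value at time t.
    return a - b * np.exp(-c * t)

def predict_final_value(best_so_far, horizon):
    """Fit the exponential model to an observed best-so-far curve (minimization)
    and extrapolate the value expected at evaluation budget `horizon`."""
    t = np.arange(1, len(best_so_far) + 1, dtype=float)
    # Data-driven initial guess: asymptote near the last observed value,
    # amplitude equal to the observed decrease, moderate decay rate.
    p0 = (best_so_far[-1], best_so_far[-1] - best_so_far[0], 0.1)
    params, _ = curve_fit(exp_model, t, best_so_far, p0=p0, maxfev=10000)
    return exp_model(horizon, *params)

def should_restart(best_so_far, horizon, target):
    """Restart heuristic: if the extrapolated curve is not predicted to reach
    `target` within the budget, abandon the current run and restart."""
    return predict_final_value(best_so_far, horizon) > target

if __name__ == "__main__":
    # Synthetic run whose convergence curve flattens out well above the target.
    rng = np.random.default_rng(0)
    t = np.arange(1, 51, dtype=float)
    curve = 5.0 + 3.0 * np.exp(-0.2 * t) + 0.01 * rng.standard_normal(t.size)
    curve = np.minimum.accumulate(curve)  # enforce a monotone best-so-far curve
    print(should_restart(curve, horizon=500, target=1.0))  # True: the run stagnates near 5.0

A non-parametric alternative suggested in the description would replace the curve fit by resampling the run's past per-step improvements and summing a bootstrapped sequence of them over the remaining budget.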