Robust and Reproducible Neural Architecture Search
Arber Zela (Ph.D. Student)
Neural Architecture Search (NAS) is the next logical step towards the automation of deep learning systems, due to its potential to achieve state-of-the-art performance on various tasks and to remove the need to manually design neural network architectures. The first black-box NAS methods required vast amounts of computational power to be competitive, so experiments were feasible only with company-scale resources, contributing to a reproducibility crisis. More recent NAS algorithms became far more efficient through gradient-based methods; however, this efficiency came at the cost of brittleness, with poor performance in many scenarios. The goal of my research is to develop robust NAS methods and benchmarks that enable fair comparisons between the different components of a NAS algorithm without confounding factors.
Primary Host: | Frank Hutter (University of Freiburg) |
Exchange Host: | Yee Whye Teh (University of Oxford & DeepMind) |
PhD Duration: | 01 March 2019 - Ongoing |
Exchange Duration: | 01 July 2021 - 31 August 2021 |