Florian Hübler

PhD
ETH Zurich
Opportunities and Limitations of Parameter-Free Optimization

Tuning hyperparameters, such as the step size, is a major challenge in training machine learning models. To address this challenge, numerous adaptive optimization algorithms have been developed that are guaranteed to converge even when step sizes are chosen independently of problem-specific parameters such as the smoothness constant. However, the adaptivity of these algorithms results in sample complexities that are strictly worse than those of algorithms with access to problem-specific parameters. It is currently unknown whether this gap stems from suboptimal analyses and algorithms, or from an inherent difficulty of optimizing without access to problem-dependent parameters. This project aims to resolve this question, either by establishing lower bounds that reveal fundamental limitations of parameter-free algorithms relative to those with access to problem-dependent parameters, or by refining current algorithms and analyses to narrow this performance gap.
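To illustrate the contrast the project studies, here is a minimal sketch on a 1-D quadratic: gradient descent with the tuned step size 1/L (which requires knowing the smoothness constant L) versus an AdaGrad-norm-style step size that needs no knowledge of L but converges more slowly. All constants and function choices here are illustrative, not taken from the project itself.

```python
import math

L = 10.0  # smoothness constant of f(x) = (L/2) * x^2 (illustrative choice)

def grad(x):
    # Gradient of the L-smooth quadratic f(x) = (L/2) * x^2
    return L * x

# Gradient descent with the tuned step size 1/L (requires access to L).
x = 1.0
for _ in range(100):
    x -= (1.0 / L) * grad(x)
gd_final = abs(x)

# AdaGrad-norm-style update: step size eta / sqrt(sum of squared gradients).
# It needs no knowledge of L, but progress per step is typically slower.
x, eta, acc = 1.0, 0.1, 0.0
for _ in range(100):
    g = grad(x)
    acc += g * g
    x -= (eta / math.sqrt(acc)) * g
ada_final = abs(x)
```

On this quadratic, the tuned method reaches the minimizer essentially immediately, while the parameter-free method only approaches it gradually; quantifying (or closing) this kind of gap in general settings is precisely the project's goal.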

Track:
Academic Track