Stochastic Convex Optimization for Over-Parametrized Models
Anant Raj (Ph.D. Student)
Over-parametrized models occur frequently in machine learning and come with desirable properties. In this work, we investigate methods to optimize such models. Our goal is to establish faster convergence rates for traditional first-order methods on such problems, without additional assumptions and at comparable computational cost. We also validate our theory experimentally.
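As a minimal illustration of the setting (not the thesis' actual method), the sketch below runs plain SGD with a constant step size on a noiseless over-parametrized least-squares problem, where an interpolating solution exists and the training loss is driven to (near) zero; all problem sizes and the step size are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 20, 100                    # over-parametrized: more parameters (d) than samples (n)
X = rng.standard_normal((n, d)) / np.sqrt(d)
w_star = rng.standard_normal(d)
y = X @ w_star                    # noiseless labels, so an interpolating solution exists

def loss(w):
    """Mean squared error over the training set."""
    return 0.5 * np.mean((X @ w - y) ** 2)

w = np.zeros(d)
step = 1.0                        # constant step size, viable under interpolation
for t in range(2000):
    i = rng.integers(n)                       # sample one data point uniformly
    grad = (X[i] @ w - y[i]) * X[i]           # stochastic gradient of 0.5*(x_i^T w - y_i)^2
    w -= step * grad

print(loss(w))                    # training loss is driven close to zero
```

In the interpolation regime the per-sample gradient vanishes at the optimum, which is why a constant (non-decaying) step size suffices here; with label noise the same scheme would only converge to a neighborhood of the optimum.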
Primary Host: Bernhard Schölkopf (ELLIS Institute Tübingen & Max Planck Institute for Intelligent Systems)
Exchange Host: Francis Bach (INRIA & École Normale Supérieure)
PhD Duration: 01 August 2015 - 30 November 2020
Exchange Duration: 04 October 2019 - 31 March 2020 (Ongoing)