Cross-Modal Neural Architecture Search
Niccolò Cavagnero (Ph.D. Student)
Determining an optimal architecture is key to building accurate deep neural networks (DNNs) with good generalisation properties. Neural architecture search (NAS) can reduce the need for application-specific expert designers, allowing for wide adoption of sophisticated networks across industries. It has been shown that architectures produced by such algorithms can indeed outperform human-designed state-of-the-art convolutional networks. Researchers have used a wealth of techniques, ranging from reinforcement learning, where a controller network is trained to sample promising architectures, to evolutionary algorithms that evolve a population of networks towards an optimal DNN design. Still, these approaches are inefficient and can be extremely computationally and/or memory intensive. Moreover, all existing approaches assume that training and test data are generated by the same underlying distribution. This is often not true, as the data seen by an algorithm at deployment time are usually generated by a different distribution. The goal of this PhD thesis is to develop computationally efficient cross-modal neural architecture search approaches. Although the application domain of the thesis will be visual recognition, the results are expected to be general and of interest to the machine learning community at large.
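To make the evolutionary flavour of NAS mentioned above concrete, here is a minimal, hypothetical sketch: a population of architecture encodings is iteratively mutated and selected by fitness. The search space (`OPS`, `DEPTH`), the mutation scheme, and the proxy `fitness` function are all illustrative assumptions; a real NAS run would train or estimate each candidate network instead of using a toy score.

```python
import random

# Hypothetical toy search space: one operation per layer (illustrative only).
OPS = ["conv3x3", "conv5x5", "maxpool", "identity"]
DEPTH = 4


def random_arch():
    """Sample a random architecture encoding."""
    return [random.choice(OPS) for _ in range(DEPTH)]


def mutate(arch):
    """Replace one randomly chosen layer's operation."""
    child = list(arch)
    child[random.randrange(DEPTH)] = random.choice(OPS)
    return child


def fitness(arch):
    # Stand-in for validation accuracy: a real NAS algorithm would train
    # (or cheaply estimate) the network here. This toy proxy rewards
    # convolutional layers, with a little noise.
    return sum(op.startswith("conv") for op in arch) + 0.1 * random.random()


def evolve(generations=30, pop_size=8, seed=0):
    """Evolve a population of architectures; return the fittest survivor."""
    random.seed(seed)
    pop = [random_arch() for _ in range(pop_size)]
    for _ in range(generations):
        ranked = sorted(pop, key=fitness, reverse=True)
        parents = ranked[: pop_size // 2]  # keep the fittest half
        offspring = [mutate(random.choice(parents))
                     for _ in range(pop_size - len(parents))]
        pop = parents + offspring
    return max(pop, key=fitness)


best = evolve()
print(best)
```

The same loop structure underlies realistic evolutionary NAS methods; what changes in practice is the encoding (cells or graphs rather than flat lists) and, crucially, the cost of evaluating `fitness`, which motivates the efficiency focus of this thesis.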
Primary Host: Barbara Caputo (Politecnico di Torino & Italian Institute of Technology)
Exchange Host: Bernhard Nessler (Johannes Kepler University Linz)
PhD Duration: 01 October 2021 - 01 October 2024
Exchange Duration: 01 February 2024 - 01 August 2024 - Ongoing