Hardware-oriented Neural Architecture Search
Luca Robbiano (Ph.D. Student)
Manual neural architecture design is a time-consuming process that requires massive effort from experts. In recent years, this has led to the development of Neural Architecture Search (NAS) algorithms, which aim to replace the traditional design approach with more efficient automated methods. This is all the more relevant as deep neural networks become increasingly pervasive in industry, where the ability to deploy them without researchers' intervention can be crucial for large-scale adoption. However, most approaches proposed so far focus mainly on predictive performance, while deep learning models may need to run on a wide range of devices, from microcontrollers to multi-node GPU clusters. Different platforms have different requirements, which lead to different optimal architectures, and it is unlikely that a hardware-agnostic search algorithm could work well in all cases. This project aims to address this issue by developing scalable Neural Architecture Search methods that satisfy hardware-related requirements and ease the development and deployment of deep learning models for real-world applications, from tiny to large-scale computing.
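To make the hardware-dependence concrete: one common way a search can account for the target device is to fold a device-specific latency budget into the candidate-scoring objective, so the same search yields different winners on different hardware. The sketch below is purely illustrative (the class, function, and numbers are hypothetical, not the project's actual method):

```python
# Illustrative sketch only: score NAS candidates by accuracy minus a
# penalty for exceeding a device-specific latency budget. All names
# and values here are hypothetical, not the project's actual method.
from dataclasses import dataclass


@dataclass
class Candidate:
    name: str
    accuracy: float    # validation accuracy in [0, 1]
    latency_ms: float  # measured inference latency on the target device


def hardware_aware_score(c: Candidate, budget_ms: float,
                         penalty: float = 0.1) -> float:
    """Penalize architectures whose latency exceeds the device budget."""
    overshoot = max(0.0, c.latency_ms - budget_ms)
    return c.accuracy - penalty * (overshoot / budget_ms)


candidates = [
    Candidate("large", accuracy=0.82, latency_ms=120.0),
    Candidate("small", accuracy=0.78, latency_ms=20.0),
]

# Under a tight, microcontroller-like budget the smaller network wins;
# under a generous, GPU-like budget the larger, more accurate one does.
best_mcu = max(candidates, key=lambda c: hardware_aware_score(c, budget_ms=30.0))
best_gpu = max(candidates, key=lambda c: hardware_aware_score(c, budget_ms=200.0))
```

The point of the toy example is that nothing about the candidates changes between the two searches; only the hardware budget does, and that alone flips which architecture is optimal.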
|Primary Host:||Barbara Caputo (Politecnico di Torino & Italian Institute of Technology)|
|Exchange Host:||Fabio Maria Carlucci (Facebook)|
|PhD Duration:||01 November 2021 - 31 October 2024|
|Exchange Duration:||01 February 2024 - 31 August 2024 (Ongoing)|