Neural Architecture Search

Frank Hutter · Arber Zela · Aaron Klein · Jan Metzen · Liam Li


(Note: Workshop posters and instructions are on the workshop site. The password for the Zoom link is: seaslug.)

Description: Neural Architecture Search (NAS) can be seen as the logical next step in automating the learning of representations. It follows the recent transition from manual feature engineering to automatically learned features (using a fixed neural architecture) by replacing manual architecture engineering with automated architecture design. NAS can be seen as a subfield of automated machine learning (AutoML) and has significant overlap with hyperparameter optimization and meta-learning. NAS methods have already outperformed manually designed architectures on several tasks, such as image classification, object detection, and semantic segmentation, and have found architectures that achieve a better trade-off between resource consumption on target hardware and predictive performance.

The goal of this workshop is to bring together researchers from industry and academia who focus on NAS. NAS is a topic of substantial commercial interest and, as such, has something of a history of closed-source code and competition. It is therefore particularly important to build a community around this research topic, with collaborating researchers who share insights, code, data, benchmarks, and training pipelines, and who together aim to advance the science behind NAS.