Call for Papers

The 2nd Workshop on Neural Architecture Search (NAS 2021)

Co-located with ICLR 2021, May 7, 2021

Web: https://sites.google.com/view/nas2021

----------------------------------------------------------------

Important Dates:

  • February 19th: Submission system opens

  • February 26th (23:59 UTC-12): Submission deadline

  • March 5th (23:59 UTC-12): Wild Idea submission deadline

  • March 19th: Reviewing deadline

  • March 26th: Author notification

  • April 24th (23:59 UTC-12): Camera-ready copy due

  • May 7th: Workshop day

----------------------------------------------------------------

Neural Architecture Search (NAS) is the logical next step in automating the learning of representations. Just as the field recently moved from manual feature engineering to automatically learned features (with a fixed neural architecture), NAS replaces manual architecture engineering with automated architecture design. NAS can be seen as a subfield of automated machine learning (AutoML) and has significant overlap with hyperparameter optimization and meta-learning. NAS methods have already outperformed manually designed architectures on several tasks, such as image classification, object detection, and semantic segmentation, and have found architectures that achieve a better trade-off between resource consumption on target hardware and predictive performance.
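
To make the search loop concrete, the sketch below shows the generic skeleton that most NAS methods instantiate: sample an architecture from a search space, estimate its quality, and keep (or learn from) the best candidates. It is purely illustrative, using random search over a toy search space; names such as SEARCH_SPACE, sample_architecture, and evaluate_architecture are our own placeholders and not part of any specific method discussed at the workshop.

    import random

    # Toy search space: an architecture is one choice per dimension.
    SEARCH_SPACE = {
        "depth": [2, 4, 8],
        "width": [32, 64, 128],
        "op": ["conv3x3", "conv5x5", "sep_conv3x3"],
    }

    def sample_architecture():
        # Draw one architecture (a dict of choices) uniformly at random.
        return {name: random.choice(opts) for name, opts in SEARCH_SPACE.items()}

    def evaluate_architecture(arch):
        # Placeholder for the expensive step: train `arch` and return its
        # validation score. Faked with a random number purely for illustration.
        return random.random()

    def random_search(budget=20):
        # The generic NAS loop: sample, evaluate, keep the best so far.
        best_arch, best_score = None, float("-inf")
        for _ in range(budget):
            arch = sample_architecture()
            score = evaluate_architecture(arch)
            if score > best_score:
                best_arch, best_score = arch, score
        return best_arch, best_score

    best, score = random_search()
    print(best, round(score, 3))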


Our main target audience is machine learning researchers interested in understanding and improving current NAS methods, but ML researchers planning to apply existing NAS methods to novel domains are also part of the target community. We invite submissions on the following topics:

  • Bayesian optimization

  • Evolutionary algorithms

  • Reinforcement learning

  • Hyperparameter optimization

  • Meta-learning and transfer learning

  • Predictive models of performance

  • Multi-fidelity optimization methods

  • Benchmarking and robustness of NAS

  • Applications of NAS

  • Weight-sharing NAS

  • Gradient-based NAS

  • Weight inheritance based NAS


Since NAS research often involves high computational demands and substantial engineering effort, a particular focus of this workshop is reproducibility. We therefore encourage authors to open-source their code and results and will take this into account in the decision-making process. As part of the submission, authors will be asked to indicate which items of the NAS best practices checklist they satisfy.

Submissions should be up to 4 pages in ICLR format (plus references and an appendix of up to 10 pages). All accepted papers will be presented as posters, and we may invite the best 2-3 papers for oral plenary presentations. Unless the authors indicate otherwise, we will provide PDFs of all accepted papers on the workshop website. There will be no archival proceedings. The submission platform will be announced soon.


WILD IDEAS TRACK

With an exponentially increasing number of publications and an ever-growing community, NAS is arguably one of the hottest subfields of machine learning at the moment. This fast pace of advancement makes it particularly hard for junior scientists, who often lack guidance and support, to enter the field. To spare them the often devastating experience of investing a lot of energy into an idea only to see it repeatedly rejected at major machine learning conferences, we plan to run a second ‘wild ideas’ submission track in parallel to the regular workshop submissions to provide early feedback to junior researchers. For this track, we encourage the submission of a 2-page paper that sketches out a research idea related to NAS that should eventually lead to a full conference submission, along with the experiments planned to validate this idea. Each submission will be reviewed by one experienced researcher from the community, who will provide constructive feedback on the general idea and point out potential pitfalls in the experimental setup. This should help get projects onto the right track early on and increase the chance of a later successful paper submission. The submission platform (TBA) will be the same as for the regular workshop track.


Keynote Speakers

  • Jeff Clune (OpenAI): “Meta-learning synthetic data to accelerate neural architecture search”

Jeff Clune is a research team leader at OpenAI. Before that, he was a Senior Research Manager and founding member of Uber AI Labs, which was formed after Uber acquired a startup he helped lead. Jeff focuses on deep learning, deep reinforcement learning, and robotics. Prior to Uber, he was the Loy and Edith Harris Associate Professor in Computer Science at the University of Wyoming. Before that he was a Research Scientist at Cornell University and received degrees from Michigan State University (PhD, master’s) and the University of Michigan (bachelor’s). More on Jeff’s research can be found at JeffClune.com or on Twitter (@jeffclune).


  • Fabio Carlucci (Huawei): “Generator Based Architecture Search: advantages and applications”

Fabio Maria Carlucci is a researcher at Huawei Noah’s Ark Lab in London. He completed his PhD in Machine Learning and Computer Science at Sapienza University of Rome, has collaborated with the Italian Institute of Technology, and was a member of Prof. Caputo's VANDAL lab. At Huawei, he has mainly been working on AutoML, with a particular focus on Neural Architecture Search.


  • Margret Keuper (University of Mannheim): TBA

Margret Keuper is a junior professor for Computer Vision at the University of Mannheim, Germany. Before joining the University of Mannheim, she worked as a postdoctoral researcher at the University of Freiburg and at the Max Planck Institute for Informatics in Saarbrücken. She did her Ph.D. under the supervision of Thomas Brox at the University of Freiburg. Currently, Margret and her group are interested in neural architecture search and optimization for computer vision, with a focus on efficient searchable graph representation spaces and neural architecture generation.


  • Xuanyi Dong (University of Technology Sydney): “Extending the Search from Architecture to Hyperparameter, Hardware, and System”

Xuanyi Dong is a researcher at the University of Technology Sydney. He has completed seven internships at industry labs, including Google Brain and Facebook Reality Labs. During his Ph.D. studies, he published more than 20 papers at venues such as CVPR, ICCV, and TPAMI, with over 1,800 citations. He was awarded the 2019 Google Ph.D. Fellowship and the 2020 Baidu Scholarship, was one of the winners of TRECVID Video Localization 2016, and ranked second in ILSVRC Object Localization 2015.


Location

Like the main ICLR 2021 conference, the 2nd NAS workshop will be fully virtual and will take place on May 7th. Details on the virtual format will be provided soon.