The International Conference on Learning Representations (ICLR) is the premier gathering of professionals dedicated to advancing the branch of artificial intelligence called representation learning, generally referred to as deep learning.
ICLR is globally renowned for presenting and publishing cutting-edge research on all aspects of deep learning used in the fields of artificial intelligence, statistics and data science, as well as important application areas such as machine vision, computational biology, speech recognition, text understanding, gaming, and robotics.
Participants at ICLR span a wide range of backgrounds, from academic and industrial researchers, to entrepreneurs and engineers, to graduate students and postdocs.
Due to growing concerns about COVID-19, ICLR 2020 has cancelled its physical conference this year and will instead be held as a fully virtual conference. We were very excited to hold ICLR in Addis Ababa, and it is disappointing that we will not all be able to come together in person in April. However, this unfortunate turn of events gives us the opportunity to innovate in how to host an effective remote conference. The organizing committees are now working to create a virtual conference that is valuable and engaging for both presenters and attendees. Further information, including details about the virtual venue, is available on the conference website.
Like the main conference (https://medium.com/@iclr_conf/format-for-the-iclr2020-virtual-conference-76716ddea640), the NAS workshop will be a mix of pre-recorded talks, asynchronous engagement, and live engagement through Q&A and video calls. On the workshop day (26th of April):
Several previous workshops have had a healthy overlap in attendance with the NAS community: the AutoML workshop series at ICML 2014-2019 consistently drew about 200-300 participants (up to 1000 in 2019), and the Meta-Learning workshop series at NeurIPS 2017-2019 attracted about 500-600 attendees. This overlap has grown especially over the last few years, as NAS has become a crucial component of representation learning in many AutoML systems. The 2019 editions of the AutoML and Meta-Learning workshops attracted 50 and 84 submissions respectively, some of which leveraged NAS.