Exploration in RL
Workshop @ ICML 2019 on Saturday, June 15.
8:30am - 5:30pm in Hall A
Contact: erl-leads@google.com
Exploration is a key component of reinforcement learning (RL). While RL has begun to solve relatively simple tasks, current algorithms cannot complete complex ones: they often dither endlessly, failing to meaningfully explore their environments in search of high-reward states. If we hope to have agents autonomously learn increasingly complex tasks, they must be equipped with mechanisms for efficient exploration.
The goal of this workshop is to present and discuss exploration in RL, spanning deep RL, evolutionary algorithms, real-world applications, and developmental robotics. Invited speakers will share their perspectives on efficient exploration, and researchers will present recent work in spotlight presentations and poster sessions.
We invite all researchers working on problems related to exploration to submit a 4-page paper (not including references or appendix) in PDF format using the following template and style guidelines. The template links to an Overleaf project containing an updated style file whose footnote indicates that the paper was published at our workshop; please use this style file when submitting the camera-ready version. You may download the whole template or just the icml2019.sty file to use the updated footnote.
We allow papers that are currently under review to be submitted to the workshop, as we are not publishing workshop proceedings. This means that, for instance, papers that were submitted to NeurIPS can also be submitted to this workshop.
Please upload your anonymized submission to CMT. The review process will be double-blind. Accepted papers will be presented as posters or as spotlight talks. We highly encourage authors to release open-source implementations of their ideas, and we will provide links to those implementations on our website.
Important Dates:
Frequently Asked Questions
The exploration workshop is designed to broadly appeal to diverse groups within the machine learning community. The sheer volume of work related to exploration in RL published in the last few years (see Recent Papers) demonstrates the large interest in this area. Most immediately, the workshop targets researchers studying algorithms for efficient exploration and the impact of those algorithms on real-world applications. This includes, but is not limited to, research on deep RL, developmental robotics, and evolutionary algorithms. More broadly, we encourage researchers from related fields such as unsupervised learning, causal inference, generative models, and Bayesian modeling to attend and explore connections between these fields and exploration in RL.
ERL 2019 is the second iteration of the Exploration in RL workshop, following the inaugural workshop at ICML 2018. We will upload all videos and slides from ERL 2019 to this website after the workshop! The archive of videos and slides from previous years can be found below:
The following YouTube playlist has all the talks from the workshop:
https://www.youtube.com/playlist?list=PLbSAfmOMweH3YkhlH0d5KaRvFTyhcr30b
Slides for all contributed talks are available here:
https://docs.google.com/presentation/d/1zkqtsM-GywKN9kzX4r0j-C1SUF5I0N0mgsxpfvJyl7s