Deep Internal Learning:

Training with no prior examples

ECCV2020 Workshop, Friday, August 28, 2020

Exploiting the internal statistics of natural signals has long been recognized as a powerful paradigm in computer vision, as well as in other domains. For example, the redundancy within a single natural image has repeatedly been shown to provide highly relevant supervision for many tasks, and to often allow completely unsupervised inference (requiring no training examples besides the test input itself). The field of Internal Learning has recently seen a revived surge of research, with works that train deep neural networks on a single training example. These methods achieve state-of-the-art performance in several tasks, while being significantly more flexible than externally trained networks. The goal of this workshop is to present recent advances in deep internal learning and to foster discussion of this important field, which is attracting growing interest in our community.

Keynotes


Short Orals

We will give an opportunity to showcase work related to “Deep Internal Learning”, either previously published or yet to be published. Accepted submissions will be presented as 5-minute pre-recorded short orals during the main workshop. Note: Submitted work will not appear in any proceedings.

Short orals are now available.

Organizers