MultiEarth 2023

CVPR 2023 Workshop on Multimodal Learning for Earth and Environment

Mission of the workshop

The Multimodal Learning for Earth and Environment Workshop (MultiEarth 2023) is the second annual CVPR workshop aimed at leveraging the significant amount of remote sensing data that is continuously being collected to aid in the monitoring and analysis of the health of Earth ecosystems. The goal of the workshop is to bring together the Earth and environmental science communities as well as the multimodal representation learning communities to examine new ways to leverage technological advances in support of environmental monitoring. In addition, through a series of public challenges, the MultiEarth Workshop hopes to provide a common benchmark for remote sensing multimodal information processing. These challenges are focused on the monitoring of the Amazon rainforest and include deforestation estimation, fire detection, cross-modal image translation, and environmental change projection.

Invited Speakers

Assistant Professor, MIT

13:15-13:45 (30 min)

Assistant Professor, EPFL

13:45-14:15 (30 min)

Dates and Deadlines

June 19 - MultiEarth 2023 Workshop @ CVPR

Paper Track:

May 29 - Paper submission deadline

June 06 - Author notification

June 13 - Camera-ready deadline

Challenge Track:

April 18 - Challenge training data released

May 31 (extended from May 29) - Evaluation server opens for the test set, with leaderboard available

June 6 (extended from June 3) - Evaluation server closes

June 13 11:59PM PST (extended from June 06 11:59PM PST) - Model and paper submission deadline

Call For Papers

We are soliciting papers that use machine learning to address problems in Earth and environmental science and monitoring, including but not limited to the following topics:

Paper Submission Guidelines

MultiEarth Challenge

A key component of this challenge is to monitor the Amazon rainforest under all weather and lighting conditions using our multimodal remote sensing dataset, which includes time series of multispectral and synthetic aperture radar (SAR) images. This year's challenge includes two focus areas: 1) rapid detection and 2) projection of environmental change in the Amazon. To support interpretation and analysis of the rainforest at any time and under any weather conditions, we will conduct the following challenges:


Participants can download the training dataset with azcopy (setup instructions here) as follows:

azcopy cp "" /local/directory --recursive

For convenience, the data is provided in two formats: NetCDF files, and zip files holding TIFF images. The desired version can be downloaded with azcopy using appropriate wildcards (e.g., *.zip or *.nc).
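As one way to restrict the download to a single format, azcopy's `--include-pattern` flag can be combined with the command above; a minimal sketch (the container URL is elided here, exactly as in the command above):

```shell
# Download only the NetCDF files (supply the container URL inside the quotes)
azcopy cp "" /local/directory --recursive --include-pattern "*.nc"
```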

Participants can also access the data directly at the following URLs:





Starter Code

This code repository holds tools for working with the large quantity of remote sensing data provided for these challenges. It includes datasets related to each challenge to aid in loading and filtering data from the provided NetCDF files. In addition, there are some simple utilities to aid in retrieving spatially and temporally aligned TIFF images.
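To illustrate the kind of alignment the starter utilities perform, the sketch below groups TIFF images by their geographic position using only the standard library. The filename convention shown (`{source}_{band}_{lon}_{lat}_{year}_{month}_{day}.tiff`) and the helper name `group_by_location` are assumptions for illustration and should be checked against the released dataset:

```python
import re
from collections import defaultdict

# Hypothetical MultiEarth-style filename convention (verify against the dataset):
#   {source}_{band}_{lon}_{lat}_{year}_{month}_{day}.tiff
FILENAME_RE = re.compile(
    r"(?P<source>[A-Za-z0-9]+)_(?P<band>[A-Za-z0-9]+)_"
    r"(?P<lon>-?\d+\.\d+)_(?P<lat>-?\d+\.\d+)_"
    r"(?P<year>\d{4})_(?P<month>\d{2})_(?P<day>\d{2})\.tiff?$"
)

def group_by_location(filenames):
    """Group image filenames by (lon, lat) so co-located modalities align."""
    groups = defaultdict(list)
    for name in filenames:
        m = FILENAME_RE.match(name)
        if m:  # silently skip files that do not follow the assumed convention
            key = (float(m["lon"]), float(m["lat"]))
            groups[key].append(name)
    return dict(groups)

# Example: one SAR and one multispectral image share a location; a third differs.
files = [
    "Sentinel1_VV_-55.20_-4.38_2021_08_10.tiff",
    "Sentinel2_B4_-55.20_-4.38_2021_08_12.tiff",
    "Sentinel1_VV_-54.80_-4.10_2021_08_10.tiff",
]
print(group_by_location(files))
```

Grouping by coordinate key first makes it straightforward to then sort each group by date for temporal alignment.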

Submission Guidelines

Deforestation Estimation

Fire Detection

SAR-to-EO Image Translation

Environmental Change Prediction

Organizing Committee



@misc{cha2022multiearth,
  doi = {10.48550/ARXIV.2204.07649},
  url = {},
  author = {Cha, Miriam and Huang, Kuan Wei and Schmidt, Morgan and Angelides, Gregory and Hamilton, Mark and Goldberg, Sam and Cabrera, Armando and Isola, Phillip and Perron, Taylor and Freeman, Bill and Lin, Yen-Chen and Swenson, Brandon and Piou, Jean},
  keywords = {Computer Vision and Pattern Recognition (cs.CV), FOS: Computer and information sciences},
  title = {MultiEarth 2022 -- Multimodal Learning for Earth and Environment Workshop and Challenge},
  publisher = {arXiv},
  year = {2022},
  copyright = {Creative Commons Attribution 4.0 International}
}