Transformers for Environmental Science
22nd - 23rd September 2022
Magdeburg
Bringing the Earth Science and Transformer communities together
Attention-based neural networks are having a profound impact on areas such as natural language processing and computer vision. Their application to the Earth sciences is still in its infancy but has similar potential, for example because of the complex nonlocal spatio-temporal interactions in this domain. In this workshop, we want to bring researchers from machine learning (ML) and Earth science together to discuss how transformers and related neural networks can lead to progress in applications such as weather and climate projections, vegetation analysis, remote sensing, downscaling (super resolution), and air pollution prediction. We also want to explore the challenges these applications pose to attention-based ML models: for example, what are appropriate embeddings of the data, which attention mechanisms and sparsity patterns are suitable, and how can many interacting physical fields be represented in such architectures?
The workshop will take place in an informal setting on 22nd - 23rd September at Lukasklause conference center in Magdeburg, Germany.
TOPICS
Time-series Transformers
Vision Transformers
Spatio-temporal Transformers
Distributed and parallel setup for large-size Transformers
Weather forecasting/nowcasting
Transformer learning strategies
Landcover classification
Downscaling/Super resolution
Extreme value and anomaly detection
Physical interpretation of Transformers
IMPORTANT DATES
Notification: 15th Aug 2022
Registration: 1st Sept 2022
Workshop: 22nd - 23rd Sept 2022
Please register for the event using this link: https://indico3-jsc.fz-juelich.de/event/33/registrations/20/ and book your hotel room (see accommodation page).
The Speakers
Jonathan Godwin (DeepMind)
Poster Session
Sensitivity Analysis and Machine Learning of a Sea Ice Melt Pond Parametrisation
Speaker: Simon Driscoll

Spatio-Temporal Modeling on Multispectral Satellite Data using Self-Supervised Learning
Speaker: Akansha Singh Bansal

Transformers for Agriculture and Forest Monitoring
Speaker: Michael Greza

Crop Classification using Quantum Transformers for Time Series Remote Sensing Data from Sentinel-2 satellites
Speaker: Sreejit Dutta

Can transformer methods be used to create fast emulators for forward and partial derivative computations in ocean modeling?
Speaker: Suyash Bire

LatticeFormer: Interpretable Fluid Simulations with Attention
Speaker: Robert Brockhoff

Temporal Fusion Transformers – time-series forecasting in an energy context
Speaker: Jonas Koch

Using tweets to improve weather predictions
Speaker: Kristian Ehlert

On the potential of transformer networks for air quality forecasting
Speaker: Martin Schultz

Protein / RNA Folding by Learning
Speaker: Fabrice Lehr