Abstract:
Earth’s satellite observation record now exceeds 50 years and supports a vast range of science and applications. Multidecadal programs like Landsat have relied on a limited number of spectral bands that do not fully capture information about vegetation cover. In particular, past satellite missions have lacked the ability to accurately distinguish non-photosynthetic vegetation (NPV), which includes plant litter, senesced leaves, and crop residues. NPV is an important indicator of ecosystem disturbance, agricultural resilience, drought severity, and wildfire danger, making accurate mapping of NPV as a component of vegetation cover a high priority for future satellite missions. This talk will describe how machine learning is being used to demonstrate the capabilities of the upcoming Landsat Next and Surface Biology and Geology (SBG) satellite missions. These innovative missions will launch close to the end of the decade and should revolutionize global mapping of vegetation cover.
Bio:
Phil Dennison is a professor and director of the School of Environment, Society, and Sustainability at the University of Utah. His past work includes remote sensing of ecosystem disturbance, including drought and insect attack; remote sensing of fuels, active fire, and vegetation recovery following wildfire; remote sensing and GIS applications for improving firefighter safety; detection and mapping of point source greenhouse gas emissions; and spectral mixture modeling and estimation of fractional cover. He is a member of NASA’s FireSense Implementation Team, which is working to mature remote sensing technologies to address operational fire management needs.
Summary:
Landsat series of satellites: a continuous record of space-based observations, 1972 - present
The spectral bands covered by successive satellites overlap, providing continuity over time
Spectral coverage has stayed relatively stable, expanding little over the decades
Differentiating green vegetation (GV) vs non-veg (NV)
Green vegetation is much more reflective than non-veg above ~700 nm, and much less reflective below ~700 nm
Any pixel can be modeled as a linear mixture of veg / non-veg endmembers (a sketch of the mixture model follows below)
Challenge: how to identify vegetation that is not clearly green (e.g. dead or senesced)
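A minimal sketch of the linear spectral mixture idea in Python. The band set, endmember reflectances, and noise level are made-up illustrations, not values from the talk.

```python
# Sketch: a pixel's reflectance modeled as a fraction-weighted sum of "endmember"
# spectra (here: green vegetation and non-vegetation). Values are illustrative only.
import numpy as np

# Six Landsat-like bands (roughly 480, 560, 660, 860, 1610, 2200 nm).
gv = np.array([0.04, 0.08, 0.05, 0.45, 0.25, 0.12])   # hypothetical green-vegetation reflectance
nv = np.array([0.12, 0.16, 0.20, 0.26, 0.33, 0.30])   # hypothetical non-vegetation reflectance
rng = np.random.default_rng(0)

def mix(frac_gv, noise_sd=0.005):
    """Simulate a mixed pixel: frac_gv of GV, the rest non-vegetation, plus noise."""
    pixel = frac_gv * gv + (1.0 - frac_gv) * nv
    return pixel + rng.normal(0.0, noise_sd, size=pixel.shape)

def unmix(pixel):
    """Recover fractions by least squares against the two endmember spectra."""
    E = np.column_stack([gv, nv])                      # bands x endmembers
    fracs, *_ = np.linalg.lstsq(E, pixel, rcond=None)
    return fracs / fracs.sum()                         # normalize so fractions sum to 1

print(unmix(mix(0.7)))  # approximately [0.7, 0.3]
```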
“Non-photosynthetic vegetation” (NPV)
Examples: stems, branches, dry grass, crop residue, pest-affected vegetation
Varies seasonally
Indicator of wildfire danger
The existing Landsat band set is not well suited to differentiating NPV from non-veg (e.g., bare soil)
Need coverage beyond the near infrared, in the shortwave infrared (SWIR)
Goal: develop ways to identify NPV using hyperspectral imagery and ML
Upcoming satellites will have broader spectral coverage with more bands
Upcoming Satellites (~2030 Launch): Landsat Next and Surface Biology and Geology (SBG)
Data:
Collected ground-based spectra of different plant and soil types, capturing their absorption features
12 NPV, 6 GV, and 228 soil spectra
Created synthetic spectral mixtures and resampled them to show what they would look like to a satellite with a given set of bands (a simulation sketch follows this list)
Research question: what spectral information is needed to model fractional cover of GV, NPV, and soil?
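A sketch of how synthetic mixtures might be simulated at satellite band resolution. The boxcar band model, band centers and widths, stand-in library spectra, and the Dirichlet mixing recipe are all assumptions for illustration, not the actual protocol used in the study.

```python
# Sketch: turn full-resolution reflectance spectra into simulated satellite band
# values and build synthetic GV/NPV/soil mixtures for training.
import numpy as np

rng = np.random.default_rng(42)
wl = np.arange(400, 2501)                       # 1 nm grid, 400-2500 nm

def band_average(spectrum, center, width):
    """Approximate a sensor band as a simple boxcar average of the spectrum."""
    mask = (wl >= center - width / 2) & (wl <= center + width / 2)
    return spectrum[mask].mean()

def to_bands(spectrum, bands):
    return np.array([band_average(spectrum, c, w) for c, w in bands])

# Hypothetical library: one stand-in spectrum per class
# (the real library had 12 NPV, 6 GV, and 228 soil spectra).
library = {
    "gv":   0.05 + 0.4 * np.exp(-((wl - 860) / 300.0) ** 2),
    "npv":  0.15 + 0.0001 * (wl - 400),
    "soil": 0.10 + 0.00012 * (wl - 400),
}
bands = [(480, 60), (560, 60), (660, 30), (860, 40), (1610, 90), (2040, 40)]

def synthetic_mixture():
    """Random fractions (Dirichlet) times the class spectra, then band-averaged."""
    fracs = rng.dirichlet(np.ones(3))
    mixed = sum(f * library[k] for f, k in zip(fracs, library))
    return to_bands(mixed, bands), fracs

X, y = zip(*(synthetic_mixture() for _ in range(1000)))  # band values, true fractions
```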
A band near 2030 nm would be optimal
However, this wavelength overlaps atmospheric absorption (CO2), which adds noise to the measurements
2038 - 2050 nm is an open atmospheric window that still allows NPV to be detected
The Landsat Next mission is expected to carry a band at this position
Landsat Next also provides continuity with Landsat 7, 8, and 9, whose coarser bands overlap the Landsat Next bands
How accurately can we model GV/NPV/soil using ML?
Focus on bands from Landsat Next and SBG VSWIR
Trained many ML models; a Random Forest trained on synthetic hyperspectral mixtures achieved 15% RMSE on fractional cover (10% is the goal for highly usable predictions); a modeling sketch follows this list
Model analysis indicates that predictions for each cover type (GV/NPV/soil) draw on spectral features relevant to the other types
The model is very accurate for GV; NPV and soil are more challenging
Univariate models have accuracy similar to multivariate models
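A hedged sketch of the modeling step: a multi-output Random Forest trained on synthetic band-space mixtures and scored by per-class RMSE on fractional cover. The data generator, endmember values, and model settings below are illustrative assumptions, not the configuration reported in the talk.

```python
# Sketch: Random Forest regression of GV/NPV/soil fractions from band reflectance.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)

# Stand-in endmember band values (6 bands) for GV, NPV, and soil; illustrative only.
endmembers = np.array([
    [0.04, 0.08, 0.05, 0.45, 0.25, 0.12],   # GV
    [0.14, 0.18, 0.22, 0.30, 0.38, 0.33],   # NPV
    [0.10, 0.14, 0.18, 0.24, 0.30, 0.28],   # soil
])

fracs = rng.dirichlet(np.ones(3), size=5000)                # true fractions, sum to 1
X = fracs @ endmembers + rng.normal(0, 0.01, (5000, 6))     # noisy mixed band values
y = fracs

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
rf = RandomForestRegressor(n_estimators=300, random_state=0)  # multi-output by default
rf.fit(X_train, y_train)
pred = rf.predict(X_test)

for i, name in enumerate(["GV", "NPV", "soil"]):
    rmse = np.sqrt(mean_squared_error(y_test[:, i], pred[:, i]))
    print(f"{name} fraction RMSE: {rmse:.3f}")   # talk target: ~0.10 (10%)
```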
Model applied to ground data
Ground data are much more variable, e.g., more samples that are purely soil, GV, or NPV
Much more challenging, but error is still < 10%
Observation: multivariate modeling is much more successful at ensuring that the GV, NPV, and soil fractions sum to 1 (illustrated below)
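A small illustration of the sum-to-one observation: a joint (multivariate) Random Forest averages training targets that already sum to 1, so its predicted fractions stay closed, while three independently trained per-class models need not. Data generation and model settings are again illustrative assumptions, not the study's setup.

```python
# Sketch: compare fraction closure (GV + NPV + soil = 1) for a joint multi-output
# model vs. three independent per-class models trained on the same synthetic data.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)

# Illustrative endmember band values (6 bands) for GV, NPV, and soil.
endmembers = np.array([
    [0.04, 0.08, 0.05, 0.45, 0.25, 0.12],   # GV
    [0.14, 0.18, 0.22, 0.30, 0.38, 0.33],   # NPV
    [0.10, 0.14, 0.18, 0.24, 0.30, 0.28],   # soil
])

def make_data(n):
    fracs = rng.dirichlet(np.ones(3), size=n)                # true fractions, sum to 1
    X = fracs @ endmembers + rng.normal(0, 0.02, (n, 6))     # noisy mixed band values
    return X, fracs

X, y = make_data(3000)
X_new, _ = make_data(500)

joint = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
per_class = [RandomForestRegressor(n_estimators=200, random_state=i).fit(X, y[:, i])
             for i in range(3)]

sum_joint = joint.predict(X_new).sum(axis=1)
sum_indep = np.column_stack([m.predict(X_new) for m in per_class]).sum(axis=1)

print("joint model      mean |GV+NPV+soil - 1|:", np.abs(sum_joint - 1).mean())
print("per-class models mean |GV+NPV+soil - 1|:", np.abs(sum_indep - 1).mean())
```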
Need much better field data
Drones, etc. (~1.5 cm resolution)
Goal: classify crop types
Leveraging hyperspectral aerial observations for mineral exploration