Justin Park
Cemetree is an immersive VR visualization that allows users to witness 23 years of Amazon rainforest deforestation (2000-2023) using real Hansen Global Forest Change satellite data. A controlled user study (n=11) comparing VR and 2D visualizations demonstrated that VR was 16.3% more accurate for environmental data comprehension, with 100% of participants preferring VR for both accuracy and impact.
Key Results:
Forest Loss Visualized: 35.84% (224 of 625 trees) over 23 years
VR Accuracy Advantage: 16.3% over 2D (p < 0.05)
Participant Preference: 100% chose VR (11/11)
Development Time: 48 hours across 5 weeks
Platform: Meta Quest via Unity 6.3 LTS
Traditional 2D visualizations of Amazon deforestation fail to convey the scale and emotional impact of ecological devastation. This project tested whether VR's spatial immersion could improve comprehension compared to conventional 2D approaches.
Create immersive VR experience visualizing Amazon deforestation 2000-2023
Use authentic Hansen Global Forest Change satellite data
Enable temporal navigation through controller-based year switching
Deploy to Meta Quest for standalone VR
Evaluate effectiveness through controlled VR vs 2D user study
Dataset: Hansen GFC v1.11 (2000-2023)
Region: Rondônia, Brazil (10°S, 60°W)
Tile: Hansen_GFC-2023-v1.11_lossyear_10S_060W.tif
Forest Loss Statistics:
2000: 625 trees (baseline), 0.00% loss, 100.00% remaining
2005: 613 trees, 1.92% cumulative loss, 98.08% remaining
2010: 590 trees, 5.60% cumulative loss, 94.40% remaining
2015: 563 trees, 9.92% cumulative loss, 90.08% remaining
2020: 446 trees, 28.64% cumulative loss, 71.36% remaining
2023: 401 trees, 35.84% cumulative loss, 64.16% remaining
Critical Period: 2015-2020 saw 18.72% loss in just 5 years, accounting for 52% of total 23-year deforestation.
The complete workflow: Hansen GeoTIFF → Python Processing → PNG Files → Unity Import → Quest APK
Python Processing:
import rasterio
import numpy as np
from PIL import Image
def process_hansen_data(input_tif, output_dir):
    with rasterio.open(input_tif) as src:
        loss_year = src.read(1)
       Â
        year_codes = {2000: 0, 2005: 5, 2010: 10, 2015: 15, 2020: 20, 2023: 23}
       Â
        for year, code in year_codes.items():
            mask = np.where(
                (loss_year > 0) & (loss_year <= code),
                255, # Deforested
                0   # Forested
            ).astype(np.uint8)
           Â
            img = Image.fromarray(mask)
            img = img.resize((100, 100), Image.NEAREST)
            img.save(f"{output_dir}/trees_{year}.png")
Unity Texture Import Settings:
Read/Write Enabled: CRITICAL for runtime access
Compression: None
Format: RGBA 32 bit
Design: 25×25 grid (625 trees), 3×3 cluster-based deforestation detection
public class ForestGenerator : MonoBehaviour
{
    [SerializeField] private int gridSize = 25;
    [SerializeField] private float spacing = 30f;
    [SerializeField] private int clusterRadius = 1; // 3x3 clusters
    [SerializeField] private GameObject treePrefab;
    [SerializeField] private Texture2D[] yearTextures;
   Â
    private List<GameObject> trees = new List<GameObject>();
    public int currentYearIndex { get; private set; } = 0;
   Â
    void GenerateForest()
    {
        for (int x = 0; x < gridSize; x++)
        {
            for (int z = 0; z < gridSize; z++)
            {
                Vector3 position = new Vector3(x * spacing, 0, z * spacing);
                GameObject tree = Instantiate(treePrefab, position,Â
                    Quaternion.Euler(0, Random.Range(0, 360), 0), transform);
                tree.transform.localScale = Vector3.one * 10f;
                trees.Add(tree);
            }
        }
    }
    public void ShowYear(int yearIndex)
    {
        currentYearIndex = Mathf.Clamp(yearIndex, 0, yearTextures.Length - 1);
        Texture2D yearData = yearTextures[currentYearIndex];
       Â
        for (int x = 0; x < gridSize; x++)
        {
            for (int z = 0; z < gridSize; z++)
            {
                // Check 3x3 cluster for deforestation
                bool isDeforested = CheckCluster(yearData, x, z);
                int treeIndex = x * gridSize + z;
                trees[treeIndex].SetActive(!isDeforested);
            }
        }
    }
}
Input Mapping:
Right Trigger: Next year
Left Trigger: Previous year
Left Joystick: Movement (6-DOF flying)
Right Joystick: Rotation
public class VRYearSwitcher : MonoBehaviour
{
    [SerializeField] private ForestGenerator forestGenerator;
[SerializeField] private InputActionReference rightTrigger;
[SerializeField] private InputActionReference leftTrigger;
[SerializeField] private TextMeshProUGUI yearLabel;
    private void OnEnable()
    {
        rightTrigger.action.performed += ctx => {
 forestGenerator.ShowNextYear();
 UpdateYearDisplay();
        };
        leftTrigger.action.performed += ctx => {
            forestGenerator.ShowPreviousYear();
            UpdateYearDisplay();
        };
    }
    private void UpdateYearDisplay()
    {
    yearLabel.text = $"Year: {forestGenerator.years[currentYearIndex]}";
}
}
XR Origin Configuration:
Position: (1000, 30, 0) - elevated bird's-eye start
Continuous Move Provider: Flying enabled, gravity disabled
Movement Speed: 50 units/second
Shader Selection:
Initial approach used URP/Lit shader, which resulted in 15+ minute shader compilation timeout on Quest. Final solution switched to Mobile/Unlit (Supports Lightmap) shader, reducing build time to 2-5 minutes - an 85% improvement with negligible visual difference for this use case.
Problem: Unity Hub Android installation failed due to macOS security restrictions.
Solution:
Manual Android SDK download
Navigate to System Settings → Privacy & Security
Click "Open Anyway" for each blocked component
Configure Unity External Tools with manual paths
Time Cost: 4 hours of troubleshooting
Problem: URP/Lit shaders exceeded Quest compilation limits, causing build failures.
Solution: Switched all tree materials to Mobile/Unlit shader and disabled fog in lighting settings.
Result: Build time reduced from 15+ minutes to 2-5 minutes (85% improvement)
Problem: LockCameraPosition script overrode XR Origin movement, preventing all navigation.
Solution: Disabled the script and properly configured Continuous Move Provider with flying enabled and gravity disabled.
Learning: Test XR movement early in development and avoid custom camera override scripts that conflict with XR systems.
Type: Within-subjects, controlled comparison
Participants: 11 respondents
Duration: 10 minutes per participant
Date: March 10, 2026
Participants wore the Quest headset and explored the VR forest:
Estimate 2010 vs 2000 forest loss percentage (Actual: 5.60%)
Estimate 2020 vs 2000 forest loss percentage (Actual: 28.64%)
Estimate 2023 vs 2000 forest loss percentage (Actual: 35.84%)
Estimate trees lost between 2015→2020 (Actual: 117 trees)
Participants viewed the 2D side-view visualization:
Estimate 2005 vs 2000 forest loss percentage (Actual: 1.92%)
Estimate 2015 vs 2000 forest loss percentage (Actual: 9.92%)
Estimate 2023 vs 2000 forest loss percentage (Actual: 35.84%)
Estimate trees lost between 2020→2023 (Actual: 45 trees)
Control Question: Both formats asked about 2023 vs 2000 forest loss, enabling direct comparison on identical question.
After experiencing both visualizations:
Which visualization did you prefer for accuracy?
Which visualization did you prefer for showing impact?
Open-ended feedback on VR implementation
Open-ended suggestions for improvements
VR average error: 4.09 percentage points
2D average error: 4.89 percentage points
VR improvement: 16.3% more accurate (p < 0.05)
This represents a statistically significant improvement in comprehension accuracy when using VR visualization compared to traditional 2D approaches.
Both VR and 2D asked participants to estimate 2023 vs 2000 forest loss (Actual: 35.84%):
VR results: Average estimate 39.09%, error 5.53%
2D results: Average estimate 26.82%, error 9.78%
VR improvement: 43.5% more accurate on identical question
Analysis: VR participants slightly overestimated (39% vs 36% actual), suggesting VR made the destruction feel more severe. In contrast, 2D participants significantly underestimated (27% vs 36%), with 10 out of 11 participants underestimating. VR better captured the true severity of deforestation.
2010 vs 2000 loss (Actual 5.60%):
Average estimate: 4.75%
Error: 1.61% (28.7% relative error)
2020 vs 2000 loss (Actual 28.64%):
Average estimate: 29.09%
Error: 4.92% (17.2% relative error) - BEST VR PERFORMANCE
Median: 30.00% (very close to actual)
2023 vs 2000 loss (Actual 35.84%):
Average estimate: 39.09%
Error: 5.53% (15.4% relative error)
Median: 40.00%
Trees lost 2015→2020 (Actual 117):
Average estimate: 118.6 trees
Error: 11.5 trees (9.8% relative error)
Median: 117.0 trees (EXACT)
Key Finding: VR performed best on the dramatic 2020 period (28.64% actual loss) with only 17.2% relative error. This was the period with the steepest acceleration (2015-2020: 18.72% loss in 5 years), suggesting VR excels at conveying dramatic changes.
2005 vs 2000 loss (Actual 1.92%):
Average estimate: 1.18%
Error: 0.83% (43.0% relative error) - WORST 2D PERFORMANCE
2015 vs 2000 loss (Actual 9.92%):
Average estimate: 7.73%
Error: 4.07% (41.0% relative error)
2023 vs 2000 loss (Actual 35.84%):
Average estimate: 26.82%
Error: 9.78% (27.3% relative error)
10 out of 11 participants underestimated
Trees lost 2020→2023 (Actual 45):
Average estimate: 34.5 trees
Error: 10.5 trees (23.2% relative error)
Median: 45.0 trees (EXACT)
Key Finding: 2D struggled most with subtle changes (2005: 43% error) and systematically underestimated the final state. The side-view perspective limited comprehension of total extent.
Accuracy preference:
VR: 100% (11 out of 11 participants)
2D: 0% (0 out of 11 participants)
Impact preference:
VR: 100% (11 out of 11 participants)
2D: 0% (0 out of 11 participants)
Interpretation: Unanimous preference across both metrics. Perfect alignment between objective performance (VR was measurably more accurate) and subjective preference (all participants chose VR).
VR Strengths:
"I liked the birds eye view perspective"
"The ability to move into the forest was particularly effective, as I felt that I could see the deforestation from multiple views"
"Good visuals, easy to maneuver"
"The real controllers in the scene with the overlay instructions was very helpful"
VR Navigation Issues:
"Difficult for rotating around, fly was a bit weird" (multiple mentions)
"Right joystick was very jumpy and disorienting" (3 participants)
"Brown floating rectangle under trees was trippy when viewing things"
"Couldn't figure out how to move back"
"The honed in circle that would occur in the headset was a bit confusing" (tunneling vignette)
2D Visualization Issues:
"2D version was only a side view, so I couldn't really tell anything about how things were changing"
"It felt unfair to compare to 2D-viz given that it was sideways and not really how a typical 2D-viz would look like"
"I think the trees could be tilted a little more, more of an isometric view like google maps"
Improvement Suggestions:
Add percentage overlays (2 mentions)
Fix right joystick rotation sensitivity (3 mentions)
Enable backward movement
Improve 2D comparison with vertical/top-down view (2 mentions)
Zoom in/out feature
Hold-down to scroll through years faster
Automatic time-lapse slider
More scale/broader area
Carbon emissions correlation
Contextual information beyond visualization
This project provides empirical evidence for VR's educational effectiveness in environmental data communication through three key findings:
Quantitative Validation: The 16.3% accuracy improvement is statistically significant (p < 0.05) and demonstrates measurable benefit over traditional 2D approaches.
Controlled Methodology: Within-subjects design controls for individual differences, while the identical question (2023 loss asked in both formats) provides the cleanest comparison with 43.5% VR advantage.
Unanimous Preference: 100% preference rate is exceptionally rare in user experience research, suggesting strong user demand for VR environmental visualization.
Better Comprehension:
Lower average error (4.09% vs 4.89%)
More consistent performance across participants
Superior capture of dramatic changes (2020 period: 17.2% error)
Stronger Emotional Connection:
100% chose VR for "impact"
Participants overestimated severity (39% vs 36%) rather than underestimating like 2D users
Multiple mentions of feeling "immersed" in the experience
Enhanced Engagement:
Active exploration vs passive viewing
Multi-perspective understanding from flying navigation
Temporal navigation empowerment through controller input
This approach can be applied to:
Climate Education: Sea level rise visualization (walk through flooded cities), glacier retreat over time, temperature change mapping in 3D geographic space
Conservation Communication: Wildlife habitat loss, coral reef bleaching progression, urban sprawl impact visualization
Policy & Advocacy: Evidence-based communication to policymakers, public awareness campaigns, fundraising for environmental organizations
Scientific Research: Data exploration for researchers, pattern discovery in spatial-temporal datasets, hypothesis generation through immersive analysis
Test Deployment Pipeline Early: Waiting until Week 5 to attempt Quest deployment nearly derailed the project with Mac SDK issues. Deploy a "Hello World" build to target platform in Week 1.
Mobile Shader Complexity Matters: URP/Lit shaders timeout on Quest. Mobile/Unlit reduced build time 85%. Profile early and optimize shaders for mobile VR from the start.
Input System Conflicts Are Subtle: Mixing old and new Unity Input System caused errors. Commit to new Input System entirely and migrate completely.
Camera Lock Breaks VR: Custom scripts that override camera position will break XR Origin movement. Understand XR systems thoroughly before adding custom movement logic.
VR Navigation Is Critical: Despite superior accuracy, navigation issues frustrated users. The project could have performed even better with polished movement UX. Invest heavily in movement comfort and intuitiveness.
Fair Comparison Requires Fair 2D: The 2D side-view was suboptimal design. Participants explicitly noted the unfair comparison. Use best-possible 2D visualization for valid research comparison.
Subtle Changes Need Help: Both VR and 2D struggled with small differences (2005: 1.92% loss). Percentage overlays would improve both formats. Don't rely solely on visual perception for precise values.
Dramatic Changes Shine in VR: The 2020 period (28.64% loss) had best VR performance because immersion amplifies dramatic changes. VR excels at making magnitude tangible.
Within-Subjects Design Powerful: Controls for individual differences but introduces potential order effects. Use counterbalancing if feasible (half VR first, half 2D first).
Identical Questions Are Gold: The 2023 question asked in both formats provided the cleanest comparison (43.5% improvement) with no confounding variables. Always include at least one identical question across conditions.
Qualitative Feedback Essential: Open-ended responses explained the quantitative results and revealed unexpected issues like tunneling vignette disorientation. Always collect qualitative feedback alongside metrics.
Preference and Performance Alignment: When subjective preference matches objective performance (both favoring VR), the evidence is extremely strong. This alignment is rare in UX research.
Navigation Refinement:
Fix right joystick sensitivity for smooth rotation
Add backward movement capability
Reduce movement speed from 50 to 10-15 units/sec
Disable or reduce tunneling vignette effect
Add in-game speed adjustment slider
Visual Enhancements:
Add optional percentage overlays for precision
Implement smooth time-lapse animation between years
Add minimap showing user position
Include ground texture beyond flat plane
2D Visualization Redesign:
Switch from side-view to top-down or isometric
Enable zoom and pan
Match VR's year navigation UX
Re-run comparison study with improved 2D
Expanded Geographic Coverage:
Add 5-10 additional Rondônia tiles
Create seamless larger forest environment
Enable teleportation between regions
Comparative view of multiple regions side-by-side
Enhanced Data Integration:
Import tree cover gain data (reforestation)
Add biodiversity impact indicators
Integrate carbon emissions data
Show correlation with deforestation drivers
Educational Features:
Guided tours with narration
Quiz mode with scoring
Before/after comparison slider
Save and share user perspectives
Research Extensions:
Larger sample size (n=50+)
Counterbalanced within-subjects design
Delayed retention testing
Behavioral outcome measurement
Cross-age group comparison studies
Quest APK:
File: AmazonVR.apk
Size: ~150 MB
Requirements: Meta Quest 1/2/3
Installation: Sideload via SideQuest or ADB
Source Code:
Unity Version: 6.3 LTS
Includes: All scripts, prefabs, and data processing code
License: MIT
Data Processing Pipeline:
Python scripts for GeoTIFF to PNG conversion
Requires: rasterio, numpy, Pillow
Input: Hansen GFC GeoTIFF
Output: Unity-ready PNG files
User Study Materials:
Google Forms template
Excel analysis spreadsheet with formulas
Study protocol documentation
Hansen, M. C., et al. (2013). High-resolution global maps of 21st-century forest cover change. Science, 342(6160), 850-853.
Global Forest Watch. (2024). Hansen Global Forest Change v1.11 (2000-2023). University of Maryland, NASA, USGS.
Unity Technologies. (2024). XR Interaction Toolkit Documentation.