By Arib Syed
These pages describe how to use Unity to create ML embedding visualizations in VR/AR.
**Network‑Flow Visualization in AR**
To extend ML embedding visualizations to infrastructure planning, we built a module that treats transit networks as embeddings in 3D space and uses AR to convey flow magnitudes. Key elements:
- **Construct a graph**: Represent each station or stop as a node and each segment between stops as an edge, storing attributes like length, travel time and ridership.
- **Generate embeddings & geometry**: Use geographic coordinates or a multidimensional scaling algorithm to position nodes in 3D space. Instantiate Unity `LineRenderer` components or tube meshes for each edge, with thickness and color mapped to ridership values or network flow.
- **Animate flows**: Add moving particles or animated gradients along edges to indicate direction and relative throughput. Support time‑series data to show rush hour vs off‑peak patterns.
- **Enable user interaction**: Provide UI sliders or dropdowns to filter flows by line, direction or threshold; allow toggling between multiple network scenarios (e.g., baseline vs proposed expansion); and display tooltips when hovering over edges or nodes.
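The graph-construction and embedding steps above are engine-agnostic, so they can be prototyped outside Unity. Below is a minimal Python sketch, assuming a small hypothetical network (station names, travel times, and ridership values are all illustrative, not from the original module): it computes all-pairs shortest travel times and positions stations in 3D with classical multidimensional scaling. The resulting coordinates would then be assigned to node GameObjects in Unity.

```python
import numpy as np

# Hypothetical sample network: stations (nodes) and segments (edges),
# each edge storing travel-time and ridership attributes.
stations = ["A", "B", "C", "D"]
edges = {
    ("A", "B"): {"travel_time": 4.0, "ridership": 1200},
    ("B", "C"): {"travel_time": 6.0, "ridership": 800},
    ("C", "D"): {"travel_time": 3.0, "ridership": 1500},
    ("A", "D"): {"travel_time": 10.0, "ridership": 300},
}

def shortest_path_matrix(nodes, edges):
    """All-pairs shortest travel times via Floyd-Warshall."""
    n = len(nodes)
    idx = {s: i for i, s in enumerate(nodes)}
    d = np.full((n, n), np.inf)
    np.fill_diagonal(d, 0.0)
    for (u, v), attrs in edges.items():
        d[idx[u], idx[v]] = d[idx[v], idx[u]] = attrs["travel_time"]
    for k in range(n):
        d = np.minimum(d, d[:, [k]] + d[[k], :])
    return d

def classical_mds(d, dims=3):
    """Embed a distance matrix into `dims` dimensions (classical MDS)."""
    n = d.shape[0]
    j = np.eye(n) - np.ones((n, n)) / n      # centering matrix
    b = -0.5 * j @ (d ** 2) @ j              # double-centered squared distances
    vals, vecs = np.linalg.eigh(b)
    order = np.argsort(vals)[::-1][:dims]    # keep the largest eigenvalues
    return vecs[:, order] * np.sqrt(np.maximum(vals[order], 0.0))

d = shortest_path_matrix(stations, edges)
positions = classical_mds(d)                 # one 3D point per station
```

Using shortest travel times as the distance matrix means stations that are quick to reach from one another cluster together in the 3D layout, which is the embedding-style view this module is after; geographic coordinates could be substituted directly when a literal map layout is preferred.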
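The ridership-to-geometry mapping can likewise be prototyped as plain math before wiring it to `LineRenderer` properties. The sketch below normalizes ridership to [0, 1] and linearly interpolates a width and a blue-to-red color (the width range and color ramp are illustrative choices, not values from the original module), mirroring what `LineRenderer.startWidth`/`endWidth` and `Color.Lerp` would receive in Unity:

```python
def edge_style(ridership, r_min, r_max, width_range=(0.01, 0.1)):
    """Map a ridership value to a line width and an RGB color tuple.

    Low ridership -> thin, blue; high ridership -> thick, red.
    The width range is in world units and purely illustrative.
    """
    t = 0.0 if r_max == r_min else (ridership - r_min) / (r_max - r_min)
    width = width_range[0] + t * (width_range[1] - width_range[0])
    color = (t, 0.0, 1.0 - t)   # linear blue-to-red ramp, like Color.Lerp
    return width, color
```

For example, `edge_style(1500, 300, 1500)` returns the maximum width and a pure red, flagging the busiest segment at a glance.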
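For the flow animation, one common approach (an assumption here, not necessarily the module's exact implementation) is to move particles along each edge by interpolating between its endpoints with a time parameter that wraps around, so particles loop continuously in the flow direction:

```python
def particle_position(p0, p1, t, speed=0.5, offset=0.0):
    """Position of a flow particle moving from p0 toward p1 at time t.

    `speed` is in edge lengths per second; `offset` staggers multiple
    particles along the same edge. Progress wraps via modulo so the
    particle loops, indicating direction and relative throughput.
    """
    frac = (t * speed + offset) % 1.0
    return tuple(a + frac * (b - a) for a, b in zip(p0, p1))
```

In Unity this would run per frame (e.g., in `Update`, advancing `t` by `Time.deltaTime`), with `speed` scaled by the edge's throughput so busier segments visibly flow faster.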
This module demonstrates how data‑driven AR can reveal complex flow patterns in transportation networks, complementing the word‑embedding visualizations previously explored on this page.
Contributed by Korey