We are developing the core components of an architecture for collaborative perception using distributed mmWave MIMO radar nodes. A single radar measures only range, angle, and radial velocity, which limits both motion-estimation accuracy and coverage. By networking multiple radars, we enable instantaneous estimation of the full velocity vector and robust multi-target tracking through one-shot fusion.
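As a worked illustration of how networked Doppler measurements yield a full velocity vector, the minimal sketch below stacks each node's radial-speed constraint u_i · v = v_{r,i} and solves the resulting linear system by least squares. The function name, geometry, and values are hypothetical, not drawn from our implementation.

```python
import numpy as np

def fuse_velocity(node_positions, target_position, radial_speeds):
    """Least-squares estimate of a target's full velocity vector from
    radial (Doppler) speeds measured by several radar nodes.

    Each node i observes v_r_i = u_i . v, where u_i is the unit vector
    from node i to the target. Stacking these rows gives A v = b.
    """
    A = []
    for p in node_positions:
        u = target_position - p
        A.append(u / np.linalg.norm(u))        # unit line-of-sight vector
    A = np.vstack(A)
    b = np.asarray(radial_speeds)
    v, *_ = np.linalg.lstsq(A, b, rcond=None)  # solves min ||A v - b||^2
    return v

# Two nodes with non-parallel lines of sight recover a 2-D velocity
# (illustrative positions and speeds, in meters and m/s):
nodes = [np.array([0.0, 0.0]), np.array([10.0, 0.0])]
target = np.array([5.0, 5.0])
true_v = np.array([1.0, -0.5])
vr = [(target - p) / np.linalg.norm(target - p) @ true_v for p in nodes]
print(fuse_velocity(nodes, target, vr))        # ~ [ 1.  -0.5]
```

With two nodes in the plane, the system is square and solvable whenever the two lines of sight are not parallel; more nodes overdetermine it and average out measurement noise.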
"Collaborative sensing" fundamentally expands radar capability — enabling wide-area “cellular-style” coverage, larger effective aperture (higher resolution), and resilience to occlusions and line-of-sight limitations.
A central challenge is calibration: nodes must know their relative pose. We proposed an autocalibration framework based on joint tracking and pose estimation, with a closed-form solution that allows any two nodes observing a common target to determine their relative position and orientation. We validated the approach experimentally using TI AWR2243BOOST radar platforms with human targets.
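Under simplifying assumptions, the closed-form pose step can be sketched as an orthogonal-Procrustes (Kabsch) alignment between the two nodes' local-frame tracks of the shared target. This generic 2-D version is a stand-in, not necessarily the exact solution from the paper.

```python
import numpy as np

def relative_pose(track_a, track_b):
    """Closed-form rigid alignment (Kabsch) between two nodes'
    local-frame observations of the same target track.

    track_a, track_b: (N, 2) arrays of the target's positions over time,
    expressed in node A's and node B's frames. Returns (R, t) such that
    points in B's frame map into A's frame: p_a = R @ p_b + t.
    """
    a = np.asarray(track_a, float)
    b = np.asarray(track_b, float)
    ca, cb = a.mean(axis=0), b.mean(axis=0)   # track centroids
    H = (b - cb).T @ (a - ca)                 # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))    # guard against reflection
    R = Vt.T @ np.diag([1.0, d]) @ U.T
    t = ca - R @ cb
    return R, t

# Sanity check with a synthetic track and a known pose (hypothetical values):
rng = np.random.default_rng(0)
pa = rng.normal(size=(20, 2))                 # track in A's frame
c, s = np.cos(0.7), np.sin(0.7)
R_true = np.array([[c, -s], [s, c]])
t_true = np.array([3.0, -1.0])
pb = (pa - t_true) @ R_true                   # same track in B's frame
R, t = relative_pose(pa, pb)
assert np.allclose(R, R_true) and np.allclose(t, t_true)
```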
Building on self-calibration, we developed a Bayesian fusion framework that produces instantaneous position and velocity estimates. Unlike conventional maximum-likelihood fusion, which becomes ill-conditioned when a target's lines of sight to the nodes are nearly parallel, our method remains reliable even in such geometrically degenerate configurations.
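To see why a Bayesian formulation stays well-behaved where maximum likelihood does not, the sketch below adds a zero-mean Gaussian prior over target velocity to the radial-speed system from earlier. The prior acts as ridge regularization, so the normal equations stay invertible even when the lines of sight are nearly parallel. The noise and prior scales are illustrative assumptions, not values from the paper.

```python
import numpy as np

def map_velocity(A, b, sigma_meas=0.1, sigma_prior=2.0):
    """MAP velocity estimate under Gaussian measurement noise and a
    zero-mean Gaussian prior v ~ N(0, sigma_prior^2 I).

    Maximizing the posterior reduces to ridge-regularized least squares:
    (A^T A + (s_m/s_p)^2 I) v = A^T b, which stays invertible even when
    A^T A is singular (degenerate geometry).
    """
    A = np.asarray(A, float)
    b = np.asarray(b, float)
    lam = (sigma_meas / sigma_prior) ** 2
    return np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), A.T @ b)

# Nearly parallel lines of sight: plain least squares (the ML estimate)
# amplifies noise, while the MAP estimate stays bounded.
A = np.array([[1.0, 0.00],
              [1.0, 0.01]])                   # two almost-identical unit vectors
b = np.array([1.0, 1.0]) + np.array([0.0, 0.05])  # 5 cm/s noise on one node
v_ml, *_ = np.linalg.lstsq(A, b, rcond=None)
v_map = map_velocity(A, b)
print(v_ml)     # noise amplified ~100x along the unresolved direction
print(v_map)    # prior keeps the estimate near the resolvable component
```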
Impact: This work moves radar sensing from isolated sensors to cooperative perception networks, a key step toward scalable autonomous systems, smart environments, and next-generation wireless sensing infrastructure.
Publications: Asilomar 2024, RadarConf 2025.