We are working with Samsung to pursue research and development on robust centimeter-accurate GNSS-based positioning in urban environments via sensor fusion with vision and radar. The goal of this research is to develop and test a vehicle positioning strategy that is accurate to better than 30 cm 95% of the time and is available in all drivable environments and under all weather conditions.
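As a concrete reading of that requirement, the short sketch below checks a log of horizontal position errors against the sub-30-cm, 95th-percentile target. The error trace here is synthetic and the variable names are purely illustrative; only the 0.30 m / 95% threshold comes from the stated goal.

```python
import numpy as np

# Synthetic stand-in for a log of horizontal position errors (meters), one per epoch.
horizontal_errors_m = np.abs(np.random.default_rng(0).normal(0.0, 0.12, 10_000))

# The stated requirement: 95th-percentile horizontal error below 0.30 m.
p95 = np.percentile(horizontal_errors_m, 95)
print(f"95th-percentile error: {p95:.3f} m; meets 30 cm spec: {p95 < 0.30}")
```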


Collaborative sensing and traffic coordination require vehicles to know and share their own positions. Applications such as automated intersection management, tight-formation platooning, and unified processing of sensor data---all involving vehicles of different makes that may not share a common map---will be greatly facilitated by globally-referenced positioning with sub-30-cm accuracy.
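To make "globally-referenced" concrete, the snippet below sketches one plausible shape for a shared position report: an Earth-centered, Earth-fixed (ECEF) coordinate with a covariance, which a receiving vehicle can fuse with its own sensing without any common local map. The message layout and field names are assumptions for illustration, not a published V2X format.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class PositionReport:
    """Hypothetical globally-referenced position message shared between vehicles."""
    vehicle_id: str       # sender identifier
    gps_time_s: float     # time of validity, GPS seconds of week
    ecef_m: np.ndarray    # (3,) position in the ECEF frame, meters
    cov_m2: np.ndarray    # (3,3) position covariance, m^2

# Because ecef_m is Earth-fixed, no shared map is needed to interpret it.
report = PositionReport(
    vehicle_id="veh-042",
    gps_time_s=345_601.0,
    ecef_m=np.array([-742_000.1, -5_462_000.7, 3_198_000.3]),
    cov_m2=np.diag([0.01, 0.01, 0.04]),  # ~10 cm horizontal, 20 cm vertical (1-sigma)
)
```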


Poor weather also motivates high-accuracy absolute positioning. Current automated vehicles depend crucially on lidar or cameras for fine-grained positioning within their local environment. But these sensing modalities perform poorly in low-visibility conditions such as a snowy whiteout, dense fog, or heavy rain. GNSS receivers operate well in all weather conditions, but only a receiver whose errors remain under 30 cm 95% of the time could keep a vehicle from drifting onto a snow-covered road's soft shoulder when lane markings are hidden.
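The 30 cm figure is roughly what lane-keeping geometry demands. The arithmetic below uses assumed (typical) lane and vehicle widths, not measured values, to show why meter-level standalone GNSS errors would not suffice.

```python
# Illustrative lane-keeping margin; both widths are assumptions.
lane_width_m = 3.6      # assumed highway lane width
vehicle_width_m = 1.8   # assumed vehicle width

# A lane-centered vehicle can shift laterally this far before crossing a lane edge:
margin_m = (lane_width_m - vehicle_width_m) / 2   # 0.9 m

# A 30 cm (95%) positioning error consumes a third of that margin; typical
# meter-level standalone GNSS errors would exceed the margin outright.
print(f"lateral margin: {margin_m:.1f} m; remaining at 30 cm error: {margin_m - 0.30:.1f} m")
```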


Carrier-phase-based GNSS positioning can meet the most demanding accuracy requirements envisioned for automated and connected vehicles, but it has historically been too expensive for widespread adoption, and too fragile outside open areas with a clear view of the overhead satellites. Tight coupling of carrier-phase-based GNSS positioning with automotive- or industrial-grade inertial sensing and vision has the potential to enable inexpensive and robust precise positioning in urban areas. Further coupling with radar can make the system robust to low-visibility conditions.
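As a rough illustration of what tight coupling means here, the sketch below feeds a single double-differenced carrier-phase measurement directly into an EKF update on an inertially propagated position state. The L1 wavelength is the standard GPS value; everything else (state layout, noise levels, geometry) is an illustrative assumption, not the project's actual estimator.

```python
import numpy as np

GPS_L1_WAVELENGTH_M = 0.1903  # standard GPS L1 carrier wavelength, meters

def carrier_phase_update(x, P, phi_dd_cycles, N_dd_cycles, u_dd, sigma_m=0.005):
    """One tightly-coupled EKF update from a double-differenced carrier phase.

    x             : (3,) position state, meters (inertial prediction)
    P             : (3,3) state covariance
    phi_dd_cycles : measured double-differenced carrier phase, cycles
    N_dd_cycles   : resolved integer ambiguity, cycles
    u_dd          : (3,) differenced line-of-sight geometry vector
    sigma_m       : phase noise, meters (mm-level once ambiguities are fixed)
    """
    H = u_dd.reshape(1, 3)                                   # measurement Jacobian
    z = GPS_L1_WAVELENGTH_M * (phi_dd_cycles - N_dd_cycles)  # phase range, meters
    y = z - H @ x                                            # innovation
    S = H @ P @ H.T + sigma_m**2                             # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)                           # Kalman gain
    return x + (K @ y).ravel(), (np.eye(3) - K @ H) @ P

# Illustrative call: a loose inertial prior pulled toward the phase measurement.
x0, P0 = np.array([10.0, -4.0, 2.0]), np.diag([1.0, 1.0, 2.0])
x1, P1 = carrier_phase_update(x0, P0, phi_dd_cycles=52.7, N_dd_cycles=50,
                              u_dd=np.array([0.6, -0.3, 0.74]))
```

Because carrier-phase noise is at the millimeter level once integer ambiguities are resolved, even a single such update can pull a drifting inertial solution back toward centimeter-level accuracy.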
