Collaborative Augmented Reality on Smartphones via Life-long City-scale Maps

Lukas Platinsky, Michal Szabados, Filip Hlasek, Ross Hemsley, Luca Del Pero,
Andrej Pancik, Bryan Baum, Hugo Grimmett, Peter Ondruska

Published at 2020 International Symposium on Mixed and Augmented Reality (ISMAR2020)

Abstract

In this paper we present the first published end-to-end production computer-vision system for powering city-scale shared augmented reality experiences on mobile devices. In doing so we propose a new formulation for an experience-based mapping framework as an effective solution to the key issues of city-scale SLAM scalability, robustness, map updates and all-time all-weather performance required by a production system. Furthermore, we propose an effective way of synchronising SLAM systems to deliver seamless real-time localisation of multiple edge devices at the same time, even in the presence of network latency and bandwidth limitations. The resulting system is deployed and tested at scale in San Francisco, where it delivers AR experiences in a mapped area of several hundred kilometers. To foster further development in this area we release the data set to the public, the largest of its kind to date.

How it works

System diagram of the proposed approach. A large-scale 3D map is computed from a vehicle-based image collection and used to localise camera-equipped edge devices for AR experiences over a mobile network.



City-scale 3D map

The city-scale map built to support AR experiences. The subset of the map on which we evaluated AR performance (middle) consists of 4249 submaps and was computed on a cluster of 500 CPUs over a period of 91 hours.

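Because the city-scale map is partitioned into thousands of submaps, a localisation request only needs to consider the submaps near the device's coarse position prior (e.g. GPS). The sketch below illustrates this candidate-selection step; the function and data layout are illustrative assumptions, not the paper's implementation.

```python
import math

def nearby_submaps(submaps, position, radius_m=100.0):
    """Return the IDs of submaps whose centre lies within radius_m of a
    coarse position prior, as candidates for visual localisation.

    `submaps` is a list of (submap_id, (x, y)) pairs in a local metric
    frame; all names here are illustrative.
    """
    cx, cy = position
    return [sid for sid, (x, y) in submaps
            if math.hypot(x - cx, y - cy) <= radius_m]

# Three submap centres; only the first two are within 100 m of the prior.
submaps = [("a", (0.0, 0.0)), ("b", (50.0, 0.0)), ("c", (400.0, 0.0))]
print(nearby_submaps(submaps, (10.0, 0.0)))  # ['a', 'b']
```

Restricting the search this way keeps per-request cost roughly constant as the mapped area grows, which is what makes city-scale operation feasible.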


Localisation of mobile phones over the network

The process of continuously localising an edge device in the map over the network. Recently captured images are transmitted over the network to determine their global position and to compute the optimal offset of the local visual-odometry frame of reference.

Applications


ISMAR 2020 talk


Dataset request


Cite

@inproceedings{platinsky2020collaborative,
  title={Collaborative Augmented Reality on Smartphones via Life-long City-scale Maps},
  author={Platinsky, Lukas and Szabados, Michal and Hlasek, Filip and Hemsley, Ross and Del Pero, Luca and Pancik, Andrej and Baum, Bryan and Grimmett, Hugo and Ondruska, Peter},
  booktitle={2020 IEEE International Symposium on Mixed and Augmented Reality (ISMAR)},
  pages={533--541},
  year={2020},
  organization={IEEE}
}

Credits

This work was done at Blue Vision Labs (acquired by Lyft Level 5).
