
Spatial-Temporal Calibration for Outdoor Location-Based Augmented Reality

by Lorenzo Orlandi, Kevin Depedri, Nicola Conci

This work is the result of a collaboration between Arcoda s.r.l. and the MMLab group

This research is supported by the project DIMOTY, funded by the Autonomous Province of Trento under the LP6/99 framework

This paper has been accepted for publication in the IEEE Sensors Journal (2024).

In this paper we tackle the problem of accuracy in AR solutions developed using the location-based approach, where the pose of the AR contents to be visualized is expressed in geographic coordinates. More in detail, we propose a novel procedure to perform the spatial calibration between the smartphone used to visualize the AR contents and an external GNSS-RTK receiver, used to acquire the position of the user in the world with centimeter accuracy. We then address the issues related to the temporal calibration between the two devices, proposing an experimental solution to model the delay between them and to mitigate it. Finally, we describe the entire pipeline used to visualize AR contents based on their geographic coordinates, explaining how each part of it works.
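To make these two steps concrete, the snippet below is a minimal, hypothetical sketch of how a precomputed 4x4 spatial calibration matrix and a constant temporal delay could be applied to a GNSS-RTK track. The names T_phone_antenna and delay_s are illustrative assumptions; this is not the repository's code.

import numpy as np

def compensate_delay(t_phone, t_gnss, p_gnss, delay_s):
    """Linearly interpolate GNSS positions (N, 3) at t_phone - delay_s.

    Assumes t_gnss is sorted in increasing order and delay_s is a
    constant device-to-device latency estimated offline.
    """
    t_shifted = np.asarray(t_phone) - delay_s
    return np.stack(
        [np.interp(t_shifted, t_gnss, p_gnss[:, i]) for i in range(3)], axis=1
    )

def antenna_to_phone(p_antenna, T_phone_antenna):
    """Map antenna positions (N, 3) into the smartphone frame via a 4x4 transform."""
    homogeneous = np.hstack([p_antenna, np.ones((len(p_antenna), 1))])
    return (T_phone_antenna @ homogeneous.T).T[:, :3]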

Abstract

The 3D digitalization of contents and their visualization using Augmented Reality (AR) have gained significant interest within the scientific community. Researchers from various fields have recognized the potential of these technologies and have been actively exploring their applications and implications. The potential lies in the ability to provide users with easy access to digitized information by seamlessly integrating contents directly into their field of view. One of the most promising approaches for outdoor scenarios is the so-called location-based AR, where contents are displayed by leveraging satellite positioning (e.g., GNSS) combined with inertial sensors (e.g., IMUs). Although the application fields are numerous, the accuracy of the overlay of the additional contents still hinders a widespread adoption of such technologies. In this paper we propose the combination of a GNSS device equipped with real-time kinematic positioning (RTK) and a regular smartphone, implementing a novel offline calibration process that relies on a motion capture (MoCap) system. The proposed solution is capable of ensuring temporal consistency and allows for real-time acquisition at centimeter-level accuracy.

Video results

The first two videos attached to this repository show the impact of the spatial calibration and of the temporal calibration on the visualization of holograms. The third video shows how the angular error is computed (a minimal illustrative sketch of such a metric follows the video list below). The fourth video shows how it is possible to acquire a geo-referenced dataset by reversing the pipeline discussed in the paper; such a dataset can then be used to reconstruct a 3D model of the acquired scene which, thanks to the camera poses of the images, includes both its scale information and the pose needed to geo-localize it. Finally, the fifth video shows how the reconstructed model can be visualized in AR by exploiting the previously defined pipeline.

Click one of the following pictures to be redirected to the corresponding video

Effect of the spatial calibration

Spatial calibration

Effect of the temporal calibration

Temporal calibration

Computation of the angular error

Angular Error Computation

Acquisition of geo-referenced images to build geo-referenced 3D models

Dataset acquisition

Visualization of 3D reconstructed models

3D model visualization
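As a point of reference for the angular-error video above, the following is a minimal sketch of one common way to quantify an angular error, namely the angle between an estimated and a ground-truth 3D direction. This is an illustrative assumption, not necessarily the exact definition used in the paper.

import numpy as np

def angular_error_deg(v_est, v_gt):
    """Angle, in degrees, between an estimated and a ground-truth 3D direction."""
    v_est = np.asarray(v_est, dtype=float)
    v_gt = np.asarray(v_gt, dtype=float)
    cos_angle = np.dot(v_est, v_gt) / (np.linalg.norm(v_est) * np.linalg.norm(v_gt))
    # Clip to guard against floating-point values slightly outside [-1, 1]
    return np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0)))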

Software implementation

All source code used to generate the results and figures in the paper can be found in the folders of this repository. More in detail, the script used to compute the calibration matrix between the GNSS-RTK receiver and the smartphone is located in the folder spatial_calibration. Likewise, the notebooks used to model the temporal delay between the two devices can be found in the folder temporal_calibration. The folder general_data_processing contains additional notebooks, not discussed in the paper, that are used to process the raw data coming from the two devices.
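For reference, here is a minimal sketch of one standard way to estimate such a calibration matrix from two sets of corresponding 3D points (for example, trajectories of the two devices tracked by the MoCap system), using the Kabsch/Umeyama SVD alignment. It is an illustration of the general technique, not necessarily the method implemented in spatial_calibration.

import numpy as np

def rigid_transform(A, B):
    """Estimate the 4x4 rigid transform mapping points A onto points B.

    A, B: (N, 3) arrays of corresponding 3D points (N >= 3, non-collinear).
    Returns T such that T @ [a; 1] ~= [b; 1] in the least-squares sense.
    """
    A, B = np.asarray(A, float), np.asarray(B, float)
    cA, cB = A.mean(axis=0), B.mean(axis=0)
    H = (A - cA).T @ (B - cB)                 # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))    # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cB - R @ cA
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t
    return T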

All calculations and figure generation run inside Jupyter notebooks or Python scripts, developed with the PyCharm IDE. The data used for each calibration procedure is provided in the corresponding files subfolder.

Getting the code

You can download a copy of all the files in this repository by cloning the git repository:

git clone https://github.com/KevinDepedri/Spatial-Temporal-Calibration-for-Outdoor-Location-Based-Augmented-Reality

or download a zip archive.

Dependencies

You'll need a working Python environment to run the code. The required dependencies are specified in the file requirements.txt.

You can create a separate environment and install all the required dependencies by running the following commands in the repository folder (where requirements.txt is located):

python -m venv venv
venv/Scripts/pip install -r requirements.txt
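
Note that the venv/Scripts path applies to Windows; on Linux and macOS the equivalent is venv/bin/pip.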

Contacts

Lorenzo Orlandi - lorenzo.orlandi@{arcoda.it,unitn.it}

Kevin Depedri - kevin.depedri@studenti.unitn.it

Nicola Conci - nicola.conci@unitn.it
