CodeMapping: real-time dense mapping for sparse SLAM using compact scene representations
File | Description | Size | Format
---|---|---|---
2107.08994.pdf | Accepted version | 9.64 MB | Adobe PDF
Title: CodeMapping: real-time dense mapping for sparse SLAM using compact scene representations
Authors: Matsuki, H; Scona, R; Czarnowski, J; Davison, AJ
Item Type: Journal Article
Abstract: We propose a novel dense mapping framework for sparse visual SLAM systems which leverages a compact scene representation. State-of-the-art sparse visual SLAM systems provide accurate and reliable estimates of the camera trajectory and locations of landmarks. While these sparse maps are useful for localization, they cannot be used for other tasks such as obstacle avoidance or scene understanding. In this letter we propose a dense mapping framework to complement sparse visual SLAM systems which takes as input the camera poses, keyframes and sparse points produced by the SLAM system and predicts a dense depth image for every keyframe. We build on CodeSLAM [1] and use a variational autoencoder (VAE) which is conditioned on intensity, sparse depth and reprojection error images from sparse SLAM to predict an uncertainty-aware dense depth map. The use of a VAE then enables us to refine the dense depth images through multi-view optimization which improves the consistency of overlapping frames. Our mapper runs in a separate thread in parallel to the SLAM system in a loosely coupled manner. This flexible design allows for integration with arbitrary metric sparse SLAM systems without delaying the main SLAM process. Our dense mapper can be used not only for local mapping but also globally consistent dense 3D reconstruction through TSDF fusion. We demonstrate our system running with ORB-SLAM3 and show accurate dense depth estimation which could enable applications such as robotics and augmented reality.
Issue Date: 1-Oct-2021
Date of Acceptance: 28-Jun-2021
URI: http://hdl.handle.net/10044/1/91715
DOI: 10.1109/LRA.2021.3097258
ISSN: 2377-3766
Publisher: Institute of Electrical and Electronics Engineers
Start Page: 7105
End Page: 7112
Journal / Book Title: IEEE Robotics and Automation Letters
Volume: 6
Issue: 4
Copyright Statement: © 2021 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
Sponsor/Funder: Dyson Technology Limited; Engineering & Physical Sciences Research Council (EPSRC)
Funder's Grant Number: PO 4500501004; EP/S036636/1
Keywords: Science & Technology; Technology; Robotics; SLAM; mapping; vision-based navigation; 0913 Mechanical Engineering
Publication Status: Published
Online Publication Date: 2021-07-14
Appears in Collections: Computing
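
The abstract above describes two main ideas: a decoder, conditioned on intensity, sparse depth and reprojection-error images from the sparse SLAM system, that maps a compact latent code to an uncertainty-aware dense depth map, and a refinement stage that optimizes the codes so overlapping keyframes stay consistent. The sketch below is a minimal, hypothetical PyTorch illustration of that pattern, not the authors' released code; the network sizes, class names, and the sparse-point consistency loss are illustrative assumptions only (the paper's multi-view optimization is richer).

```python
# Hypothetical sketch of a code-conditioned depth decoder plus code refinement,
# loosely in the spirit of CodeSLAM-style compact scene representations.
# Not the authors' implementation; all sizes and the loss are assumptions.
import torch
import torch.nn as nn

class ConditionedDepthDecoder(nn.Module):
    """Decode a compact latent code into dense depth, conditioned on keyframe images."""
    def __init__(self, code_dim: int = 32):
        super().__init__()
        # Conditioning input: intensity (1ch) + sparse depth (1ch) + reprojection error (1ch)
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
        )
        self.code_proj = nn.Linear(code_dim, 32)
        self.decoder = nn.Sequential(
            nn.Conv2d(64, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 2, 3, padding=1),  # channel 0: depth, channel 1: log-variance
        )

    def forward(self, code: torch.Tensor, conditioning: torch.Tensor):
        feats = self.encoder(conditioning)                        # (B, 32, H, W)
        b, _, h, w = feats.shape
        code_map = self.code_proj(code).view(b, 32, 1, 1).expand(b, 32, h, w)
        out = self.decoder(torch.cat([feats, code_map], dim=1))   # (B, 2, H, W)
        depth = torch.nn.functional.softplus(out[:, :1])          # positive depth
        variance = torch.exp(out[:, 1:])                          # per-pixel uncertainty
        return depth, variance

def refine_codes(decoder, codes, conditioning, sparse_depth, sparse_mask, steps=50):
    """Refine per-keyframe codes so decoded depth agrees with the sparse SLAM points
    (a simplified stand-in for the paper's multi-view consistency optimization)."""
    codes = codes.clone().requires_grad_(True)
    opt = torch.optim.Adam([codes], lr=1e-2)
    for _ in range(steps):
        depth, var = decoder(codes, conditioning)
        residual = (depth - sparse_depth) * sparse_mask
        # Uncertainty-weighted squared error, evaluated only at sparse landmark pixels.
        loss = (residual.pow(2) / var + torch.log(var))[sparse_mask.bool()].mean()
        opt.zero_grad()
        loss.backward()
        opt.step()
    return codes.detach()

if __name__ == "__main__":
    B, H, W, D = 2, 64, 64, 32
    decoder = ConditionedDepthDecoder(code_dim=D)
    conditioning = torch.rand(B, 3, H, W)                    # intensity, sparse depth, reproj. error
    sparse_depth = torch.rand(B, 1, H, W) * 5.0
    sparse_mask = (torch.rand(B, 1, H, W) < 0.02).float()    # few landmark pixels per keyframe
    codes = torch.zeros(B, D)                                # zero code = prior mean
    refined = refine_codes(decoder, codes, conditioning, sparse_depth, sparse_mask)
    depth, _ = decoder(refined, conditioning)
    print(depth.shape)                                       # torch.Size([2, 1, 64, 64])
```

In the paper's loosely coupled design, this kind of refinement runs in a mapping thread alongside the sparse SLAM system, so the main tracking and sparse optimization are never delayed.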