DTAM: Dense tracking and mapping in real-time

Richard A. Newcombe, Steven J. Lovegrove, Andrew J. Davison
Department of Computing, Imperial College London, UK
IEEE International Conference on Computer Vision (ICCV 2011), 2011


@inproceedings{newcombe2011dtam,
   title={DTAM: Dense tracking and mapping in real-time},
   author={Newcombe, R.A. and Lovegrove, S. and Davison, A.J.},
   booktitle={Proc. of the Intl. Conf. on Computer Vision (ICCV), Barcelona, Spain},
   year={2011}
}







DTAM is a system for real-time camera tracking and reconstruction which relies not on feature extraction but on dense, every-pixel methods. As a single hand-held RGB camera flies over a static scene, we estimate detailed textured depth maps at selected keyframes to produce a surface patchwork with millions of vertices. We use the hundreds of images available in a video stream to improve the quality of a simple photometric data term, and minimise a global spatially regularised energy functional in a novel non-convex optimisation framework. Interleaved with this, we track the camera's 6DOF motion precisely by frame-rate whole-image alignment against the entire dense model. Our algorithms are highly parallelisable throughout, and DTAM achieves real-time performance using current commodity GPU hardware. We demonstrate that a dense model permits superior tracking performance under rapid motion compared to a state-of-the-art method using features, and also show the additional usefulness of the dense model for real-time scene interaction in a physics-enhanced augmented reality application.
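The photometric data term the abstract mentions can be pictured as a plane-sweep cost volume: each reference pixel is back-projected at a set of inverse-depth hypotheses, warped into a comparison frame, and scored by intensity difference. The NumPy sketch below is a minimal, illustrative version of that idea, not DTAM's GPU implementation: it uses a single comparison frame, nearest-neighbour sampling, and an absolute-difference cost, and the function name, interface, and `bad_cost` penalty are assumptions. DTAM averages this term over hundreds of frames and then minimises the regularised energy over the volume.

```python
import numpy as np

def photometric_cost_volume(I_ref, I_m, K, R, t, inv_depths, bad_cost=1.0):
    """Accumulate a per-pixel photometric data term over inverse-depth hypotheses.

    For each hypothesis d, every reference pixel is back-projected to depth 1/d,
    transformed into the comparison camera by (R, t), reprojected with K, and
    compared photometrically (absolute intensity difference, nearest-neighbour
    sampling). Pixels warping outside I_m keep a fixed penalty `bad_cost`.
    Returns the cost volume C[s, v, u] and the per-pixel argmin depth index.
    """
    H, W = I_ref.shape
    us, vs = np.meshgrid(np.arange(W), np.arange(H))
    # Homogeneous pixel coordinates (3 x N) and back-projected viewing rays.
    pix = np.stack([us.ravel(), vs.ravel(), np.ones(H * W)])
    rays = np.linalg.inv(K) @ pix
    C = np.full((len(inv_depths), H, W), bad_cost)
    ref_flat = I_ref.ravel()
    for s, d in enumerate(inv_depths):
        X = rays / d                      # 3-D points at depth 1/d
        Xm = R @ X + t[:, None]           # into the comparison camera frame
        p = K @ Xm                        # reproject
        u2 = np.round(p[0] / p[2]).astype(int)
        v2 = np.round(p[1] / p[2]).astype(int)
        valid = (p[2] > 0) & (u2 >= 0) & (u2 < W) & (v2 >= 0) & (v2 < H)
        costs = C[s].ravel()              # writable view into C
        costs[valid] = np.abs(ref_flat[valid] - I_m[v2[valid], u2[valid]])
    return C, C.argmin(axis=0)
```

With identity intrinsics and a pure x-translation, a fronto-parallel scene at the true inverse depth warps exactly onto the comparison image, so the cost volume's argmin recovers that depth index for all validly warped pixels.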

