Virtual Viewpoint Disparity Estimation and Convergence Check for Real-Time View Synthesis
Gwangju Institute of Science and Technology (GIST), 261 Cheomdan-gwagiro, Buk-gu, Gwangju 500-712, Korea
Advances in Image and Video Technology, Lecture Notes in Computer Science, Volume 7087/2012, 121-131, 2012
@article{shin2012virtual,
title={Virtual Viewpoint Disparity Estimation and Convergence Check for Real-Time View Synthesis},
author={Shin, I.Y. and Ho, Y.S.},
journal={Advances in Image and Video Technology},
pages={121--131},
year={2012},
publisher={Springer}
}
In this paper, we propose a new method for real-time disparity estimation and intermediate view synthesis from stereoscopic images. Some 3D video systems employ both the left and right depth images for virtual view synthesis; however, we estimate only a single disparity map defined at the virtual viewpoint. In addition, we utilize hierarchical belief propagation and a convergence check to find the global solution rapidly. In order to use the virtual viewpoint disparity map for intermediate view synthesis, we build an occlusion map that describes the occlusion information of the virtual viewpoint region with respect to the reference images. We have also implemented the entire system using GPU programming to synthesize virtual viewpoint images in real time.
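The paper itself contains the implementation details; as a rough illustration of the convergence-check idea only, the following CUDA sketch performs a winner-take-all disparity readout after each belief-propagation sweep and stops early once few pixels change between iterations. All names, the cost-volume layout, the omitted message-update step, and the 1% change threshold are assumptions made for this sketch, not the authors' implementation.

#include <cuda_runtime.h>
#include <cstdio>

#define NUM_DISP 64  // assumed disparity search range

// Winner-take-all readout of the current beliefs: each pixel picks the
// disparity with the lowest aggregated cost (data cost plus incoming
// messages, assumed to be pre-summed into the 'cost' volume).
__global__ void readoutDisparity(const float* cost,        // [numPixels * NUM_DISP]
                                 unsigned char* disparity, // [numPixels]
                                 int numPixels)
{
    int p = blockIdx.x * blockDim.x + threadIdx.x;
    if (p >= numPixels) return;

    float bestCost = cost[p * NUM_DISP];
    int bestDisp = 0;
    for (int d = 1; d < NUM_DISP; ++d) {
        float c = cost[p * NUM_DISP + d];
        if (c < bestCost) { bestCost = c; bestDisp = d; }
    }
    disparity[p] = (unsigned char)bestDisp;
}

// Count how many pixels changed their disparity since the previous iteration.
__global__ void countChanged(const unsigned char* cur, const unsigned char* prev,
                             int numPixels, unsigned int* changed)
{
    int p = blockIdx.x * blockDim.x + threadIdx.x;
    if (p < numPixels && cur[p] != prev[p])
        atomicAdd(changed, 1u);
}

// Host-side loop: run belief-propagation iterations and stop early once fewer
// than 1% of pixels change between successive readouts (assumed criterion).
void estimateDisparity(float* d_cost, unsigned char* d_disp, unsigned char* d_prev,
                       int width, int height, int maxIters)
{
    int numPixels = width * height;
    unsigned int* d_changed = nullptr;
    cudaMalloc(&d_changed, sizeof(unsigned int));

    int threads = 256;
    int blocks = (numPixels + threads - 1) / threads;

    for (int it = 0; it < maxIters; ++it) {
        // updateMessages(...);  // one hierarchical BP message-passing sweep
                                 // would run here; omitted in this sketch

        readoutDisparity<<<blocks, threads>>>(d_cost, d_disp, numPixels);

        cudaMemset(d_changed, 0, sizeof(unsigned int));
        countChanged<<<blocks, threads>>>(d_disp, d_prev, numPixels, d_changed);

        unsigned int changed = 0;
        cudaMemcpy(&changed, d_changed, sizeof(unsigned int), cudaMemcpyDeviceToHost);
        printf("iteration %d: %u pixels changed\n", it, changed);

        if (changed < (unsigned int)(0.01f * numPixels))
            break;  // converged: stop iterating early

        cudaMemcpy(d_prev, d_disp, numPixels, cudaMemcpyDeviceToDevice);
    }
    cudaFree(d_changed);
}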