High speed view interpolation for tele-teaching and tele-conferencing
ESAT/PSI-VISICS, Katholieke Universiteit, Leuven, Belgium
Proceedings of 2004 International Symposium on Intelligent Multimedia, Video and Speech Processing, 2004
@inproceedings{geys2004high,
title={High speed view interpolation for tele-teaching and tele-conferencing},
author={Geys, I. and Van Gool, L.},
booktitle={Intelligent Multimedia, Video and Speech Processing, 2004. Proceedings of 2004 International Symposium on},
pages={430--433},
year={2004},
organization={IEEE}
}
This paper presents an algorithm to generate an interpolated view between two camera viewpoints in a fast and automatic way (6-7 fps on a Pentium IV @ 2.6 GHz with a GeForce FX on AGP 4x). Nothing more than a desktop PC and a set of low-end consumer-grade cameras is needed to simulate the video stream of any intermediate camera. The GPU (a ‘plane sweep’ algorithm) and the CPU (a ‘min-cut/max-flow’ regularisation algorithm) are used in parallel to calculate the depth values. The final interpolations for any intermediate camera position are obtained by a projectively correct blended warp of the input images onto a 3D mesh. Limited extrapolation is also feasible. The goal is to develop more advanced tele-teaching and video-conferencing environments without the need for many cameras. Camera movements can be simulated and the best view can be selected, whether or not it was recorded by a real camera. Compared to putting a human editor in control, the cost decreases dramatically, without losing all of the added value of video stream editing.
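To make the depth-estimation step concrete, below is a minimal CPU sketch of the plane-sweep idea described in the abstract: for each candidate depth plane, the second view is warped into the reference view via the plane-induced homography and a photo-consistency cost is accumulated, with the cheapest plane winning per pixel. The paper runs this step on the GPU and then regularises the raw result with min-cut/max-flow on the CPU; the function name, the use of OpenCV for warping, the absolute-difference cost and the calibration inputs here are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
import cv2  # used only for the perspective warp in this sketch


def plane_sweep_depth(img_left, img_right, K, R, t, depths):
    """Winner-take-all plane-sweep depth map for the left (reference) view.

    img_left, img_right : grayscale float32 images from the two cameras
    K                   : 3x3 intrinsic matrix (both cameras assumed identical)
    R, t                : rotation / translation of the right camera relative
                          to the left (assumed known from calibration)
    depths              : iterable of candidate fronto-parallel plane depths
    """
    h, w = img_left.shape
    best_cost = np.full((h, w), np.inf, dtype=np.float32)
    best_depth = np.zeros((h, w), dtype=np.float32)
    n = np.array([0.0, 0.0, 1.0])  # normal of the swept fronto-parallel planes

    for d in depths:
        # Homography induced by the plane Z = d, mapping left-image pixels
        # to right-image pixels: H = K (R + t n^T / d) K^{-1}
        H = K @ (R + np.outer(t, n) / d) @ np.linalg.inv(K)
        # Warp the right image into the left view under this plane hypothesis
        warped = cv2.warpPerspective(
            img_right, H, (w, h),
            flags=cv2.INTER_LINEAR | cv2.WARP_INVERSE_MAP)
        # Photo-consistency cost: per-pixel absolute intensity difference
        cost = np.abs(img_left - warped)
        mask = cost < best_cost
        best_cost[mask] = cost[mask]
        best_depth[mask] = d

    # Raw winner-take-all result; the paper regularises this depth map
    # with a min-cut/max-flow step before the blended warp.
    return best_depth
```

A typical call would sweep a few dozen planes between a near and a far depth, e.g. `plane_sweep_depth(left, right, K, R, t, np.linspace(0.5, 5.0, 64))`, trading depth resolution against the per-frame budget needed for the reported 6-7 fps.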