Color and motion-based particle filter target tracking in a network of overlapping cameras with multi-threading and GPGPU
Computer Science Group, Centro de Investigación en Matemáticas. Jalisco S/N, Valenciana, Guanajuato, Gto., Mexico
Acta Universitaria, Vol. 23(1), 2013
@article{madrigal2013color,
title={Color and motion-based particle filter target tracking in a network of overlapping cameras with multi-threading and GPGPU},
author={Madrigal, Francisco and Hayet, Jean-Bernard},
journal={Acta Universitaria},
volume={23},
number={1},
pages={9--16},
year={2013},
publisher={Universidad de Guanajuato}
}
This paper describes an efficient implementation of multiple-target, multiple-view tracking in video-surveillance sequences. It takes advantage of the capabilities of multi-core Central Processing Units (CPUs) and of Graphics Processing Units (GPUs) under the Compute Unified Device Architecture (CUDA) framework. The principle of our algorithm is 1) in each video sequence, to track every person with an independent particle filter, and 2) to fuse the tracking results across all sequences. Particle filters belong to the category of recursive Bayesian filters: they update a Monte-Carlo representation of the posterior distribution over the target position and velocity. For this purpose, they combine a probabilistic motion model, i.e. prior knowledge about how targets move (e.g. constant velocity), with a likelihood model associated to the observations on targets. At this first level of single video sequences, the multi-threading library Threading Building Blocks (TBB) is used to parallelize the processing of the per-target independent particle filters. At the higher level, we rely on General-Purpose computing on Graphics Processing Units (GPGPU) through CUDA to fuse the target-tracking data collected from the multiple video sequences by solving the data association problem. Tracking results are presented on several challenging tracking datasets.
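The predict/update/resample cycle of the recursive Bayesian filter described above can be sketched as follows. This is a minimal illustrative example, not the paper's implementation: the state, noise parameters, and the `toy_likelihood` stand-in (which replaces the paper's color- and motion-based likelihood) are all assumptions made for the sketch.

```python
import math
import random

random.seed(0)

def predict(particles, dt=1.0, noise=1.0):
    """Constant-velocity motion prior: state is (x, y, vx, vy)."""
    out = []
    for x, y, vx, vy in particles:
        x += dt * vx + random.gauss(0.0, noise)
        y += dt * vy + random.gauss(0.0, noise)
        vx += random.gauss(0.0, 0.1 * noise)
        vy += random.gauss(0.0, 0.1 * noise)
        out.append((x, y, vx, vy))
    return out

def toy_likelihood(p):
    # Stand-in for the paper's color/motion observation model:
    # a single target observed near position (5, 5).
    d2 = (p[0] - 5.0) ** 2 + (p[1] - 5.0) ** 2
    return math.exp(-0.5 * d2 / 4.0)

def update(particles, weights):
    """Reweight each particle by the observation likelihood, then normalize."""
    w = [wi * toy_likelihood(p) for p, wi in zip(particles, weights)]
    s = sum(w)
    return [wi / s for wi in w]

def resample(particles, weights):
    """Systematic resampling to counter weight degeneracy."""
    n = len(particles)
    cum, c = [], 0.0
    for w in weights:
        c += w
        cum.append(c)
    start = random.random() / n
    out, i = [], 0
    for k in range(n):
        u = start + k / n
        while i < n - 1 and cum[i] < u:
            i += 1
        out.append(particles[i])
    return out, [1.0 / n] * n

# Run one filter for one target; the paper runs one such filter per person,
# in parallel, across each camera's video sequence.
particles = [(random.gauss(0, 3), random.gauss(0, 3), 0.0, 0.0)
             for _ in range(500)]
weights = [1.0 / 500] * 500
for _ in range(10):
    particles = predict(particles)
    weights = update(particles, weights)
    particles, weights = resample(particles, weights)

est_x = sum(w * p[0] for w, p in zip(weights, particles))
est_y = sum(w * p[1] for w, p in zip(weights, particles))
```

After a few iterations the weighted posterior mean `(est_x, est_y)` concentrates near the observed target position; in the paper, each such per-target filter is an independent task, which is what makes the TBB-level parallelization straightforward.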
May 7, 2013 by hgpu