GPU support for batch oriented workloads
Electrical & Computer Engineering Department, University of British Columbia, Vancouver, BC, Canada
In 2009 IEEE 28th International Performance Computing and Communications Conference (December 2009), pp. 231-238.
@conference{costa2010gpu,
title={GPU support for batch oriented workloads},
author={Costa, L.B. and Al-Kiswany, S. and Ripeanu, M.},
booktitle={Performance Computing and Communications Conference (IPCCC), 2009 IEEE 28th International},
pages={231--238},
year={2010},
organization={IEEE}
}
This paper explores the ability to use graphics processing units (GPUs) as co-processors to harness the inherent parallelism of batch operations in systems that require high performance. To this end, we have chosen Bloom filters (space-efficient data structures that support the probabilistic representation of set membership), since the queries these data structures support are often performed in batches. Bloom filters exhibit low computational cost per amount of data processed, providing a baseline for more complex batch operations. We implemented BloomGPU, a library that supports offloading Bloom filter operations to the GPU, and evaluated this library under realistic usage scenarios. By completely offloading Bloom filter operations to the GPU, BloomGPU outperforms an optimized CPU implementation of the Bloom filter as the workload becomes larger.
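For illustration only, the following is a minimal CUDA sketch of the general idea of batched Bloom filter queries on a GPU: one thread per queried key, each probing k bits of a shared bit array. The kernel name, hash function, and parameters are assumptions chosen for this example and are not the BloomGPU API described in the paper.

#include <cstdint>
#include <cstdio>
#include <vector>
#include <cuda_runtime.h>

// Simple multiplicative mixing hash; stands in for whatever hash functions
// a real Bloom filter implementation would use (assumption for illustration).
__host__ __device__ uint32_t hash_mix(uint32_t x, uint32_t seed) {
    x ^= seed;
    x *= 0x9E3779B1u;
    x ^= x >> 16;
    return x;
}

// Batched membership test: one thread per key, probing num_hashes bits.
__global__ void bloom_batch_query(const uint32_t *bits, uint32_t num_bits,
                                  const uint32_t *keys, int num_keys,
                                  int num_hashes, uint8_t *results) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= num_keys) return;

    uint32_t key = keys[i];
    uint8_t member = 1;
    for (int h = 0; h < num_hashes; ++h) {
        uint32_t bit = hash_mix(key, h) % num_bits;
        // A key is (probabilistically) present only if every probed bit is set.
        if (!(bits[bit / 32] & (1u << (bit % 32)))) { member = 0; break; }
    }
    results[i] = member;  // 1 = possibly in set, 0 = definitely not
}

int main() {
    const uint32_t num_bits = 1u << 20;   // 1 Mbit filter (example size)
    const int num_hashes = 4;

    // Build a small filter on the host and insert a few example keys.
    std::vector<uint32_t> bits(num_bits / 32, 0);
    std::vector<uint32_t> keys = {7, 42, 1000, 99999};
    for (uint32_t k : keys)
        for (int h = 0; h < num_hashes; ++h) {
            uint32_t b = hash_mix(k, h) % num_bits;
            bits[b / 32] |= 1u << (b % 32);
        }

    // Copy the filter and the query batch to the device; launch one thread per query.
    uint32_t *d_bits, *d_keys; uint8_t *d_res;
    cudaMalloc(&d_bits, bits.size() * sizeof(uint32_t));
    cudaMalloc(&d_keys, keys.size() * sizeof(uint32_t));
    cudaMalloc(&d_res, keys.size());
    cudaMemcpy(d_bits, bits.data(), bits.size() * sizeof(uint32_t), cudaMemcpyHostToDevice);
    cudaMemcpy(d_keys, keys.data(), keys.size() * sizeof(uint32_t), cudaMemcpyHostToDevice);

    int n = (int)keys.size();
    bloom_batch_query<<<(n + 255) / 256, 256>>>(d_bits, num_bits, d_keys, n, num_hashes, d_res);

    std::vector<uint8_t> res(n);
    cudaMemcpy(res.data(), d_res, n, cudaMemcpyDeviceToHost);
    for (int i = 0; i < n; ++i)
        printf("key %u -> %s\n", keys[i], res[i] ? "possibly in set" : "definitely not");

    cudaFree(d_bits); cudaFree(d_keys); cudaFree(d_res);
    return 0;
}

The paper's point is that such batches are embarrassingly parallel and cheap per element, which is why the GPU advantage over an optimized CPU implementation grows with batch size.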
October 28, 2010 by hgpu