A Short Note on Gaussian Process Modeling for Large Datasets using Graphics Processing Units
Department of Mathematics and Statistics, Acadia University, NS, Canada
Acadia University, 2011
@article{franey2011short,
  title={A Short Note on Gaussian Process Modeling for Large Datasets using Graphics Processing Units},
  author={Franey, M. and Ranjan, P. and Chipman, H.},
  year={2011}
}
The graphics processing unit (GPU) has emerged as a powerful and cost-effective processor for high performance computing. GPUs are capable of an order of magnitude more floating-point operations per second than modern central processing units (CPUs), and thus hold great promise for computationally intensive statistical applications (Brodtkorb et al. 2010). Fitting complex statistical models with a large number of parameters and/or for large datasets is often computationally very expensive. In this study, we focus on Gaussian process (GP) models – statistical models commonly used for emulating computer simulators that are too expensive or time-consuming to observe (Sacks et al. 1989). We demonstrate that the computational cost of implementing GP models can be significantly reduced by using a CPU+GPU heterogeneous computing system rather than an analogous implementation on a traditional computing system without GPU acceleration (i.e., CPU only).
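The dominant cost in fitting a GP model of this kind is the repeated factorization of the n x n correlation matrix inside the likelihood, which is the step that benefits most from GPU-accelerated linear algebra. Below is a minimal sketch (not the authors' code) of the profile log-likelihood for a GP with a constant mean and a Gaussian correlation function, written with JAX so that the O(n^3) Cholesky factorization runs on a GPU when one is available; the function and variable names (corr_matrix, neg_log_lik, theta) are illustrative assumptions, not taken from the paper.

import jax
import jax.numpy as jnp

def corr_matrix(X, theta, nugget=1e-8):
    # Gaussian correlation: R_ij = exp(-sum_k theta_k * (x_ik - x_jk)^2)
    d2 = (X[:, None, :] - X[None, :, :]) ** 2          # pairwise squared distances
    R = jnp.exp(-jnp.sum(theta * d2, axis=-1))
    return R + nugget * jnp.eye(X.shape[0])             # small nugget for numerical stability

def neg_log_lik(theta, X, y):
    # Profile (negative) log-likelihood with mu and sigma^2 estimated in closed form
    n = y.shape[0]
    R = corr_matrix(X, theta)
    L = jnp.linalg.cholesky(R)                           # O(n^3): the GPU-friendly bottleneck
    ones = jnp.ones(n)
    Rinv_y = jax.scipy.linalg.cho_solve((L, True), y)
    Rinv_1 = jax.scipy.linalg.cho_solve((L, True), ones)
    mu_hat = (ones @ Rinv_y) / (ones @ Rinv_1)           # GLS estimate of the constant mean
    resid = y - mu_hat
    sigma2_hat = (resid @ jax.scipy.linalg.cho_solve((L, True), resid)) / n
    log_det_R = 2.0 * jnp.sum(jnp.log(jnp.diag(L)))      # log|R| from the Cholesky factor
    return 0.5 * (n * jnp.log(sigma2_hat) + log_det_R)

Because the likelihood is written in JAX, jax.grad(neg_log_lik) can supply gradients for optimizing the correlation parameters theta, and the same code dispatches to GPU or CPU transparently, which mirrors the CPU-versus-CPU+GPU comparison the note describes.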
November 26, 2011 by hgpu