Scalability Analysis of Synchronous Data-Parallel Artificial Neural Network (ANN) Learners
Virginia Tech, 2018
@phdthesis{sun2018scalability,
  title  = {Scalability Analysis of Synchronous Data-Parallel Artificial Neural Network (ANN) Learners},
  author = {Sun, Chang},
  year   = {2018},
  school = {Virginia Tech}
}
Artificial Neural Networks (ANNs) have been established as one of the most important algorithmic tools in the Machine Learning (ML) toolbox over the past few decades. ANNs’ recent rise to widespread acceptance can be attributed to two developments: (1) the availability of large-scale training and testing datasets; and (2) the availability of new computer architectures for which ANN implementations are orders of magnitude more efficient. In this thesis, I present research on two aspects of the second development. First, I present a portable, open source implementation of ANNs in OpenCL and MPI. Second, I present performance and scaling models for ANN algorithms on state-of-the-art Graphics Processing Unit (GPU) based parallel compute clusters.
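To make the synchronous data-parallel setting in the title concrete, here is a minimal illustrative sketch (not code from the thesis): each worker computes a gradient on its own data shard, the gradients are averaged in a synchronous step (the role an `MPI_Allreduce` plays on a cluster), and one shared update is applied. The toy 1-D linear model, shard layout, and learning rate are all assumptions for illustration only.

```python
# Illustrative sketch (not from the thesis): synchronous data-parallel SGD
# on a 1-D linear model y = w*x with squared-error loss.

def shard_gradient(w, shard):
    """Mean gradient of 0.5*(w*x - y)^2 over one worker's data shard."""
    g = 0.0
    for x, y in shard:
        g += (w * x - y) * x
    return g / len(shard)

def sync_step(w, shards, lr=0.01):
    """One synchronous step: all workers' gradients are averaged
    (allreduce-style), then a single shared update is applied."""
    grads = [shard_gradient(w, s) for s in shards]  # parallel in practice
    g_avg = sum(grads) / len(grads)                 # synchronous average
    return w - lr * g_avg

# Toy data drawn from y = 2*x, split round-robin across 4 "workers".
data = [(float(x), 2.0 * x) for x in range(1, 9)]
shards = [data[i::4] for i in range(4)]

w = 0.0
for _ in range(200):
    w = sync_step(w, shards)
print(round(w, 3))  # → 2.0
```

Because every worker applies the same averaged gradient, all replicas of `w` stay identical after each step; this is exactly the property that makes the synchronous scheme equivalent to large-batch sequential SGD, at the cost of a barrier per iteration.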
September 23, 2018 by hgpu