
StreamBrain: An HPC Framework for Brain-like Neural Networks on CPUs, GPUs and FPGAs

Artur Podobas, Martin Svedin, Steven W. D. Chien, Ivy B. Peng, Naresh Balaji Ravichandran, Pawel Herman, Anders Lansner, Stefano Markidis
KTH Royal Institute of Technology, Stockholm, Sweden
arXiv:2106.05373 [cs.DC] (9 Jun 2021)

@misc{podobas2021streambrain,
   title={StreamBrain: An HPC Framework for Brain-like Neural Networks on CPUs, GPUs and FPGAs},
   author={Artur Podobas and Martin Svedin and Steven W. D. Chien and Ivy B. Peng and Naresh Balaji Ravichandran and Pawel Herman and Anders Lansner and Stefano Markidis},
   year={2021},
   eprint={2106.05373},
   archivePrefix={arXiv},
   primaryClass={cs.DC}
}

Modern deep learning methods based on backpropagation have surged in popularity and are used across many domains and application areas. At the same time, there are other, less well-known machine learning algorithms with a mature and solid theoretical foundation whose performance remains unexplored. One such example is the brain-like Bayesian Confidence Propagation Neural Network (BCPNN). In this paper, we introduce StreamBrain, a framework that allows neural networks based on BCPNN to be practically deployed on High-Performance Computing systems. StreamBrain is a domain-specific language (DSL), similar in concept to existing machine learning (ML) frameworks, and supports backends for CPUs, GPUs, and even FPGAs. We empirically demonstrate that StreamBrain can train on the well-known MNIST benchmark dataset within seconds, and we are the first to demonstrate BCPNN on STL-10-sized networks. We also show how StreamBrain can train with custom floating-point formats and illustrate the impact of different bfloat variants on BCPNN using FPGAs.
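For readers unfamiliar with BCPNN, the following is a minimal NumPy sketch of the kind of probability-trace learning rule the model family is built on: exponentially decayed traces of pre-, post-, and joint activations, with log-ratio weights and log-probability biases. The class name, time constant, and method signatures are illustrative assumptions for this sketch and do not reflect StreamBrain's actual API.

    # Minimal sketch of a BCPNN-style layer (illustrative only, not StreamBrain's API).
    import numpy as np

    class BCPNNLayer:
        def __init__(self, n_in, n_out, tau=1e-3, eps=1e-8):
            self.tau, self.eps = tau, eps
            # Probability traces for pre-activation, post-activation, and co-activation.
            self.pi = np.full(n_in, eps)
            self.pj = np.full(n_out, eps)
            self.pij = np.full((n_in, n_out), eps)

        def weights(self):
            # Bayesian weights and biases: w_ij = log(p_ij / (p_i * p_j)), b_j = log(p_j).
            w = np.log(self.pij / np.outer(self.pi, self.pj))
            b = np.log(self.pj)
            return w, b

        def forward(self, x):
            # Support values followed by a softmax over the output units.
            w, b = self.weights()
            s = b + x @ w
            e = np.exp(s - s.max())
            return e / e.sum()

        def update(self, x, y):
            # Exponentially weighted moving averages of activation statistics.
            t = self.tau
            self.pi = (1 - t) * self.pi + t * x
            self.pj = (1 - t) * self.pj + t * np.outer(x, y).sum(axis=0) / max(len(x), 1) if False else (1 - t) * self.pj + t * y
            self.pij = (1 - t) * self.pij + t * np.outer(x, y)

A framework such as StreamBrain would map updates of this kind onto CPU, GPU, or FPGA backends and, on FPGAs, could evaluate them in reduced-precision formats such as the bfloat variants discussed in the paper.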
