A survey on FPGA-based accelerator for ML models

Feng Yan, Andreas Koch, Oliver Sinnen
University of Auckland, Auckland, New Zealand
arXiv:2412.15666 [cs.AR] (20 Dec 2024)

@misc{yan2024surveyfpgabasedacceleratorml,
   title={A survey on FPGA-based accelerator for ML models},
   author={Feng Yan and Andreas Koch and Oliver Sinnen},
   year={2024},
   eprint={2412.15666},
   archivePrefix={arXiv},
   primaryClass={cs.AR},
   url={https://arxiv.org/abs/2412.15666}
}

This paper thoroughly surveys the acceleration of machine learning (ML) algorithms on hardware accelerators, focusing on Field-Programmable Gate Arrays (FPGAs). It reviews 287 out of 1138 papers from the past six years, sourced from four top FPGA conferences. This selection underscores the increasing integration of ML and FPGA technologies and their mutual importance to technological advancement. The surveyed research clearly emphasises inference acceleration (81%) over training acceleration (13%). Additionally, the findings reveal that CNNs dominate current FPGA acceleration research, while emerging models such as GNNs show a clear growth trend. The categorization of the FPGA research papers reveals a wide range of topics, demonstrating the growing relevance of ML in FPGA research. This comprehensive analysis provides valuable insights into the current trends and future directions of FPGA research in the context of ML applications.