A Systematic Literature Survey of Sparse Matrix-Vector Multiplication

Jianhua Gao, Bingjie Liu, Weixing Ji, Hua Huang
School of Artificial Intelligence, Beijing Normal University, Beijing 100875, China
arXiv:2404.06047 [cs.DC], (9 Apr 2024)

@misc{gao2024systematic,

   title={A Systematic Literature Survey of Sparse Matrix-Vector Multiplication},

   author={Jianhua Gao and Bingjie Liu and Weixing Ji and Hua Huang},

   year={2024},

   eprint={2404.06047},

   archivePrefix={arXiv},

   primaryClass={cs.DC}

}

Sparse matrix-vector multiplication (SpMV) is a crucial computing kernel with widespread applications in iterative algorithms. Over the past decades, research on SpMV optimization has made remarkable strides, giving rise to a variety of optimization contributions. However, a comprehensive and systematic literature survey that introduces, analyzes, discusses, and summarizes recent advancements in SpMV has been lacking. Aiming to fill this gap, this paper compares existing techniques and analyzes their strengths and weaknesses. We begin by highlighting two representative applications of SpMV, then conduct an in-depth overview of the important techniques that optimize SpMV on modern architectures, which we classify into classic, auto-tuning, machine-learning, and mixed-precision-based optimizations. We also elaborate on hardware-based platforms, including CPU, GPU, FPGA, processing-in-memory, heterogeneous, and distributed platforms. We present a comprehensive experimental evaluation that compares the performance of state-of-the-art SpMV implementations. Based on our findings, we identify several challenges and point out future research directions. This survey is intended to provide researchers with a comprehensive understanding of SpMV optimization on modern architectures and to provide guidance for future work.
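For readers unfamiliar with the kernel the survey studies, a minimal sketch of SpMV (y = A·x) using the Compressed Sparse Row (CSR) format, one of the classic storage schemes such surveys cover, might look like the following. This is an illustrative example, not code from the paper; the function and variable names are our own.

```python
def spmv_csr(values, col_idx, row_ptr, x):
    """Multiply a CSR-stored sparse matrix by a dense vector x.

    values  -- nonzero entries, row by row
    col_idx -- column index of each nonzero
    row_ptr -- row_ptr[i]:row_ptr[i+1] bounds the nonzeros of row i
    """
    n_rows = len(row_ptr) - 1
    y = [0.0] * n_rows
    for i in range(n_rows):
        acc = 0.0
        # Accumulate the dot product of row i with x.
        for k in range(row_ptr[i], row_ptr[i + 1]):
            acc += values[k] * x[col_idx[k]]
        y[i] = acc
    return y

# 3x3 example matrix:
# [[1, 0, 2],
#  [0, 3, 0],
#  [4, 0, 5]]
values  = [1.0, 2.0, 3.0, 4.0, 5.0]
col_idx = [0, 2, 1, 0, 2]
row_ptr = [0, 2, 3, 5]
print(spmv_csr(values, col_idx, row_ptr, [1.0, 1.0, 1.0]))  # [3.0, 3.0, 9.0]
```

The irregular, data-dependent memory accesses in the inner loop (`x[col_idx[k]]`) are what make SpMV hard to optimize and motivate the format, auto-tuning, and hardware-specific work the survey classifies.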

* * *

HGPU group © 2010-2024 hgpu.org

All rights belong to the respective authors