
FastSpMM: An Efficient Library for Sparse Matrix Matrix Product on GPUs

Gloria Ortega, Francisco Vázquez, Inmaculada García, Ester M. Garzón
Department of Informatics, University of Almería, Agrifood Campus of International Excellence (CeiA3), Ctra. Sacramento s/n, 04120 Almería, Spain
The Computer Journal, 2013

@article{ortega2013fastspmm,
   title={FastSpMM: An Efficient Library for Sparse Matrix Matrix Product on GPUs},
   author={Ortega, Gloria and V{\'a}zquez, Francisco and Garc{\'i}a, Inmaculada and Garz{\'o}n, Ester M},
   journal={The Computer Journal},
   year={2013},
   publisher={Br Computer Soc}
}


Sparse matrix matrix (SpMM) multiplication is involved in a wide range of scientific and technical applications. The computational requirements for this kind of operation are enormous, especially for large matrices. This paper analyzes and evaluates a method to efficiently compute the SpMM product in a computing environment that includes graphics processing units (GPUs). Several libraries for computing this matrix operation can be found in the literature. However, our strategy (FastSpMM) outperforms the existing approaches because it combines the ELLPACK-R storage format with the exploitation of the high computation-to-memory-access ratio of the SpMM operation and the overlapping of CPU-GPU communication with computation via Compute Unified Device Architecture (CUDA) streams. In this work, FastSpMM is described and its performance is evaluated against the CUSPARSE library (supplied by NVIDIA), which also includes routines to compute SpMM on GPUs. Experimental evaluation on a representative set of test matrices shows that FastSpMM outperforms both the CUSPARSE routine and an implementation of SpMM as a set of sparse matrix vector products.
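To illustrate the ELLPACK-R storage format mentioned in the abstract, here is a minimal CPU-side sketch in Python. This is not the authors' FastSpMM implementation (which is a CUDA library); the function names and layout are illustrative assumptions. ELLPACK-R pads every row's nonzeros out to the length of the longest row and keeps an auxiliary array of true row lengths, so each GPU thread can loop over exactly its row's nonzeros and skip the padding.

```python
def to_ellpack_r(dense):
    """Convert a dense matrix (list of lists) to ELLPACK-R arrays.

    Returns (val, col, rl): val and col are n_rows x max_len arrays,
    zero-padded to the longest row; rl[i] is the true nonzero count
    of row i, which lets the SpMM kernel skip the padding entries.
    """
    rows = [[(j, v) for j, v in enumerate(r) if v != 0] for r in dense]
    rl = [len(r) for r in rows]
    max_len = max(rl) if rl else 0
    val = [[v for _, v in r] + [0.0] * (max_len - len(r)) for r in rows]
    col = [[j for j, _ in r] + [0] * (max_len - len(r)) for r in rows]
    return val, col, rl

def spmm_ellpack_r(val, col, rl, B):
    """Sparse (ELLPACK-R) x dense product C = A * B.

    The per-row loop mirrors what a GPU thread would do: iterate only
    over rl[i] stored nonzeros, accumulating into every column of B.
    """
    n_rows, n_cols_b = len(val), len(B[0])
    C = [[0.0] * n_cols_b for _ in range(n_rows)]
    for i in range(n_rows):
        for k in range(rl[i]):  # true nonzeros only; padding is never read
            a, j = val[i][k], col[i][k]
            for c in range(n_cols_b):
                C[i][c] += a * B[j][c]
    return C
```

For example, the sparse matrix `[[1,0,2],[0,3,0],[4,0,0]]` yields row lengths `[2, 1, 1]` with the first row padded nowhere and the other two rows padded with one zero entry each.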