
Understanding the Impact of Input Entropy on FPU, CPU, and GPU Power

Sridutt Bhalachandra, Brian Austin, Samuel Williams, Nicholas J. Wright
Lawrence Berkeley National Laboratory, Berkeley, USA
arXiv:2212.08805 [cs.DC] (17 Dec 2022)

@misc{https://doi.org/10.48550/arxiv.2212.08805,
   doi={10.48550/ARXIV.2212.08805},
   url={https://arxiv.org/abs/2212.08805},
   author={Bhalachandra, Sridutt and Austin, Brian and Williams, Samuel and Wright, Nicholas J.},
   keywords={Distributed, Parallel, and Cluster Computing (cs.DC), Hardware Architecture (cs.AR), Performance (cs.PF), FOS: Computer and information sciences, C.1.2; C.1.4; C.4},
   title={Understanding the Impact of Input Entropy on FPU, CPU, and GPU Power},
   publisher={arXiv},
   year={2022},
   copyright={Creative Commons Attribution 4.0 International}
}

Power is increasingly becoming a limiting resource in high-performance, GPU-accelerated computing systems. Understanding the range and sources of power variation is essential for setting realistic bounds on rack and system peak power, and for developing techniques that minimize energy. While variations arising during manufacturing and from other factors, such as algorithm choice, have been studied previously, this work shows that program inputs can also severely impact the power consumed, not only on GPUs but also on CPUs. Power variations of up to 67% were observed on an NVIDIA Ampere A100 GPU for the same algorithm (the DGEMM benchmark) and input size with different matrix values. Our investigation shows that the values used as matrix elements, their position, and their uniqueness strongly influence power consumption. The implications of this result for supercomputer performance and energy efficiency are further discussed.
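To make the experimental setup concrete, the following is a minimal sketch (not the authors' harness) of how one might run the same DGEMM at the same size while varying only the entropy of the matrix values; the specific input choices (zeros, a single repeated constant, uniform random) are illustrative assumptions, and in a real measurement power would be sampled concurrently, e.g. via `nvidia-smi --query-gpu=power.draw` for GPU runs or RAPL counters for CPU runs.

```python
import time
import numpy as np

n = 2048
rng = np.random.default_rng(0)

# Same algorithm (float64 GEMM) and same input size; only the
# value entropy of the operands differs. These particular inputs
# are our own illustrative choices, not the paper's exact cases.
inputs = {
    "zeros":    np.zeros((n, n)),       # minimal entropy
    "constant": np.full((n, n), 0.5),   # one unique value, nonzero
    "random":   rng.random((n, n)),     # high entropy
}

for name, a in inputs.items():
    b = a.copy()
    t0 = time.perf_counter()
    c = a @ b  # dispatches to the underlying BLAS dgemm for float64
    dt = time.perf_counter() - t0
    # In a power study, a sampler thread would record draw during this window.
    print(f"{name:8s}  {dt:.3f} s")
```

Runtime alone does not expose the effect reported in the paper; the point of such a harness is that identical FLOP counts and comparable runtimes can still draw very different power depending on the operand values.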
