
Good things come in small packages: Should we adopt Lite-GPUs in AI infrastructure?

Burcu Canakci, Junyi Liu, Xingbo Wu, Nathanaël Cheriere, Paolo Costa, Sergey Legtchenko, Dushyanth Narayanan, Ant Rowstron
Microsoft Research
arXiv:2501.10187 [cs.AR], 17 Jan 2025

@misc{canakci2025goodthingscomesmall,
   title={Good things come in small packages: Should we adopt Lite-GPUs in AI infrastructure?},
   author={Burcu Canakci and Junyi Liu and Xingbo Wu and Nathanaël Cheriere and Paolo Costa and Sergey Legtchenko and Dushyanth Narayanan and Ant Rowstron},
   year={2025},
   eprint={2501.10187},
   archivePrefix={arXiv},
   primaryClass={cs.AR},
   url={https://arxiv.org/abs/2501.10187}
}


To meet the booming demand of generative AI workloads, GPU designers have so far been packing more and more compute and memory into single, complex, and expensive packages. However, there is growing uncertainty about the scalability of individual GPUs, and thus of AI clusters, as state-of-the-art GPUs already show packaging, yield, and cooling limitations. We propose to rethink the design and scaling of AI clusters around efficiently connected, large clusters of Lite-GPUs: GPUs with a single, small die and a fraction of the capabilities of larger GPUs. We believe recent advances in co-packaged optics can be key to overcoming the communication challenges of distributing AI workloads across more Lite-GPUs. In this paper, we present the key benefits of Lite-GPUs in terms of manufacturing cost, blast radius, yield, and power efficiency, and discuss systems opportunities and challenges around resource, workload, memory, and network management.
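
The yield argument in the abstract can be made concrete with the standard negative-binomial (clustered-defect) die-yield model. The sketch below is not from the paper; the die areas, defect density, clustering parameter, and wafer size are hypothetical values chosen only to illustrate why several small dies can deliver more usable silicon per wafer than one large die.

```python
# Illustrative only: standard negative-binomial die-yield model with an
# edge-loss approximation for gross dies per wafer. All numbers are
# hypothetical and are not taken from the paper.
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Approximate gross dies per wafer (area term minus edge-loss correction)."""
    r = wafer_diameter_mm / 2
    return (math.pi * r**2 / die_area_mm2
            - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def die_yield(die_area_mm2, defect_density_per_cm2=0.1, alpha=2.0):
    """Negative-binomial yield model: Y = (1 + A*D0/alpha)^(-alpha)."""
    area_cm2 = die_area_mm2 / 100.0
    return (1 + area_cm2 * defect_density_per_cm2 / alpha) ** (-alpha)

for label, area in [("large GPU die", 800.0), ("Lite-GPU die", 200.0)]:
    gross = dies_per_wafer(area)
    y = die_yield(area)
    print(f"{label:14s}: {area:5.0f} mm^2  yield={y:.2%}  good dies/wafer={gross * y:.0f}")
```

With these illustrative numbers, the 200 mm^2 die yields over 80% of candidates while the 800 mm^2 die yields roughly half, so a wafer of small dies produces substantially more good silicon per unit area; this is the kind of manufacturing-cost and yield benefit the abstract refers to.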
