
Optimizing Deep Learning Models For Raspberry Pi

Salem Ameen, Kangaranmulle Siriwardana, Theo Theodoridis
School of Science, Engineering and Environment, University of Salford, Manchester, United Kingdom
arXiv:2304.13039 [eess.SY], 25 Apr 2023

@misc{ameen2023optimizing,
   title={Optimizing Deep Learning Models For Raspberry Pi},
   author={Salem Ameen and Kangaranmulle Siriwardana and Theo Theodoridis},
   year={2023},
   eprint={2304.13039},
   archivePrefix={arXiv},
   primaryClass={eess.SY}
}

Deep learning models have become increasingly popular for a wide range of applications, including computer vision, natural language processing, and speech recognition. However, these models typically require large amounts of computational resources, making them challenging to run on low-power devices such as the Raspberry Pi. One approach to addressing this challenge is to use pruning techniques to reduce the size of the deep learning models. Pruning involves removing unimportant weights and connections from the model, resulting in a smaller and more efficient model. Pruning can be done during training or after the model has been trained. Another approach is to optimize the deep learning models specifically for the Raspberry Pi architecture. This can include optimizing the model’s architecture and parameters to take advantage of the Raspberry Pi’s hardware capabilities, such as its CPU and GPU. Additionally, the model can be optimized for energy efficiency by minimizing the amount of computation required. Pruning and optimizing deep learning models for the Raspberry Pi can help overcome the computational and energy constraints of low-power devices, making it possible to run deep learning models on a wider range of devices. In the following sections, we will explore these approaches in more detail and discuss their effectiveness for optimizing deep learning models for the Raspberry Pi.
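As a rough illustration of the two steps described above, the sketch below applies magnitude pruning to a trained model and then shrinks it further with dynamic int8 quantization before saving it for CPU inference on a device such as the Raspberry Pi. It uses PyTorch's pruning and quantization utilities; the toy model architecture, the 30% sparsity level, and the choice of dynamic quantization are illustrative assumptions, not details taken from the paper.

# Minimal sketch (PyTorch assumed): post-training magnitude pruning
# followed by dynamic quantization. All specifics are illustrative.
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

# Hypothetical small classifier standing in for an already-trained model.
model = nn.Sequential(
    nn.Linear(784, 256),
    nn.ReLU(),
    nn.Linear(256, 10),
)

# Prune the 30% of weights with the smallest L1 magnitude in each Linear layer.
for module in model.modules():
    if isinstance(module, nn.Linear):
        prune.l1_unstructured(module, name="weight", amount=0.3)
        prune.remove(module, "weight")  # make the pruning permanent

# Dynamic quantization stores Linear weights as int8, reducing model size and
# CPU inference cost, which matters on low-power hardware like the Raspberry Pi.
quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

# Save the compressed model for deployment on the target device.
torch.save(quantized.state_dict(), "model_pruned_quantized.pt")

In practice the pruned model would be fine-tuned for a few epochs to recover any lost accuracy before quantization and deployment; the paper evaluates how such compression choices trade accuracy against inference time and energy on the Raspberry Pi.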