A Survey of Convolutional Neural Networks on Edge with Reconfigurable Computing
Instituto Superior de Engenharia de Lisboa, Instituto Politécnico de Lisboa, 1500-335 Lisboa, Portugal
Algorithms, 12(8), 154, 2019
@article{a12080154,
  author = {Véstias, Mário P.},
  title = {A Survey of Convolutional Neural Networks on Edge with Reconfigurable Computing},
  journal = {Algorithms},
  volume = {12},
  year = {2019},
  number = {8},
  article-number = {154},
  url = {https://www.mdpi.com/1999-4893/12/8/154},
  issn = {1999-4893},
  doi = {10.3390/a12080154}
}
The convolutional neural network (CNN) is one of the most widely used deep learning models for image detection and classification, owing to its higher accuracy compared to other machine learning algorithms. This accuracy comes at the cost of higher computing and memory requirements, so CNN inference is usually run on centralized high-performance platforms. However, many CNN-based applications are migrating to edge devices close to the data source because of the unreliability of the transmission channel to a central server, channel latency that many applications cannot tolerate, and security and data privacy concerns. While advantageous, deep learning at the edge is challenging because edge devices are typically constrained in performance, cost, and energy. Reconfigurable computing is being considered for inference at the edge due to its high performance and energy efficiency, combined with the hardware flexibility to easily adapt the computing platform to the CNN model. In this paper, we describe the features of the most common CNNs, the capabilities of reconfigurable computing for running CNNs, the state of the art of reconfigurable computing implementations proposed to run CNN models, and the trends and challenges for future edge reconfigurable platforms.
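To make the "higher computing and memory requirements" point concrete, the sketch below estimates the multiply-accumulate (MAC) and weight counts of a single convolutional layer using the standard cost formulas; it is an illustration, not taken from the paper, and the layer shape in the example is an assumption chosen for demonstration.

```python
# Hypothetical illustration (not from the paper): rough compute and memory cost
# of one convolutional layer, using the standard MAC/parameter count formulas.

def conv_layer_cost(out_h, out_w, k, c_in, c_out):
    """Return (MAC operations, weight parameters) for one k x k conv layer."""
    macs = out_h * out_w * k * k * c_in * c_out   # one MAC per kernel tap per output element
    params = k * k * c_in * c_out                 # weights only (bias ignored)
    return macs, params

# Assumed example: a 3x3 convolution producing a 56x56x256 output from 256 input channels.
macs, params = conv_layer_cost(out_h=56, out_w=56, k=3, c_in=256, c_out=256)
print(f"MACs: {macs / 1e6:.1f} M, parameters: {params / 1e6:.2f} M")
# About 1850 M MACs and 0.59 M weights for this single layer, which is why edge
# inference relies on careful use of on-chip memory and parallel arithmetic units.
```

A full CNN stacks tens of such layers, so the totals quickly reach billions of operations per image, which motivates the reconfigurable-computing accelerators surveyed in the paper.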
August 5, 2019 by hgpu