
FPGA implementation of a Convolutional Neural Network for "Wake up word" detection

Ole Martin Skafsa
Norwegian University of Science and Technology, Department of Electronic Systems
Norwegian University of Science and Technology, 2017

@mastersthesis{skafsaa2017fpga,
  title={FPGA implementation of a Convolutional Neural Network for "Wake up word" detection},
  author={Skafs{\aa}, Ole Martin},
  year={2017},
  school={NTNU}
}


The popularity of machine learning has increased dramatically in recent years, with applications ranging from web search to speech recognition and object detection. A large part of this development is due to the use of Convolutional Neural Networks (CNNs), for which high-performance Graphics Processing Units (GPUs) have been the most popular devices. This thesis explores the use of a Field-Programmable Gate Array (FPGA), specifically an Arria 10 GX FPGA, to implement a "wake up word" CNN. The High-Level Synthesis (HLS) tool Intel FPGA SDK for OpenCL was used. During the project, various neural networks were implemented and tested on the FPGA with different attributes to understand their effect. An infrastructure for testing various neural networks was built and used to implement the wake up word CNN. A setup for testing the CNN with live audio recording was also made. The final implementation of the wake up word CNN achieved a classification time of 3.6 ms and a throughput of 0.54 Gmac/s, where a mac is a multiply-accumulate operation. Compared to a CNN running on an NVIDIA Tegra X1 GPU, the GPU was 22.2 times faster at 11.99 Gmac/s. Although the classification time of 3.6 ms is acceptable for this application, future work should attempt to keep as much of the computation and memory traffic as possible on the FPGA chip, with minimal interaction with the host machine, to improve performance.
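The performance figures in the abstract can be sanity-checked with a short back-of-the-envelope calculation. The sketch below (a hedged illustration using only the numbers reported above, not code from the thesis) reproduces the 22.2x speedup factor and estimates the implied mac count per classification:

```python
# Sanity-check of the reported FPGA-vs-GPU throughput ratio.
fpga_gmacs = 0.54    # Arria 10 GX FPGA throughput (Gmac/s), as reported
gpu_gmacs = 11.99    # NVIDIA Tegra X1 GPU throughput (Gmac/s), as reported

speedup = gpu_gmacs / fpga_gmacs
print(f"GPU speedup over FPGA: {speedup:.1f}x")

# The 3.6 ms classification latency together with the FPGA throughput
# implies roughly this many multiply-accumulates per inference
# (approximate; ignores non-mac work and transfer overheads):
latency_s = 3.6e-3
macs_per_inference = fpga_gmacs * 1e9 * latency_s
print(f"Approx. macs per classification: {macs_per_inference:.2e}")
```

Note that the ratio 11.99 / 0.54 is about 22.2, consistent with the speedup quoted in the abstract.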

HGPU group © 2010-2017 hgpu.org
