Towards On-Chip Optical FFTs for Convolutional Neural Networks

Jonathan George, Hani Nejadriahi, Volker Sorger
Department of Electrical and Computer Engineering, The George Washington University, Washington, D.C., USA
arXiv:1708.09534 [cs.ET] (31 Aug 2017)

@article{george2017towards,
   title={Towards On-Chip Optical FFTs for Convolutional Neural Networks},
   author={George, Jonathan and Nejadriahi, Hani and Sorger, Volker},
   year={2017},
   month={aug},
   eprint={1708.09534},
   archivePrefix={arXiv},
   primaryClass={cs.ET}
}


Convolutional neural networks have become an essential element of spatial deep learning systems. In the prevailing architecture, the convolution operation is performed electronically with Fast Fourier Transforms (FFTs) on GPUs. GPU parallelism provides an efficiency advantage over CPUs; however, both approaches, being electronic, are bound by the speed and power limits imposed by interconnect delay inside the circuits. Here we present a silicon-photonics-based architecture for convolutional neural networks that harnesses the phase property of light to perform FFTs efficiently. Our all-optical FFT is based on nested Mach-Zehnder interferometers, directional couplers, and phase shifters, with backend electro-optic modulators for sampling. The FFT delay depends only on the propagation delay of the optical signal through the silicon photonics structures. Designing and analyzing the performance of a convolutional neural network deployed with our on-chip optical FFT, we find dramatic improvements of up to 10^4 over state-of-the-art GPUs when exploring a compounded figure-of-merit given by power per convolution over area. At a high level, this performance is enabled by mapping the desired mathematical function, an FFT, synergistically onto hardware, in this case optical delay interferometers.
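The abstract's mapping of the FFT onto interferometer meshes follows the radix-2 butterfly structure of the Cooley-Tukey algorithm: each stage combines pairs of signals through a 2x2 coupling with a fixed phase, which is exactly what a directional coupler plus phase shifter implements. The sketch below is an illustrative software analogue of that decomposition, not the authors' design; the comments relating code steps to photonic components are interpretive.

```python
import cmath

def fft(x):
    """Radix-2 Cooley-Tukey FFT of a power-of-two-length sequence.

    Each recursion level corresponds to one stage of the nested
    interferometer mesh: the twiddle factor is a fixed phase shift,
    and the 2x2 butterfly plays the role of a directional coupler
    combining two waveguide signals.
    """
    n = len(x)
    if n == 1:
        return list(x)
    assert n % 2 == 0, "input length must be a power of two"
    even = fft(x[0::2])   # FFT of even-indexed samples
    odd = fft(x[1::2])    # FFT of odd-indexed samples
    out = [0j] * n
    for k in range(n // 2):
        # Twiddle factor: in the photonic analogue, a phase shifter
        # (or fixed optical delay) applies exp(-2*pi*i*k/n).
        w = cmath.exp(-2j * cmath.pi * k / n) * odd[k]
        # 2x2 butterfly: sum and difference of the two arms,
        # analogous to the two output ports of a coupler.
        out[k] = even[k] + w
        out[k + n // 2] = even[k] - w
    return out
```

Because every stage is a passive combination of delays and couplings, the total latency of the optical version is set only by propagation time through the mesh, as the abstract notes, rather than by clocked electronic interconnect.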

* * *

HGPU group © 2010-2024 hgpu.org

All rights belong to the respective authors