Accelerating Winograd Convolutions using Symbolic Computation and Meta-programming

Arya Mazaheri, Tim Beringer, Matthew Moskewicz, Felix Wolf, Ali Jannesari
Technical University of Darmstadt, Department of Computer Science, Germany
Proceedings of the Fifteenth European Conference on Computer Systems (EuroSys ’20), 2020


@inproceedings{mazaheri2020accelerating,
   title={Accelerating winograd convolutions using symbolic computation and meta-programming},
   author={Mazaheri, Arya and Beringer, Tim and Moskewicz, Matthew and Wolf, Felix and Jannesari, Ali},
   booktitle={Proceedings of the Fifteenth European Conference on Computer Systems},
   year={2020}
}



Convolution operations are essential constituents of convolutional neural networks. Their efficient and performance-portable implementation demands tremendous programming effort and fine-tuning. Winograd’s minimal filtering algorithm is a well-known method to reduce the computational complexity of convolution operations. Unfortunately, existing implementations of this algorithm are either vendor-specific or hard-coded to support a small subset of convolutions, thus limiting their versatility and performance portability. In this paper, we propose a novel method to optimize Winograd convolutions based on symbolic computation. Taking advantage of meta-programming and auto-tuning, we further introduce a system to automate the generation of efficient and portable Winograd convolution code for various GPUs. We show that our optimization technique can effectively exploit repetitive patterns, enabling us to reduce the number of arithmetic operations by up to 62% without compromising numerical stability. Moreover, we demonstrate in experiments that we can generate efficient kernels with runtimes close to deep-learning libraries, requiring only a minimum of programming effort, which confirms the performance portability of our approach.
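To make the abstract's premise concrete, the following is a minimal sketch of Winograd's minimal filtering algorithm for the smallest common case, F(2,3): producing two outputs of a 1-D convolution with a 3-tap filter using 4 multiplications instead of the 6 required by the direct method. The function names are illustrative, not taken from the paper's generated code.

```python
def winograd_f2_3(d, g):
    """F(2,3): 4-element input tile d, 3-tap filter g -> 2 outputs, 4 multiplies."""
    d0, d1, d2, d3 = d
    g0, g1, g2 = g
    # The filter-dependent factors (g0+g1+g2)/2 and (g0-g1+g2)/2 can be
    # precomputed once per filter, so only 4 multiplies remain per tile.
    m1 = (d0 - d2) * g0
    m2 = (d1 + d2) * (g0 + g1 + g2) / 2
    m3 = (d2 - d1) * (g0 - g1 + g2) / 2
    m4 = (d1 - d3) * g2
    # Inverse transform: combine the 4 products into the 2 outputs.
    return [m1 + m2 + m3, m2 - m3 - m4]

def direct_conv(d, g):
    """Reference: direct 1-D convolution (cross-correlation), 6 multiplies."""
    return [sum(d[i + k] * g[k] for k in range(3)) for i in range(2)]
```

Nesting this transform in two dimensions yields F(2x2, 3x3), which replaces the 36 multiplications of a direct 2x2-output, 3x3-filter convolution with 16. The paper's symbolic approach derives such transforms for arbitrary tile sizes and then simplifies the resulting expressions automatically, rather than hard-coding one variant.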

