
Monadic Deep Learning

Bo Yang, Zhihao Zhang, Kirisame Marisa, Kai Shi
ThoughtWorks, Inc.
arXiv:2307.12187 [cs.PL] (23 Jul 2023)

@misc{yang2023monadic,
   title={Monadic Deep Learning},
   author={Bo Yang and Zhihao Zhang and Kirisame Marisa and Kai Shi},
   year={2023},
   eprint={2307.12187},
   archivePrefix={arXiv},
   primaryClass={cs.PL}
}

The Java and Scala community has built a very successful big data ecosystem. However, most of the neural networks running on it are modeled in dynamically typed programming languages. These dynamically typed deep learning frameworks treat neural networks as differentiable expressions that contain many trainable variables, and perform automatic differentiation on those expressions when training them. Until 2019, none of the deep learning frameworks for statically typed languages provided the expressive power of their dynamically typed counterparts, so their users could not implement custom algorithms without writing plenty of boilerplate code for hard-coded back-propagation. We solved this problem in DeepLearning.scala 2. Our contributions are: 1. We discovered a novel approach to perform reverse-mode automatic differentiation for statically typed functions that contain multiple trainable variables and that can interoperate freely with the metalanguage. 2. We designed a set of monads and monad transformers that allow users to create monadic expressions representing dynamic neural networks. 3. Along with these monads, we provide applicative functors to perform multiple calculations in parallel. With these features, users of DeepLearning.scala were able to create complex neural networks in an intuitive and concise way while still maintaining type safety.
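To make the central idea concrete, below is a minimal, self-contained Scala sketch of tape-based reverse-mode automatic differentiation driven by a monadic expression, in the spirit of contribution 1. Every name in it (Tape, Dual, Diff, and the demo) is hypothetical and invented for this sketch; it is not the DeepLearning.scala API.

import scala.collection.mutable.ArrayBuffer

// A tape records, in forward execution order, how to push gradients backwards.
final class Tape {
  private val steps = ArrayBuffer.empty[() => Unit]
  def record(step: () => Unit): Unit = steps += step
  def backward(): Unit = steps.reverseIterator.foreach(_.apply())
}

// A differentiable scalar: a forward value plus an accumulated gradient.
final class Dual(val value: Double) { var grad: Double = 0.0 }

// A Reader-style monad over the tape: flatMap sequences the forward pass,
// while each primitive operation registers its own backward step.
final case class Diff[A](run: Tape => A) {
  def map[B](f: A => B): Diff[B] = Diff(tape => f(run(tape)))
  def flatMap[B](f: A => Diff[B]): Diff[B] = Diff(tape => f(run(tape)).run(tape))
}

object Diff {
  def pure[A](a: A): Diff[A] = Diff(_ => a)

  def add(a: Dual, b: Dual): Diff[Dual] = Diff { tape =>
    val out = new Dual(a.value + b.value)
    tape.record { () => a.grad += out.grad; b.grad += out.grad }
    out
  }

  def mul(a: Dual, b: Dual): Diff[Dual] = Diff { tape =>
    val out = new Dual(a.value * b.value)
    tape.record { () => a.grad += b.value * out.grad; b.grad += a.value * out.grad }
    out
  }
}

object Demo extends App {
  val x = new Dual(3.0) // trainable variable
  val y = new Dual(4.0) // trainable variable

  // loss = x * y + x, written as a monadic expression
  val program: Diff[Dual] = for {
    xy   <- Diff.mul(x, y)
    loss <- Diff.add(xy, x)
  } yield loss

  val tape = new Tape
  val loss = program.run(tape)
  loss.grad = 1.0 // seed dloss/dloss = 1
  tape.backward() // replay the recorded steps in reverse

  println(s"dloss/dx = ${x.grad}") // 5.0 (= y + 1)
  println(s"dloss/dy = ${y.grad}") // 3.0 (= x)
}

Because the expression is an ordinary Scala value built with flatMap, metalanguage control flow (if, match, loops) can freely decide which differentiable operations to sequence, which is what makes dynamic neural networks expressible; an applicative-style combinator over Diff could likewise evaluate independent sub-expressions in parallel, as contribution 3 describes for the real library.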