TheanoLM – An Extensible Toolkit for Neural Network Language Modeling

Seppo Enarvi, Mikko Kurimo
Aalto University, Finland
arXiv:1605.00942 [cs.CL], 3 May 2016

We present a new tool for training neural network language models (NNLMs), scoring sentences, and generating text. The tool has been written using the Python library Theano, which allows researchers to easily extend it and tune any aspect of the training process. Despite this flexibility, Theano is able to generate extremely fast native code that can utilize a GPU or multiple CPU cores to parallelize the heavy numerical computations. The tool has been evaluated on difficult Finnish and English conversational speech recognition tasks, and significant improvements were obtained over our best back-off n-gram models. The results we obtained on the Finnish task were compared to those from the existing RNNLM and RWTHLM toolkits and found to be as good as or better, while training times were an order of magnitude shorter.
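As a minimal sketch of what "scoring sentences" means in this setting: a language model assigns each word a conditional probability given its history, and the sentence score is the sum of the per-word log-probabilities (perplexity is the exponentiated negative mean). The function names below are illustrative, not TheanoLM's actual API.

```python
import math

def score_sentence(word_probs):
    """Total log-probability (base e) of a sentence, given the
    per-word conditional probabilities P(w_i | w_1 .. w_{i-1})
    produced by a language model."""
    return sum(math.log(p) for p in word_probs)

def perplexity(word_probs):
    """Perplexity: exponentiated negative mean log-probability.
    A uniform model over a 4-word vocabulary gives perplexity 4."""
    return math.exp(-score_sentence(word_probs) / len(word_probs))
```

A toolkit like the one described reports these quantities over a whole evaluation corpus rather than a single sentence, but the arithmetic is the same.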

