Recurrent neural networks for language modeling

Emil Sauer Lynge
Technical University of Denmark, Department of Applied Mathematics and Computer Science, Matematiktorvet, building 303B, 2800 Kongens Lyngby, Denmark
Kongens Lyngby, 2016

@article{lynge2016recurrent,
   title={Recurrent neural networks for language modeling},
   author={Lynge, Emil Sauer},
   year={2016}
}

The goal of the thesis is to explore the mechanisms and tools that enable efficient development of Recurrent Neural Networks, how to train them, and what they can accomplish with regard to character-level language modelling. Specifically, Gated Recurrent Units and Long Short-Term Memory are the focal point of the training and language modelling. The choice of data sets, hyperparameters and visualization methods aims to reproduce parts of [KJL15]. More broadly, the RNN as a concept is explored through computational graphs and backpropagation. Several concrete software tools written in Python 3 are developed as part of the project and discussed briefly in the thesis.
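As a rough illustration of the character-level recurrent setup the abstract describes, the sketch below implements a single GRU step in NumPy and runs it over a toy character sequence. The gate equations follow the standard GRU formulation; all names, sizes and the toy vocabulary are illustrative assumptions, not taken from the thesis code.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h, W, U, b):
    """One GRU step. W, U and b each hold stacked parameters for the
    update gate (z), reset gate (r) and candidate state (n)."""
    Wz, Wr, Wn = W
    Uz, Ur, Un = U
    bz, br, bn = b
    z = sigmoid(Wz @ x + Uz @ h + bz)        # update gate
    r = sigmoid(Wr @ x + Ur @ h + br)        # reset gate
    n = np.tanh(Wn @ x + Un @ (r * h) + bn)  # candidate hidden state
    return (1.0 - z) * h + z * n             # interpolated new state

# Toy character vocabulary and layer sizes (illustrative only).
vocab = "ab"
n_in, n_hid = len(vocab), 4
rng = np.random.default_rng(0)
W = [rng.normal(scale=0.1, size=(n_hid, n_in)) for _ in range(3)]
U = [rng.normal(scale=0.1, size=(n_hid, n_hid)) for _ in range(3)]
b = [np.zeros(n_hid) for _ in range(3)]

h = np.zeros(n_hid)
for ch in "abba":
    x = np.eye(n_in)[vocab.index(ch)]  # one-hot character input
    h = gru_step(x, h, W, U, b)
print(h.shape)
```

In a full character-level language model this hidden state would feed a softmax layer predicting the next character, and the parameters would be trained with backpropagation through time over the unrolled computational graph.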

