fairseq: A Fast, Extensible Toolkit for Sequence Modeling
Facebook AI Research
arXiv:1904.01038 [cs.CL] (1 Apr 2019)
@misc{ott2019fairseq,
  title={fairseq: A Fast, Extensible Toolkit for Sequence Modeling},
  author={Myle Ott and Sergey Edunov and Alexei Baevski and Angela Fan and Sam Gross and Nathan Ng and David Grangier and Michael Auli},
  year={2019},
  eprint={1904.01038},
  archivePrefix={arXiv},
  primaryClass={cs.CL}
}
fairseq is an open-source sequence modeling toolkit that allows researchers and developers to train custom models for translation, summarization, language modeling, and other text generation tasks. The toolkit is based on PyTorch and supports distributed training across multiple GPUs and machines. We also support fast mixed-precision training and inference on modern GPUs.
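As a rough illustration of how the toolkit is used in practice, the sketch below loads a pretrained fairseq translation model through torch.hub and runs inference, optionally on a GPU. This is not taken from the paper: the torch.hub integration appears in later fairseq releases, and the checkpoint name transformer.wmt16.en-de and the tokenizer/BPE settings follow the examples in the fairseq repository and may differ between versions.

# Minimal usage sketch (assumptions noted above): load a pretrained
# English-to-German Transformer exposed by fairseq via torch.hub and
# translate one sentence.
import torch

en2de = torch.hub.load('pytorch/fairseq', 'transformer.wmt16.en-de',
                       tokenizer='moses', bpe='subword_nmt')
en2de.eval()  # disable dropout for inference

# Move the model to a GPU if one is available; mixed-precision training
# itself is enabled at training time (e.g. via the --fp16 flag of
# fairseq-train) rather than here.
if torch.cuda.is_available():
    en2de.cuda()

print(en2de.translate('Hello world!'))  # e.g. 'Hallo Welt!'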
April 7, 2019 by hgpu