fairseq-lua
Facebook AI Research Sequence-to-Sequence Toolkit
...The framework implements sequence-to-sequence models with attention, beam search decoding, and distributed training, providing a research platform for translation, summarization, and language modeling. Its modular design makes it straightforward to prototype new architectures by swapping or modifying encoders, decoders, and attention mechanisms. Although now deprecated in favor of the PyTorch rewrite of fairseq, fairseq-lua played a key role in advancing large-scale neural machine translation (NMT) systems, such as early versions of Facebook’s production translation models, and it remains an important historical reference for neural sequence learning frameworks.
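
As a rough illustration of the beam search decoding mentioned above, the sketch below shows the core top-k expansion loop in framework-agnostic Python. It is not fairseq-lua's actual implementation (which is written in Lua/Torch); `step_fn`, `bos`, and `eos` are hypothetical stand-ins for a trained decoder's next-token scorer and its special tokens.

```python
def beam_search(step_fn, bos, eos, beam_size=5, max_len=20):
    """Generic beam search over a next-token scoring function.

    step_fn(prefix) must return a list of (token, log_prob) pairs for the
    next position given the current prefix; it is a hypothetical stand-in
    for an encoder-decoder model with attention.
    """
    beams = [([bos], 0.0)]   # live hypotheses: (tokens, cumulative log-prob)
    finished = []            # hypotheses that have emitted eos

    for _ in range(max_len):
        candidates = []
        for tokens, score in beams:
            for tok, logp in step_fn(tokens):
                candidates.append((tokens + [tok], score + logp))
        if not candidates:
            break
        # Keep the top-k partial hypotheses by cumulative log-probability.
        candidates.sort(key=lambda c: c[1], reverse=True)
        beams = []
        for tokens, score in candidates[:beam_size]:
            if tokens[-1] == eos:
                finished.append((tokens, score))
            else:
                beams.append((tokens, score))
        if not beams:
            break

    # Fall back to unfinished hypotheses if nothing reached eos in time,
    # and length-normalize so shorter outputs are not unfairly favored.
    pool = finished or beams
    return max(pool, key=lambda c: c[1] / max(len(c[0]) - 1, 1))


if __name__ == "__main__":
    import math

    # Toy next-token distribution standing in for a trained decoder
    # (purely illustrative, not part of any fairseq API).
    table = {
        "<s>": [("the", 0.6), ("a", 0.4)],
        "the": [("cat", 0.5), ("dog", 0.3), ("</s>", 0.2)],
        "a":   [("dog", 0.7), ("</s>", 0.3)],
        "cat": [("</s>", 1.0)],
        "dog": [("</s>", 1.0)],
    }

    def step_fn(prefix):
        return [(tok, math.log(p)) for tok, p in table[prefix[-1]]]

    tokens, score = beam_search(step_fn, bos="<s>", eos="</s>", beam_size=3)
    print(" ".join(tokens), score)
```

Real decoders batch the hypotheses for GPU execution and add further scoring heuristics, but the core top-k expansion over cumulative log-probabilities is the same.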