Megatron
Ongoing research training transformer models at scale
Megatron is a large, powerful transformer developed by the Applied Deep Learning Research team at NVIDIA. This repository is for ongoing research on training large transformer language models at scale. We developed efficient, model-parallel (tensor, sequence, and pipeline) and multi-node pre-training of transformer-based models such as GPT, BERT, and T5 using mixed precision.
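As a rough sketch, a single-node pre-training run that combines these forms of parallelism with mixed precision might be launched as shown below. The model sizes, batch sizes, and data paths are illustrative placeholders, not a tested configuration; consult the usage documentation for recommended settings.

```bash
# Illustrative launch of GPT pre-training on 8 GPUs:
# 2-way tensor parallelism x 2-way pipeline parallelism, with the
# remaining factor of 2 used for data parallelism. Sequence parallelism
# splits activation work along the sequence dimension within each
# tensor-parallel group. All paths and hyperparameters are placeholders.
torchrun --nproc_per_node=8 pretrain_gpt.py \
    --tensor-model-parallel-size 2 \
    --pipeline-model-parallel-size 2 \
    --sequence-parallel \
    --num-layers 24 \
    --hidden-size 1024 \
    --num-attention-heads 16 \
    --seq-length 1024 \
    --max-position-embeddings 1024 \
    --micro-batch-size 4 \
    --global-batch-size 64 \
    --train-iters 500000 \
    --lr 0.00015 \
    --fp16 \
    --data-path my-gpt2_text_document \
    --vocab-file gpt2-vocab.json \
    --merge-file gpt2-merges.txt
```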
Megatron is also used in NeMo Megatron, a framework to help enterprises overcome the challenges of building and training sophisticated natural language processing models with billions and trillions of parameters.