t5-base
Flexible text-to-text transformer model for multilingual NLP tasks
...With 220 million parameters, it can handle a wide range of tasks, including translation, summarization, question answering, and classification. Unlike encoder-only models such as BERT, which output class labels or token spans, T5 casts every task as text generation and always produces a text string. It was pretrained on the C4 dataset with an unsupervised denoising objective, and additionally trained on a variety of supervised NLP benchmarks. The model covers several languages, including English, French, Romanian, and German. Its uniform text-in, text-out format simplifies model reuse and transfer learning across different NLP tasks. ...