T5-Small is the 60-million-parameter variant of the Text-To-Text Transfer Transformer (T5), developed by researchers at Google. T5 reframes every NLP task, including translation, summarization, classification, and question answering, as a text-to-text problem: the model takes a plain text string as input and produces a plain text string as output. Its compact size makes T5-Small well suited to fast inference, prototyping, and deployment in constrained environments.

The model was pretrained on the C4 (Colossal Clean Crawled Corpus) dataset with a mixture of an unsupervised denoising objective and supervised text-to-text tasks such as sentiment analysis, natural language inference, and question answering. Despite its size, it performs competitively across the 24 tasks reported in the T5 paper, making it a strong starting point for fine-tuning. T5-Small runs under major deep learning frameworks, including PyTorch, TensorFlow, JAX, and ONNX, is released under the Apache 2.0 license, and is widely supported across Hugging Face's ecosystem.
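Because every task shares the same text-to-text interface, inference only requires prepending a task prefix to the input string. A minimal sketch using Hugging Face's `transformers` library; the prefix and example sentence follow the conventions from the T5 documentation, and the exact generated text may vary by library version:

```python
from transformers import T5Tokenizer, T5ForConditionalGeneration

tokenizer = T5Tokenizer.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small")

# Tasks are selected purely by the text prefix; there are no task-specific heads.
inputs = tokenizer(
    "translate English to German: The house is wonderful.",
    return_tensors="pt",
)
outputs = model.generate(**inputs, max_new_tokens=40)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
# Expected output (may vary slightly across versions): "Das Haus ist wunderbar."
```

The same model and checkpoint handle summarization (`summarize: ...`), classification, and QA simply by changing the prefix and input text.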
Features
- Text-to-text format for all NLP tasks
- Supports translation, summarization, QA, and classification
- Trained on C4 with diverse supervised datasets
- 60M parameters for faster and lightweight inference
- Multi-framework compatibility (PyTorch, TF, JAX, ONNX)
- Pretrained with both denoising and supervised objectives
- Easily fine-tuned for downstream tasks (see the sketch after this list)
- Licensed under Apache 2.0 for flexible use
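Because targets are also plain strings, fine-tuning reduces to standard sequence-to-sequence training on (input text, target text) pairs. A minimal single-step sketch in PyTorch; the example pair, prefix, and learning rate are illustrative assumptions, and a real run would iterate over batches from a full dataset:

```python
import torch
from transformers import T5Tokenizer, T5ForConditionalGeneration

tokenizer = T5Tokenizer.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small")
optimizer = torch.optim.AdamW(model.parameters(), lr=3e-4)  # assumed learning rate

# In the text-to-text format, the target is just another string;
# this input/target pair is illustrative, not from a real dataset.
inputs = tokenizer(
    "summarize: studies have shown that owning a dog is good for you",
    return_tensors="pt",
)
labels = tokenizer(
    "owning a dog is good for you",
    return_tensors="pt",
).input_ids

model.train()
loss = model(**inputs, labels=labels).loss  # standard seq2seq cross-entropy
loss.backward()
optimizer.step()
```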