Improved DeBERTa model with ELECTRA-style pretraining
Multilingual sentence embeddings for search and similarity tasks
Transformer model for image classification with patch-based input
RoBERTa model for English sentiment analysis on Twitter data
Protein language model trained for sequence understanding and downstream tasks
Transformer model trained to efficiently detect replaced (fake) vs. original tokens
Zero-shot image-text model for classification and similarity tasks
Compact, efficient model for sentence embeddings and semantic search
GPT-2 is a 124M parameter English language model for text generation
Custom BLEURT model for evaluating text similarity using PyTorch
Robust BERT-based model for English with improved MLM training
Flexible text-to-text transformer model for multilingual NLP tasks
BERT-based Chinese language model for fill-mask and NLP tasks
Multimodal Transformer for document image understanding and layout
ClinicalBERT model trained on MIMIC notes for clinical NLP tasks
T5-Small: Lightweight text-to-text transformer for NLP tasks
Lightweight sentence embedding model for semantic search
Large multilingual RoBERTa model trained on 100 languages
English BERT model using cased text for sentence-level tasks
Compact GPT-style language model for open text generation and research
Large MLM-based English model, an optimized variant of the BERT architecture
Zero-shot image-text classification with ViT-B/32 encoder
BERT-base-uncased is a foundational English model for NLP tasks
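
If the entries above are stored as plain strings, they can be filtered programmatically. A minimal sketch with a hypothetical `find_models` helper (simple case-insensitive substring matching, not semantic search; the catalog is abridged from the list above):

```python
# Abridged transcription of the model descriptions listed above.
CATALOG = [
    "Improved DeBERTa model with ELECTRA-style pretraining",
    "Multilingual sentence embeddings for search and similarity tasks",
    "RoBERTa model for English sentiment analysis on Twitter data",
    "Zero-shot image-text model for classification and similarity tasks",
    "Compact, efficient model for sentence embeddings and semantic search",
    # ... remaining entries omitted for brevity
]

def find_models(keyword: str) -> list[str]:
    """Return every description containing the keyword, case-insensitively."""
    needle = keyword.lower()
    return [desc for desc in CATALOG if needle in desc.lower()]

# Example: find_models("sentence") matches the two sentence-embedding entries.
```

For fuzzier retrieval over a larger catalog, the same interface could be backed by one of the sentence-embedding models described above instead of substring matching.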