Zero-shot image-text model for classification and similarity tasks
Compact, efficient model for sentence embeddings and semantic search
GPT-2 is a 124M-parameter English language model for text generation
Custom BLEURT model for evaluating text similarity using PyTorch
Robust BERT-based model for English with improved MLM training
Flexible text-to-text transformer model for multilingual NLP tasks
BERT-based Chinese language model for fill-mask and NLP tasks
Multimodal Transformer for document image understanding and layout
ClinicalBERT model trained on MIMIC notes for clinical NLP tasks
T5-Small: Lightweight text-to-text transformer for NLP tasks
Lightweight sentence embedding model for semantic search
Large multilingual RoBERTa model trained on 100 languages
English BERT model using cased text for sentence-level tasks
Compact GPT-style language model for open text generation and research
Large MLM-based English model optimized from BERT architecture
Zero-shot image-text classification with ViT-B/32 encoder
BERT-base-uncased is a foundational English model for NLP tasks
A Python library to create and deploy cross-platform native context
A low-code unified framework for computer vision and deep learning
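Several of the entries above describe sentence-embedding models for semantic search. The retrieval step they all enable works the same way: embed the query and the documents, then rank documents by cosine similarity to the query. A minimal sketch of that ranking step, using small hand-made vectors in place of real model embeddings (a real checkpoint would produce vectors with hundreds of dimensions):

```python
import math

def cosine(u, v):
    # Cosine similarity: dot product divided by the product of L2 norms.
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def rank(query_vec, doc_vecs):
    # Return (index, score) pairs sorted by similarity to the query, best first.
    scores = [(i, cosine(query_vec, v)) for i, v in enumerate(doc_vecs)]
    return sorted(scores, key=lambda s: s[1], reverse=True)

# Toy 3-d "embeddings"; these stand in for the output of an embedding model.
docs = [
    [0.9, 0.1, 0.0],   # doc 0: nearly aligned with the query direction
    [0.0, 1.0, 0.0],   # doc 1: roughly orthogonal to the query
    [0.7, 0.6, 0.1],   # doc 2: in between
]
query = [1.0, 0.0, 0.0]

ranking = rank(query, docs)
print([i for i, _ in ranking])  # → [0, 2, 1]
```

The compact models listed above exist precisely because this ranking step is cheap once embeddings are computed; the expensive part is the embedding pass itself, so smaller encoders trade a little accuracy for much faster indexing.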