Large multilingual RoBERTa model trained on 100 languages
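If this entry refers to xlm-roberta-large (an assumption; the entry does not name a checkpoint), a minimal sketch of multilingual masked-word prediction with the Hugging Face transformers pipeline could look like this:

```python
# Minimal sketch, assuming the entry describes xlm-roberta-large
# (the checkpoint name is an assumption, not stated in the entry).
from transformers import pipeline

# Masked-language-model pipeline; XLM-R uses <mask> as its mask token.
fill_mask = pipeline("fill-mask", model="xlm-roberta-large")

# The same checkpoint handles many languages without switching models.
print(fill_mask("Paris is the <mask> of France.")[0]["token_str"])
print(fill_mask("Paris est la <mask> de la France.")[0]["token_str"])
```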
English BERT model pretrained on cased text for sentence-level tasks
Compact GPT-style language model for open text generation and research
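Assuming the entry refers to a small GPT-2 variant such as distilgpt2 (a guess; the checkpoint is not named), a minimal open-ended text-generation sketch:

```python
# Minimal sketch, assuming a compact GPT-2 checkpoint such as distilgpt2
# (the exact checkpoint is an assumption, not stated in the entry).
from transformers import pipeline

generator = pipeline("text-generation", model="distilgpt2")

# Open-ended generation from a short prompt.
out = generator("The future of open research is", max_new_tokens=30, do_sample=True)
print(out[0]["generated_text"])
```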
Large English masked language model built on an optimized BERT architecture
Zero-shot image-text classification with a ViT-B/32 image encoder
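The ViT-B/32 encoder suggests a CLIP-style model; assuming the openai/clip-vit-base-patch32 checkpoint (an assumption, not stated in the entry), a minimal sketch scores an image against free-text labels without any task-specific training:

```python
# Minimal sketch of zero-shot image-text classification, assuming a CLIP
# checkpoint with a ViT-B/32 image encoder (openai/clip-vit-base-patch32
# is an assumption, not stated in the entry).
from PIL import Image
import requests
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

image = Image.open(requests.get(
    "http://images.cocodataset.org/val2017/000000039769.jpg", stream=True).raw)
labels = ["a photo of a cat", "a photo of a dog"]

# Score the image against each candidate caption.
inputs = processor(text=labels, images=image, return_tensors="pt", padding=True)
probs = model(**inputs).logits_per_image.softmax(dim=1)
print(dict(zip(labels, probs[0].tolist())))
```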
BERT-base-uncased is a foundational English model for NLP tasks
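A minimal sketch of using bert-base-uncased as a general-purpose text encoder; the entry covers NLP tasks generically, so the task-specific head (classification, NER, QA) that would normally be fine-tuned on top is left out:

```python
# Minimal sketch: bert-base-uncased as an encoder producing contextual vectors.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

inputs = tokenizer("BERT encodes text into contextual vectors.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# One 768-dimensional vector per input token.
print(outputs.last_hidden_state.shape)
```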
A Python library to create and deploy cross-platform native applications
A low-code, unified framework for computer vision and deep learning