MedicalGPT: Training Your Own Medical GPT Model with ChatGPT Training Pipeline
ERNIE 4.5 MoE model in FP8 for efficient high-performance inference
Small, high-performing language model for QA, chat, and code tasks
Open-weight, large-scale hybrid-attention reasoning model
Extension for Stable Diffusion that conditions generation on edge maps, depth, pose, and more
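The entry above refers to a Stable Diffusion extension for conditioned generation (ControlNet-style control). As a minimal sketch of the same edge-conditioned workflow, here is a standalone example using the diffusers library's ControlNet support; the checkpoint IDs and the conditioning-image URL are illustrative placeholders, not part of the original listing:

```python
# Sketch: edge-conditioned Stable Diffusion generation via diffusers' ControlNet API.
import torch
from diffusers import ControlNetModel, StableDiffusionControlNetPipeline
from diffusers.utils import load_image

# Canny-edge ControlNet weights; checkpoint names here are illustrative choices.
controlnet = ControlNetModel.from_pretrained(
    "lllyasviel/sd-controlnet-canny", torch_dtype=torch.float16
)
pipe = StableDiffusionControlNetPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    controlnet=controlnet,
    torch_dtype=torch.float16,
).to("cuda")

# Conditioning input: an edge map that guides the generated composition.
edge_map = load_image("https://example.com/edges.png")  # placeholder URL

image = pipe("a futuristic city at dusk", image=edge_map).images[0]
image.save("controlnet_out.png")
```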
GPT-2 is a 124M parameter English language model for text generation
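For the GPT-2 entry, a minimal usage sketch with the Hugging Face transformers pipeline (the "gpt2" hub ID is the standard 124M checkpoint; sampling parameters are illustrative):

```python
# Sketch: text generation with the 124M-parameter GPT-2 checkpoint.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")
outputs = generator("The quick brown fox", max_new_tokens=30, num_return_sequences=1)
print(outputs[0]["generated_text"])
```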