Open-source, high-performance AI model with advanced reasoning
Powerful AI language model (MoE) optimized for efficiency and performance
Contexts Optical Compression
Mixture-of-Experts Vision-Language Models for Advanced Multimodal Understanding
DeepSeek Coder: Let the Code Write Itself
Pushing the Limits of Mathematical Reasoning in Open Language Models
Towards Real-World Vision-Language Understanding
An experimental version of the DeepSeek model
Visual Causal Flow
Towards Ultimate Expert Specialization in Mixture-of-Experts Language Models
Agentic, Reasoning, and Coding (ARC) foundation models
Unified Multimodal Understanding and Generation Models
High-efficiency reasoning and agentic intelligence model
High-compute ultra-reasoning model surpassing GPT-5