A powerful Mixture-of-Experts (MoE) language model optimized for efficiency and performance
An open-source, high-performance AI model with advanced reasoning capabilities
Mixture-of-Experts Vision-Language Models for Advanced Multimodal Understanding
Strong, Economical, and Efficient Mixture-of-Experts Language Model
DeepSeek Coder: Let the Code Write Itself
Contexts Optical Compression
Towards Real-World Vision-Language Understanding
DeepSeek-Coder-V2: Breaking the Barrier of Closed-Source Models
Pushing the Limits of Mathematical Reasoning in Open Language Models
Visual Causal Flow
An experimental version of the DeepSeek model
Advancing Formal Mathematical Reasoning via Reinforcement Learning
From Vibe Coding to Agentic Engineering
DeepSeek LLM: Let there be answers
An AI-powered research assistant that performs iterative, deep research
Unleash Next-Level AI
A bidirectional pipeline parallelism algorithm
Alibaba's high-performance LLM inference engine for diverse applications
Analysis of computation-communication overlap in DeepSeek-V3/R1
A high-performance distributed file system
Towards Ultimate Expert Specialization in Mixture-of-Experts Language Models
An automatic code review tool for GitLab based on large language models
Agentic, Reasoning, and Coding (ARC) foundation models
Run local LLMs such as Llama, DeepSeek, Kokoro, etc. inside your browser
An automated translation solution for visual novels