1.5M Downloads · Updated 1 year ago
ollama run deepseek-coder-v2:236b-base-q4_0
2dc89d24571b · 133GB
DeepSeek-Coder-V2 is an open-source Mixture-of-Experts (MoE) code language model that achieves performance comparable to GPT4-Turbo on code-specific tasks. It is further pre-trained from an intermediate checkpoint of DeepSeek-V2 with an additional 6 trillion tokens drawn from a high-quality, multi-source corpus.
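Beyond the interactive `ollama run` command above, the model can also be queried over Ollama's local REST API. A minimal sketch, assuming the Ollama server is running on its default port (11434) and the tag above has already been pulled; the prompt text is only an illustration:

curl http://localhost:11434/api/generate -d '{
  "model": "deepseek-coder-v2:236b-base-q4_0",
  "prompt": "# Python function that returns the n-th Fibonacci number\n",
  "stream": false
}'

Since this tag is a base (non-instruct) model, completion-style prompts like the one above generally work better than chat-style instructions.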