DeepSeek-Coder-V2
Other
Code
Paid
DeepSeek-Coder-V2 is an open-source Mixture-of-Experts (MoE) code language model. It is further pre-trained from an intermediate checkpoint of DeepSeek-V2 with an additional 6 trillion tokens. The original V1 model (DeepSeek-Coder) was trained from scratch on 2T tokens, with a composition of 87% code and 13% natural language in both English and Chinese, and was pre-trained on a project-level code corpus with an additional fill-in-the-blank task.
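The fill-in-the-blank (fill-in-the-middle, FIM) training objective means the model can complete a gap between a given prefix and suffix rather than only continuing text left-to-right. The sketch below is illustrative only: the sentinel strings are placeholders, not DeepSeek's actual special tokens, and the prompt would still need to be sent to whatever inference endpoint you use.

```python
# Illustrative sketch of a fill-in-the-middle (FIM) prompt layout.
# The sentinel strings below are placeholders; real FIM models define
# their own special tokens, which differ between model families.
FIM_PREFIX = "<fim_prefix>"   # placeholder, not DeepSeek's actual token
FIM_SUFFIX = "<fim_suffix>"   # placeholder
FIM_MIDDLE = "<fim_middle>"   # placeholder

def build_fim_prompt(prefix: str, suffix: str) -> str:
    """Arrange the code before and after the gap so the model fills the middle."""
    return f"{FIM_PREFIX}{prefix}{FIM_SUFFIX}{suffix}{FIM_MIDDLE}"

prompt = build_fim_prompt(
    prefix="def circle_area(radius):\n    return ",
    suffix="  # area of a circle\n",
)
# The generated completion is the code that belongs in the gap.
print(prompt)
```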
Parameters
-
Context Window
128,000
tokens
Input Price
$0.04
per 1M tokens
Output Price
$0.12
per 1M tokens
Capabilities
Model capabilities and supported modalities
Performance
Reasoning
-
Math
-
Coding
Specialized in code generation with excellent programming capabilities
Knowledge
-
Modalities
Input Modalities
text
Output Modalities
text
LLM Price Calculator
Calculate the cost of using this model
Input Cost: $0.000060
Output Cost: $0.000360
Total Cost: $0.000420
Estimated usage: 4,500 tokens
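As a sketch of the arithmetic behind the calculator, the per-request cost is each token count times its per-million-token rate. The 4,500-token example is consistent with roughly 1,500 input and 3,000 output tokens; the function name below is just for illustration.

```python
INPUT_PRICE_PER_M = 0.04    # USD per 1M input tokens
OUTPUT_PRICE_PER_M = 0.12   # USD per 1M output tokens

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Cost of one request at the listed DeepSeek-Coder-V2 rates."""
    input_cost = input_tokens / 1_000_000 * INPUT_PRICE_PER_M
    output_cost = output_tokens / 1_000_000 * OUTPUT_PRICE_PER_M
    return input_cost + output_cost

# 1,500 input + 3,000 output tokens reproduces the calculator's figures:
# $0.000060 input + $0.000360 output = $0.000420 total.
print(f"${request_cost(1_500, 3_000):.6f}")  # $0.000420
```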
Monthly Cost Estimator
Based on different usage levels
Light Usage
$0.0016
~10 requests
Moderate Usage
$0.0160
~100 requests
Heavy Usage
$0.1600
~1000 requests
Enterprise
$1.6000
~10,000 requests
Note: Estimates based on current token count settings per request.
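The monthly tiers scale a single per-request cost by request volume. The figures above imply roughly $0.00016 per request (consistent with, for example, 1,000 input and 1,000 output tokens at the listed rates); a minimal sketch of that scaling, assuming those token settings:

```python
def monthly_cost(cost_per_request: float, requests_per_month: int) -> float:
    """Scale one request's cost by monthly request volume."""
    return cost_per_request * requests_per_month

# Assumed per-request cost of $0.00016 (e.g. 1,000 input + 1,000 output tokens).
for label, requests in [("Light", 10), ("Moderate", 100),
                        ("Heavy", 1_000), ("Enterprise", 10_000)]:
    print(f"{label}: ${monthly_cost(0.00016, requests):.4f}")
# Prints $0.0016, $0.0160, $0.1600, and $1.6000, matching the tiers above.
```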
Last Updated: 2025/05/06