Mistral: Ministral 8B
Mistral
Text
Paid
Ministral 8B is an 8B parameter model featuring a unique interleaved sliding-window attention pattern for faster, memory-efficient inference. Designed for edge use cases, it supports up to 128k context length and excels in knowledge and reasoning tasks. It outperforms peers in the sub-10B category, making it perfect for low-latency, privacy-first applications.
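The interleaved sliding-window attention mentioned above restricts each token, on some layers, to attend only to a fixed-size window of recent tokens, which bounds attention memory and speeds up long-context inference. Below is a minimal illustrative sketch of a sliding-window causal mask; the window size and layer interleaving here are assumptions for illustration, not Ministral 8B's published configuration.

```python
import numpy as np

def sliding_window_causal_mask(seq_len: int, window: int) -> np.ndarray:
    """Boolean mask: query position i may attend to key positions j
    with i - window < j <= i.

    Illustrative only; the real window size and which layers use it
    (the "interleaved" pattern) are model-specific details not shown here.
    """
    i = np.arange(seq_len)[:, None]   # query positions
    j = np.arange(seq_len)[None, :]   # key positions
    return (j <= i) & (j > i - window)

# Example: with a window of 4, token 10 attends only to tokens 7..10,
# so per-token attention cost stays constant instead of growing with length.
mask = sliding_window_causal_mask(seq_len=16, window=4)
print(mask.sum(axis=1))  # each row attends to at most `window` positions
```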
Parameters
8B
Context Window
128,000
tokens
Input Price
$0.10
per 1M tokens
Output Price
$0.10
per 1M tokens
Capabilities
Model capabilities and supported modalities
Performance
Reasoning
Excellent reasoning capabilities with strong logical analysis
Math
-
Coding
Capable of generating functional code with good practices
Knowledge
Extensive knowledge base with broad coverage of topics
Modalities
Input Modalities
text
Output Modalities
text
LLM Price Calculator
Calculate the cost of using this model
Input Cost: $0.000150
Output Cost: $0.000300
Total Cost: $0.000450
Estimated usage: 4,500 tokens
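The calculator figures follow directly from the per-million-token prices: at $0.10 per 1M tokens for both input and output, the displayed costs correspond to 1,500 input tokens and 3,000 output tokens (4,500 total). A short sketch of that arithmetic; the 1,500/3,000 split is an assumption inferred from the displayed costs, not a stated page setting.

```python
# Per-million-token prices from the pricing section above.
INPUT_PRICE_PER_M = 0.10   # USD per 1M input tokens
OUTPUT_PRICE_PER_M = 0.10  # USD per 1M output tokens

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Cost in USD for a single request at the listed prices."""
    return (input_tokens * INPUT_PRICE_PER_M
            + output_tokens * OUTPUT_PRICE_PER_M) / 1_000_000

# Assumed token split (1,500 in / 3,000 out) reproduces the calculator's totals.
print(f"{request_cost(1_500, 3_000):.6f}")  # 0.000450
```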
Monthly Cost Estimator
Based on different usage levels
Light Usage
$0.0020
~10 requests
Moderate Usage
$0.0200
~100 requests
Heavy Usage
$0.2000
~1000 requests
Enterprise
$2.0000
~10,000 requests
Note: Estimates are based on the current per-request token count settings.
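The tier totals scale linearly with request count and are consistent with a per-request cost of $0.0002, i.e. roughly 2,000 tokens per request at $0.10 per 1M tokens. A hedged sketch of the monthly estimate, with that per-request token count as an assumption (the page's actual default setting may differ):

```python
# Assumed ~2,000 tokens per request; this reproduces the tier figures above.
COST_PER_REQUEST = 2_000 * 0.10 / 1_000_000  # = $0.0002 per request

for label, requests in [("Light", 10), ("Moderate", 100),
                        ("Heavy", 1_000), ("Enterprise", 10_000)]:
    print(f"{label}: ${requests * COST_PER_REQUEST:.4f}")
# Light: $0.0020, Moderate: $0.0200, Heavy: $0.2000, Enterprise: $2.0000
```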
Last Updated: 2025/05/06