Mistral: Mixtral 8x22B Instruct
Mistral's official instruct fine-tuned version of [Mixtral 8x22B](/models/mistralai/mixtral-8x22b). It uses 39B active parameters out of 141B total, offering unparalleled cost efficiency for its size. Its strengths include:

- strong math, coding, and reasoning
- large context length (64k tokens)
- fluency in English, French, Italian, German, and Spanish

See benchmarks in the launch announcement [here](https://mistral.ai/news/mixtral-8x22b/). #moe
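A minimal sketch of querying the model through an OpenAI-compatible chat completions endpoint. The base URL, API key variable, and the model identifier `mistralai/mixtral-8x22b-instruct` are assumptions inferred from the model slug above; check the provider's documentation for the exact values.

```python
# Sketch: calling the model via an OpenAI-compatible chat completions API.
# The endpoint, env var name, and model ID below are assumptions, not values from this page.
import os
import requests

API_BASE = "https://api.example.com/v1"          # hypothetical endpoint
MODEL_ID = "mistralai/mixtral-8x22b-instruct"    # assumed model identifier

response = requests.post(
    f"{API_BASE}/chat/completions",
    headers={"Authorization": f"Bearer {os.environ['API_KEY']}"},
    json={
        "model": MODEL_ID,
        "messages": [
            {"role": "user", "content": "Summarize the Mixtral 8x22B architecture in one sentence."}
        ],
        "max_tokens": 256,
    },
    timeout=60,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```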
Parameters
141B (39B active)
Context Window
65,536
tokens
Input Price
$0.40
per 1M tokens
Output Price
$1.20
per 1M tokens
Capabilities
Model capabilities and supported modalities
Performance
Excellent reasoning capabilities with strong logical analysis
Strong mathematical capabilities, handles complex calculations well
Capable of generating functional code with good practices
Modalities
Input: text
Output: text
LLM Price Calculator
Calculate the cost of using this model
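As a concrete reference for the calculator, here is a minimal sketch that computes the cost of a single request from its input and output token counts at the rates listed above ($0.40 per 1M input tokens, $1.20 per 1M output tokens). The token counts in the example are placeholder values.

```python
# Per-request cost at the listed rates.
INPUT_PRICE_PER_M = 0.40   # USD per 1M input tokens
OUTPUT_PRICE_PER_M = 1.20  # USD per 1M output tokens

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the USD cost of one request at the listed per-1M-token rates."""
    return (input_tokens / 1_000_000) * INPUT_PRICE_PER_M + \
           (output_tokens / 1_000_000) * OUTPUT_PRICE_PER_M

# Example: 2,000 input tokens and 500 output tokens (placeholder values).
print(f"${request_cost(2_000, 500):.6f}")  # -> $0.001400
```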
Monthly Cost Estimator
Based on different usage levels
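A rough monthly estimate in the same spirit, extrapolated across a few assumed usage tiers. The tier definitions (requests per month, tokens per request) are illustrative assumptions, not figures from this page; only the per-1M-token rates come from the pricing above.

```python
# Rough monthly estimate for a few assumed usage tiers.
# Tier sizes are illustrative assumptions; only the rates come from this page.
INPUT_PRICE_PER_M = 0.40   # USD per 1M input tokens
OUTPUT_PRICE_PER_M = 1.20  # USD per 1M output tokens

# name: (requests per month, input tokens per request, output tokens per request)
TIERS = {
    "light":  (1_000,   1_000, 300),
    "medium": (20_000,  1_500, 400),
    "heavy":  (200_000, 2_000, 500),
}

for name, (requests_per_month, in_tok, out_tok) in TIERS.items():
    monthly = requests_per_month * (
        in_tok / 1_000_000 * INPUT_PRICE_PER_M
        + out_tok / 1_000_000 * OUTPUT_PRICE_PER_M
    )
    print(f"{name:>6}: ${monthly:,.2f} per month")
```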