
Mistral: Mixtral 8x22B Instruct

Mistral
Text
Paid

Mistral's official instruct fine-tuned version of [Mixtral 8x22B](/models/mistralai/mixtral-8x22b). It uses 39B active parameters out of 141B, offering unparalleled cost efficiency for its size. Its strengths include:

- strong math, coding, and reasoning
- large context length (64k)
- fluency in English, French, Italian, German, and Spanish

See benchmarks in the launch announcement [here](https://mistral.ai/news/mixtral-8x22b/). #moe

Parameters

141B (39B active)

Context Window

65,536

tokens

Input Price

$0.40

per 1M tokens

Output Price

$1.20

per 1M tokens

Capabilities

Model capabilities and supported modalities

Performance

Reasoning

Excellent reasoning capabilities with strong logical analysis

Math

Strong mathematical capabilities, handles complex calculations well

Coding

Capable of generating functional code with good practices

Knowledge

-

Modalities

Input Modalities

text

Output Modalities

text

LLM Price Calculator

Calculate the cost of using this model

Input Cost:$0.000600
Output Cost:$0.003600
Total Cost:$0.004200
Estimated usage: 4,500 tokens (1,500 input + 3,000 output)
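The calculator's figures follow directly from the listed per-token prices. A minimal sketch, assuming the 1,500 input / 3,000 output token split implied by the input and output costs above:

```python
# Per-request cost from the listed Mixtral 8x22B Instruct prices.
INPUT_PRICE_PER_M = 0.40   # USD per 1M input tokens
OUTPUT_PRICE_PER_M = 1.20  # USD per 1M output tokens

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the USD cost of a single request."""
    return (input_tokens * INPUT_PRICE_PER_M
            + output_tokens * OUTPUT_PRICE_PER_M) / 1_000_000

# Token counts inferred from the calculator figures above;
# adjust them for your own workload.
cost = request_cost(1_500, 3_000)
print(f"${cost:.6f}")  # → $0.004200
```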

Monthly Cost Estimator

Based on different usage levels

| Usage level | Est. requests/month | Monthly cost |
| --- | --- | --- |
| Light | ~10 | $0.0160 |
| Moderate | ~100 | $0.1600 |
| Heavy | ~1,000 | $1.6000 |
| Enterprise | ~10,000 | $16.0000 |
Note: Estimates based on current token count settings per request.
Last Updated: 2025/05/06