Nous: Hermes 2 Mixtral 8x7B DPO
Mistral
Text
Paid
Nous Hermes 2 Mixtral 8x7B DPO is the new flagship Nous Research model trained over the [Mixtral 8x7B MoE LLM](/models/mistralai/mixtral-8x7b). The model was trained on over 1,000,000 entries of primarily [GPT-4](/models/openai/gpt-4)-generated data, as well as other high-quality data from open datasets across the AI landscape, achieving state-of-the-art performance on a variety of tasks. #moe
Parameters
~47B
Context Window
32,768
tokens
Input Price
$0.60
per 1M tokens
Output Price
$0.60
per 1M tokens
Capabilities
Model capabilities and supported modalities
Performance
Reasoning
Good reasoning with solid logical foundations
Math
-
Coding
-
Knowledge
-
Modalities
Input Modalities
text
Output Modalities
text
LLM Price Calculator
Calculate the cost of using this model
Input Cost: $0.000900
Output Cost: $0.001800
Total Cost: $0.002700
Estimated usage: 4,500 tokens (1,500 input + 3,000 output; the arithmetic is sketched below)
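These figures follow directly from the listed $0.60 per 1M token rates; here is a minimal sketch of that arithmetic, with the 1,500/3,000 input/output split inferred from the costs shown above:

```python
# Minimal sketch: reproduce the calculator figures above from the listed
# per-1M-token pricing for this model ($0.60 input / $0.60 output).

INPUT_PRICE_PER_M = 0.60   # USD per 1,000,000 input tokens
OUTPUT_PRICE_PER_M = 0.60  # USD per 1,000,000 output tokens

def request_cost(input_tokens: int, output_tokens: int) -> dict:
    """Return input, output, and total cost in USD for a single request."""
    input_cost = input_tokens / 1_000_000 * INPUT_PRICE_PER_M
    output_cost = output_tokens / 1_000_000 * OUTPUT_PRICE_PER_M
    return {"input": input_cost, "output": output_cost, "total": input_cost + output_cost}

# The 4,500-token example above, split as 1,500 input / 3,000 output tokens
costs = request_cost(1_500, 3_000)
print(f"Input:  ${costs['input']:.6f}")   # $0.000900
print(f"Output: ${costs['output']:.6f}")  # $0.001800
print(f"Total:  ${costs['total']:.6f}")   # $0.002700
```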
Monthly Cost Estimator
Based on different usage levels
Light Usage
$0.0120
~10 requests
Moderate Usage
$0.1200
~100 requests
Heavy Usage
$1.2000
~1000 requests
Enterprise
$12.0000
~10,000 requests
Note: Estimates are based on the current per-request token count settings; the tier arithmetic is sketched below.
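The tier figures scale linearly from an implied per-request cost of $0.0012 ($0.0120 across ~10 requests); the per-request token setting behind that number is an assumption here, not stated on this page. A minimal sketch:

```python
# Minimal sketch of the monthly tier estimates above. The per-request cost of
# $0.0012 is implied by the listed tiers ($0.0120 across ~10 requests); the
# per-request token settings behind it are an assumption, not documented here.

COST_PER_REQUEST = 0.0012  # USD per request, implied by the tiers above

tiers = {
    "Light Usage": 10,
    "Moderate Usage": 100,
    "Heavy Usage": 1_000,
    "Enterprise": 10_000,
}

for name, requests in tiers.items():
    print(f"{name}: ${COST_PER_REQUEST * requests:.4f} (~{requests:,} requests)")
# Light Usage: $0.0120 (~10 requests)
# Moderate Usage: $0.1200 (~100 requests)
# Heavy Usage: $1.2000 (~1,000 requests)
# Enterprise: $12.0000 (~10,000 requests)
```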
Last Updated: 2025/05/06