
inclusionAI: Ling-1T

Other · Text · Paid

Ling-1T is a trillion-parameter open-weight large language model developed by inclusionAI and released under the MIT license. It is the first flagship non-thinking model in the Ling 2.0 series, built on a sparse Mixture-of-Experts (MoE) architecture that activates roughly 50 billion parameters per token. The model supports a context window of up to 128K tokens and emphasizes efficient reasoning through an “Evolutionary Chain-of-Thought” (Evo-CoT) training strategy. Pre-trained on more than 20 trillion reasoning-dense tokens, Ling-1T achieves strong results on code generation, mathematics, and logical-reasoning benchmarks while maintaining high inference efficiency. The architecture employs FP8 mixed-precision training, MoE routing with QK normalization, and multi-token-prediction (MTP) layers for compositional reasoning stability. For post-training alignment it introduces LPO (Linguistics-unit Policy Optimization), which strengthens sentence-level semantic control. Ling-1T handles complex text generation, multilingual reasoning, and front-end code synthesis with attention to both functionality and aesthetics.
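Because the weights are open under the MIT license, the model can in principle be loaded with standard open-source tooling. The sketch below is a minimal, non-authoritative example: it assumes the checkpoint is published on Hugging Face under the repo id inclusionAI/Ling-1T, that it loads through transformers' AutoModelForCausalLM with trust_remote_code, and that the tokenizer ships a chat template. Check the official model card for the exact loading recipe and hardware requirements; a 1T-parameter MoE needs multi-GPU serving even with only ~50B parameters active per token.

```python
# Minimal sketch: load Ling-1T with Hugging Face transformers and generate text.
# The repo id and loading options below are assumptions, not the official recipe.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "inclusionAI/Ling-1T"  # assumed Hugging Face repo id

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # keep the checkpoint's native precision (FP8/BF16 mix)
    device_map="auto",    # shard the MoE across available GPUs
    trust_remote_code=True,
)

messages = [
    {"role": "user", "content": "Explain FP8 mixed-precision training in two sentences."}
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256)
# Strip the prompt tokens and print only the newly generated completion.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```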

Parameters: 1T

Context Window: 131,072 tokens

Input Price: $0.57 per 1M tokens

Output Price: $2.28 per 1M tokens

Capabilities

Model capabilities and supported modalities

Performance

Reasoning: Excellent reasoning capabilities with strong logical analysis
Math: Strong mathematical capabilities, handles complex calculations well
Coding: Specialized in code generation with excellent programming capabilities
Knowledge: -

Modalities

Input Modalities: text
Output Modalities: text

LLM Price Calculator

Calculate the cost of using this model

Input Cost: $0.000855
Output Cost: $0.006840
Total Cost: $0.007695
Estimated usage: 4,500 tokens
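The figures above follow directly from the listed per-1M-token rates. The sketch below reproduces them, assuming the 4,500-token example splits into 1,500 input tokens and 3,000 output tokens (the only split consistent with the displayed input and output costs).

```python
# Sketch: reproduce the calculator example from the listed Ling-1T rates.
INPUT_PRICE_PER_M = 0.57    # USD per 1M input tokens
OUTPUT_PRICE_PER_M = 2.28   # USD per 1M output tokens

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Cost in USD of a single request at the listed rates."""
    return (input_tokens * INPUT_PRICE_PER_M
            + output_tokens * OUTPUT_PRICE_PER_M) / 1_000_000

# Assumed split of the 4,500-token example: 1,500 input + 3,000 output tokens.
cost = request_cost(input_tokens=1_500, output_tokens=3_000)
print(f"${cost:.6f}")  # -> $0.007695
```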

Monthly Cost Estimator

Based on different usage levels

Light Usage: $0.0285 (~10 requests)
Moderate Usage: $0.2850 (~100 requests)
Heavy Usage: $2.8500 (~1,000 requests)
Enterprise: $28.5000 (~10,000 requests)
Note: Estimates based on current token count settings per request.