Mistral

Mistral: Mixtral 8x22B Instruct

mistralai/mixtral-8x22b-instruct

Mistral's official instruct fine-tuned version of Mixtral 8x22B. As a sparse mixture-of-experts model, it activates 39B of its 141B total parameters per token, giving it strong cost efficiency for its size. Its strengths include:

  • strong math, coding, and reasoning
  • large context length (64k)
  • fluency in English, French, Italian, German, and Spanish

See benchmarks in Mistral's launch announcement. #moe
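A chat request against the slug above might look like the following. This is a minimal sketch, not a definitive integration: it assumes the provider exposes an OpenAI-compatible chat completions endpoint, and the base URL and environment variable names are placeholders, not part of this listing.

```python
# Minimal sketch: calling mistralai/mixtral-8x22b-instruct through an
# assumed OpenAI-compatible endpoint. Base URL and env var names are
# hypothetical; substitute your provider's actual values.
import os

from openai import OpenAI

client = OpenAI(
    base_url=os.environ["ROUTER_BASE_URL"],  # hypothetical, e.g. your provider's /v1 endpoint
    api_key=os.environ["ROUTER_API_KEY"],    # hypothetical credential variable
)

response = client.chat.completions.create(
    model="mistralai/mixtral-8x22b-instruct",  # slug from this listing
    # French prompt, exercising the model's multilingual fluency:
    messages=[{"role": "user", "content": "Résume ce texte en une phrase."}],
    max_tokens=512,  # must stay within the 8,192-token output cap below
)
print(response.choices[0].message.content)
```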

Open Source

Context Window: 65,536 tokens

Max Output Tokens: 8,192 tokens

Provider      Input Token Price        Output Token Price
OctoAI        $1.20 / million tokens   $1.20 / million tokens
TogetherAI    $1.20 / million tokens   $1.20 / million tokens
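
Per-request cost at these rates is simple arithmetic; the sketch below is illustrative only and assumes the table's uniform $1.20 per million tokens for both input and output.

```python
# Illustrative cost arithmetic using the listed prices. Both providers
# charge $1.20 per million tokens for input and for output.
PRICE_PER_MILLION_USD = 1.20

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the USD cost of one request at the listed rates."""
    return (input_tokens + output_tokens) * PRICE_PER_MILLION_USD / 1_000_000

# Worst case: a prompt filling the 65,536-token context window plus a
# response at the 8,192-token output cap costs about $0.088 per request.
print(f"${request_cost(65_536, 8_192):.4f}")  # $0.0885
```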