Mistral

Mistral: Mixtral 8x22B (base)

mistralai/mixtral-8x22b

Mixtral 8x22B is a large-scale sparse mixture-of-experts language model from Mistral AI. It consists of 8 experts of 22 billion parameters each, and every token is routed to 2 of those experts at a time.
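As a minimal sketch of that top-2 routing idea (illustrative only, not Mistral's implementation; the toy linear experts stand in for the real 22-billion-parameter expert networks, and all names here are hypothetical):

```python
import numpy as np

def top2_moe_layer(x, gate_w, experts):
    """Route one token through a Mixtral-style MoE layer: score all
    8 experts, keep the top 2, and mix their outputs with
    softmax-renormalized gate weights."""
    logits = x @ gate_w                 # (num_experts,) router scores
    top2 = np.argsort(logits)[-2:]      # indices of the 2 highest-scoring experts
    weights = np.exp(logits[top2])
    weights /= weights.sum()            # softmax over just the selected pair
    return sum(w * experts[i](x) for w, i in zip(weights, top2))

# Toy setup: 8 experts, each a simple linear map.
rng = np.random.default_rng(0)
d = 16
experts = [(lambda W: (lambda x: x @ W))(rng.normal(size=(d, d))) for _ in range(8)]
gate_w = rng.normal(size=(d, 8))
token = rng.normal(size=d)
print(top2_moe_layer(token, gate_w, experts).shape)  # (16,)
```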

It was initially released via a torrent (magnet) link posted on X.

#moe

Open Source

Context Window: 65,536 tokens

Max Output Tokens: 8,192
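As an illustration of how these limits come into play when calling the model, here is a hedged sketch assuming an OpenAI-compatible gateway that serves the mistralai/mixtral-8x22b slug; the base URL and environment variable name are assumptions, not prescribed by this page. Since this is a base (non-instruct) model, a raw completion fits better than a chat request:

```python
import os
from openai import OpenAI

# Assumes an OpenAI-compatible gateway; base_url and env var are illustrative.
client = OpenAI(
    base_url="https://openrouter.ai/api/v1",
    api_key=os.environ["OPENROUTER_API_KEY"],
)

resp = client.completions.create(
    model="mistralai/mixtral-8x22b",
    prompt="A mixture-of-experts layer works by",
    max_tokens=256,  # any value up to the model's 8,192-token output cap
)
print(resp.choices[0].text)
```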

| Provider   | Input Token Price      | Output Token Price     |
|------------|------------------------|------------------------|
| Mistral    | $0.90 / million tokens | $0.90 / million tokens |
| TogetherAI | $1.20 / million tokens | $1.20 / million tokens |
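For a quick sanity check on spend, a small sketch using the per-million-token rates from the table above (the helper name is hypothetical):

```python
# Per-million-token prices (USD) taken from the table above.
PRICES = {
    "Mistral":    {"input": 0.90, "output": 0.90},
    "TogetherAI": {"input": 1.20, "output": 1.20},
}

def request_cost(provider: str, input_tokens: int, output_tokens: int) -> float:
    """Cost in USD for one request at the listed per-million-token rates."""
    p = PRICES[provider]
    return (input_tokens * p["input"] + output_tokens * p["output"]) / 1_000_000

# Example: a 50,000-token prompt plus a maximum-length 8,192-token completion.
print(f"${request_cost('Mistral', 50_000, 8_192):.4f}")     # $0.0524
print(f"${request_cost('TogetherAI', 50_000, 8_192):.4f}")  # $0.0698
```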