Mistral

Mixtral 8x7B Instruct v0.1

mistralai/mixtral-8x7b-instruct-v0.1

A pretrained generative Sparse Mixture-of-Experts model by Mistral AI, tuned for chat and instruction following. Each layer routes tokens among 8 experts (feed-forward networks), for roughly 47 billion total parameters, of which only a fraction are active per token.

Instruct model fine-tuned by Mistral. #moe
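The instruct fine-tune expects Mistral's chat template, in which each user turn is wrapped in `[INST] … [/INST]` tags and each assistant reply is closed with `</s>`. A minimal sketch of that formatting (the helper name is illustrative, not part of any official SDK):

```python
def format_mixtral_prompt(turns):
    """Build a Mixtral-Instruct prompt string from conversation turns.

    `turns` is a list of (user, assistant) tuples; the final element may
    be a bare user string awaiting completion. Follows the Mistral chat
    template: user text in [INST] ... [/INST], replies closed with </s>.
    """
    prompt = "<s>"
    for turn in turns:
        user, assistant = turn if isinstance(turn, tuple) else (turn, None)
        prompt += f"[INST] {user} [/INST]"
        if assistant is not None:
            prompt += f" {assistant}</s>"
    return prompt

# format_mixtral_prompt([("Hi", "Hello!"), "What is MoE?"])
# -> "<s>[INST] Hi [/INST] Hello!</s>[INST] What is MoE? [/INST]"
```

Most hosted endpoints apply this template server-side; hand-formatting only matters when calling the raw completion interface.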

Community

Open Source

Context Window

32,768

Provider      Input Token Price         Output Token Price
Groq          $0.24 / million tokens    $0.24 / million tokens
TogetherAI    $0.60 / million tokens    $0.60 / million tokens
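Per-million-token rates like those above make request cost straightforward to estimate. A small sketch using the listed prices (the function and rate table are illustrative, not a provider API):

```python
# Per-million-token USD rates, copied from the pricing table above.
RATES = {
    "groq":       {"input": 0.24, "output": 0.24},
    "togetherai": {"input": 0.60, "output": 0.60},
}

def estimate_cost(provider, input_tokens, output_tokens):
    """Return the USD cost of one request given its token counts."""
    r = RATES[provider]
    return (input_tokens * r["input"] + output_tokens * r["output"]) / 1_000_000

# e.g. a full 32k-token context with a 1k-token reply on Groq:
# estimate_cost("groq", 32_000, 1_000) -> about $0.0079
```

At these rates, even a request filling the entire 32,768-token context window costs well under a cent.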