Mistral

Mistral Small

mistralai/mistral-small

This model is currently powered by Mixtral-8X7B-v0.1, a sparse mixture-of-experts model with 12B active parameters. It offers stronger reasoning and broader capabilities, can produce and reason about code, and is multilingual, supporting English, French, German, Italian, and Spanish. #moe
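
The model is exposed under the mistralai/mistral-small slug. Below is a minimal sketch of querying it through an OpenAI-compatible chat completions endpoint; the base URL, API key variable, and client library are illustrative assumptions, not details confirmed by this listing.

```python
# Sketch: querying mistralai/mistral-small via an OpenAI-compatible API.
# The base_url and OPENROUTER_API_KEY env var are assumptions for illustration.
import os

from openai import OpenAI

client = OpenAI(
    base_url="https://openrouter.ai/api/v1",      # assumed endpoint
    api_key=os.environ["OPENROUTER_API_KEY"],     # assumed env var name
)

response = client.chat.completions.create(
    model="mistralai/mistral-small",
    messages=[
        {"role": "user", "content": "Summarize the benefits of sparse MoE models in two sentences."},
    ],
)
print(response.choices[0].message.content)
```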

Tools: Function Calling

Community: Open Source
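
Since the listing flags function calling support, here is a hedged sketch of a tool-use request in the OpenAI-compatible tools format; the get_weather tool, its schema, and the endpoint setup are hypothetical examples, not part of this listing.

```python
# Sketch: function calling with mistralai/mistral-small.
# The get_weather tool and its JSON schema are hypothetical.
import os

from openai import OpenAI

client = OpenAI(
    base_url="https://openrouter.ai/api/v1",      # assumed endpoint
    api_key=os.environ["OPENROUTER_API_KEY"],     # assumed env var name
)

tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",  # hypothetical tool
            "description": "Look up the current weather for a city.",
            "parameters": {
                "type": "object",
                "properties": {
                    "city": {"type": "string", "description": "City name"},
                },
                "required": ["city"],
            },
        },
    }
]

response = client.chat.completions.create(
    model="mistralai/mistral-small",
    messages=[{"role": "user", "content": "What's the weather in Paris?"}],
    tools=tools,
)

# If the model chose to call a tool, its arguments arrive as a JSON string.
for call in response.choices[0].message.tool_calls or []:
    print(call.function.name, call.function.arguments)
```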

Context Window: 32,000 tokens

Provider   Input Token Price        Output Token Price
Mistral    $2.00 / million tokens   $6.00 / million tokens
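
As a worked example of the rates above, a hypothetical request with 10,000 input tokens and 2,000 output tokens would cost about $0.032:

```python
# Worked example of the listed pricing for a hypothetical request.
input_tokens = 10_000
output_tokens = 2_000

input_cost = input_tokens / 1_000_000 * 2.00    # $2.00 per million input tokens
output_cost = output_tokens / 1_000_000 * 6.00  # $6.00 per million output tokens

print(f"${input_cost + output_cost:.3f}")  # prints $0.032
```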