Mistral

Mistral Small

mistralai/mistral-small

This model is currently powered by Mixtral-8x7B-v0.1, a sparse mixture-of-experts model with 12B active parameters. It offers stronger reasoning, a broader range of capabilities, can produce and reason about code, and is multilingual, supporting English, French, German, Italian, and Spanish. #moe

Tools: Function Calling

Community: Open Source

Context Window: 32,000 tokens

Using Mistral Small with Python API
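
A minimal sketch of the same request using the official mistralai Python client (the 0.x SDK). Treat this as an assumption: Model Box's own Python bindings may differ, and the upstream Mistral API uses the model id "mistral-small" rather than "mistralai/mistral-small".

from mistralai.client import MistralClient
from mistralai.models.chat_completion import ChatMessage

# Assumed placeholder key, matching the OpenAI-compatible example below.
client = MistralClient(api_key="{your_api_key}")

response = client.chat(
    model="mistral-small",  # upstream Mistral model id (assumption)
    messages=[ChatMessage(role="user", content="Introduce yourself")],
)
print(response.choices[0].message.content)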

Using Mistral Small with OpenAI compatible API

import openai

# Model Box exposes an OpenAI-compatible endpoint, so the standard
# openai client works with a custom base_url.
client = openai.Client(
    api_key="{your_api_key}",
    base_url="https://api.model.box/v1",
)

response = client.chat.completions.create(
    model="mistralai/mistral-small",
    messages=[
        {
            "role": "user",
            "content": "Introduce yourself",
        },
    ],
)

# Print the full response object; the reply text itself is in
# response.choices[0].message.content.
print(response)
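
Because the endpoint is OpenAI compatible, the same call works from any OpenAI client library; only the base URL and API key change.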