Mistral

Mixtral 8x7B (base)

mistralai/mixtral-8x7b

A pretrained generative Sparse Mixture-of-Experts model by Mistral AI. It incorporates 8 experts (feed-forward networks) per layer for a total of 47B parameters. This is the base model (not fine-tuned to follow instructions); see Mixtral 8x7B Instruct for the instruction-tuned variant.
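
To make the sparse Mixture-of-Experts idea concrete, here is a minimal, illustrative sketch of top-2 expert routing in PyTorch. This is not Mistral's implementation: the SparseMoE class, layer sizes, and the per-expert loop are assumptions chosen for readability. The mechanism it shows (a router selects 2 of 8 feed-forward experts for each token, so only a fraction of the total parameters is active per token) is what the #moe tag below refers to.

import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseMoE(nn.Module):
    """Toy top-2 sparse Mixture-of-Experts layer (illustrative only)."""

    def __init__(self, d_model=64, d_ff=256, n_experts=8, top_k=2):
        super().__init__()
        self.top_k = top_k
        self.router = nn.Linear(d_model, n_experts)  # gating network
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.SiLU(), nn.Linear(d_ff, d_model))
            for _ in range(n_experts)
        )

    def forward(self, x):  # x: (tokens, d_model)
        weights, idx = self.router(x).topk(self.top_k, dim=-1)  # pick 2 of 8 experts per token
        weights = F.softmax(weights, dim=-1)                    # normalize the selected gate scores
        out = torch.zeros_like(x)
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, k] == e                           # tokens routed to expert e in slot k
                if mask.any():
                    out[mask] += weights[mask, k, None] * expert(x[mask])
        return out

moe = SparseMoE()
print(moe(torch.randn(10, 64)).shape)  # torch.Size([10, 64])

Because only 2 of the 8 expert FFNs run for any given token, a forward pass costs far less compute than a dense model of the same total parameter count; production implementations additionally batch tokens by expert and train with a load-balancing loss, which this loop omits for clarity.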

#moe

Community

Open Source

Context Window

32,768

Max Output Tokens

8,192

Using Mixtral 8x7B (base) with the OpenAI-compatible API (Python)

import openai

# Create a client for the OpenAI-compatible model.box endpoint
client = openai.Client(
    api_key="{your_api_key}",
    base_url="https://api.model.box/v1",
)

# Request a chat completion from the Mixtral 8x7B base model.
# Note: as a base (non-instruct) model, its replies may not follow
# instructions closely; use Mixtral 8x7B Instruct for chat-style use.
response = client.chat.completions.create(
    model="mistralai/mixtral-8x7b",
    messages=[
        {
            "role": "user",
            "content": "Introduce yourself",
        },
    ],
)
print(response)
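
The limits listed above apply per request: the prompt plus the generation must fit within the 32,768-token context window, and a single response cannot exceed 8,192 tokens. As a minimal sketch, assuming the endpoint honors the standard OpenAI-compatible max_tokens parameter (not confirmed by this page), you can bound the generation length explicitly:

# Reuses the client from the example above. max_tokens is an assumption
# that the endpoint accepts the standard OpenAI-compatible generation cap.
response = client.chat.completions.create(
    model="mistralai/mixtral-8x7b",
    max_tokens=512,  # any value up to the 8,192-token output limit
    messages=[{"role": "user", "content": "Introduce yourself"}],
)
print(response.choices[0].message.content)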