Mistral: Mixtral 8x22B (base)
mistralai/mixtral-8x22b
Mixtral 8x22B is a large sparse mixture-of-experts (MoE) language model from Mistral AI. It consists of 8 experts of 22 billion parameters each, and a router activates 2 experts per token, so only about 39B of its roughly 141B total parameters are used for any given token.
It was initially released via a torrent magnet link posted on X.
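To make the routing concrete, below is a minimal top-2 MoE layer sketch in PyTorch. It is purely illustrative: the class name, hidden dimension, and expert shape are our assumptions, not Mistral's actual implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F

class Top2MoE(nn.Module):
    # Illustrative top-2 mixture-of-experts layer; names and sizes are
    # placeholders, not Mistral's actual architecture.
    def __init__(self, dim=512, n_experts=8, top_k=2):
        super().__init__()
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, 4 * dim), nn.SiLU(), nn.Linear(4 * dim, dim))
            for _ in range(n_experts)
        )
        self.gate = nn.Linear(dim, n_experts)  # router scores each expert per token
        self.top_k = top_k

    def forward(self, x):  # x: (tokens, dim)
        scores = self.gate(x)                          # (tokens, n_experts)
        weights, idx = torch.topk(scores, self.top_k)  # keep the 2 best-scoring experts
        weights = F.softmax(weights, dim=-1)           # normalize over the chosen 2
        out = torch.zeros_like(x)
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, k] == e                  # tokens routed to expert e
                if mask.any():
                    out[mask] += weights[mask, k:k+1] * expert(x[mask])
        return out

# Each token activates only 2 of the 8 expert MLPs:
moe = Top2MoE()
print(moe(torch.randn(4, 512)).shape)  # torch.Size([4, 512])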
#moe
Community
Open Source
Context Window: 65,536 tokens
Max Output Tokens: 8,192
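These two limits interact: the prompt and the requested completion must together fit in the 65,536-token context window. A minimal sketch of the budget arithmetic (the constant names are ours):

CONTEXT_WINDOW = 65_536   # total tokens the model can attend to
MAX_OUTPUT = 8_192        # hard cap on generated tokens per request

def max_prompt_tokens(requested_output: int = MAX_OUTPUT) -> int:
    # Tokens left for the prompt after reserving room for the completion
    return CONTEXT_WINDOW - min(requested_output, MAX_OUTPUT)

print(max_prompt_tokens())  # 57344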
Using Mistral: Mixtral 8x22B (base) with the OpenAI-compatible Python API
import openai

# Create a client pointed at the OpenAI-compatible endpoint
client = openai.Client(
    api_key="{your_api_key}",  # replace with your API key
    base_url="https://api.model.box/v1",
)

response = client.chat.completions.create(
    model="mistralai/mixtral-8x22b",
    messages=[
        {
            "role": "user",
            "content": "Introduce yourself",
        },
    ],
)
print(response)
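The response object follows the OpenAI chat-completions schema, so the generated text sits on the first choice. A short follow-up using the same client, capping the completion at the model's 8,192-token output limit:

# Same client as above; max_tokens must not exceed the 8,192-token output limit
response = client.chat.completions.create(
    model="mistralai/mixtral-8x22b",
    messages=[{"role": "user", "content": "Introduce yourself"}],
    max_tokens=8192,
)
print(response.choices[0].message.content)  # generated text only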