Mistral
Mixtral 8x7B (base)
mistralai/mixtral-8x7b
A pretrained generative Sparse Mixture-of-Experts model by Mistral AI. Each layer routes every token through 2 of its 8 experts (feed-forward networks), giving roughly 47B total parameters with about 13B active per token (see the routing sketch after the specs below). This is the base model, not fine-tuned to follow instructions; see Mixtral 8x7B Instruct for the instruction-tuned variant.
#moe
Community
Open Source
Context Window: 32,768 tokens
Max Output Tokens: 8,192
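
To make the #moe tag concrete, here is a minimal sketch of top-2 expert routing, the mechanism that lets a sparse Mixture of Experts activate only a subset of its feed-forward experts per token. This is an illustrative PyTorch toy, not Mixtral's actual code: the expert MLPs, dimensions, and activation are simplified stand-ins.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseMoE(nn.Module):
    """Illustrative top-2 sparse Mixture-of-Experts layer (not Mixtral's real implementation)."""

    def __init__(self, dim: int, hidden: int, num_experts: int = 8, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        # Router: scores every expert for every token.
        self.gate = nn.Linear(dim, num_experts, bias=False)
        # Experts: independent feed-forward networks (simplified stand-in MLPs).
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(dim, hidden), nn.SiLU(), nn.Linear(hidden, dim))
            for _ in range(num_experts)
        ])

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (tokens, dim). Only top_k experts run per token, so per-token compute
        # scales with 2 experts' worth of parameters rather than all 8.
        logits = self.gate(x)                                  # (tokens, num_experts)
        weights, idx = torch.topk(logits, self.top_k, dim=-1)  # pick the 2 best experts
        weights = F.softmax(weights, dim=-1)                   # renormalize over the chosen 2
        out = torch.zeros_like(x)
        for slot in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, slot] == e                       # tokens routed to expert e
                if mask.any():
                    out[mask] += weights[mask, slot].unsqueeze(-1) * expert(x[mask])
        return out

moe = SparseMoE(dim=64, hidden=256)
print(moe(torch.randn(10, 64)).shape)  # torch.Size([10, 64])
```

The sparsity is why total parameter count (47B) and per-token cost (about 13B) diverge: the router selects experts per token, so most weights sit idle on any given forward pass.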
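Model IDs like the one above are typically exposed through an OpenAI-compatible API. A minimal usage sketch, assuming your provider serves a text-completions endpoint (the base_url and API key below are placeholders); since this is a base model, it takes a plain prompt rather than a chat template.

```python
from openai import OpenAI

# Placeholder endpoint; substitute your provider's OpenAI-compatible base URL and key.
client = OpenAI(base_url="https://example.com/v1", api_key="YOUR_API_KEY")

# Base (non-instruct) model: plain text completion, no chat formatting.
resp = client.completions.create(
    model="mistralai/mixtral-8x7b",
    prompt="The capital of France is",
    max_tokens=32,  # must stay within the 8,192 output-token limit
)
print(resp.choices[0].text)
```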