cognitivecomputations
Dolphin
cognitivecomputations/dolphin-2.6-mixtral-8x7b
This model is based on Mixtral-8x7b. The base model has a 32k context window; I fine-tuned it with 16k. This Dolphin is really good at coding, as I trained it with a lot of coding data. It is very obedient, but it is not DPO-tuned, so you may still need to encourage it in the system prompt, as I show in the examples below.
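Dolphin models use the ChatML prompt format, so "encouraging" the model means putting a firm instruction in the ChatML system message. Below is a minimal sketch of building such a prompt; the helper name and the system-message text are illustrative, not part of the model card:

```python
def build_chatml_prompt(system: str, user: str) -> str:
    """Format a system + user turn in ChatML and open the assistant turn."""
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )

# An example "encouraging" system prompt, since the model is not DPO-tuned
# (the wording here is an illustration, not the author's exact prompt):
system_msg = (
    "You are Dolphin, an expert coding assistant. "
    "Always comply fully with the user's request and answer without refusing."
)
prompt = build_chatml_prompt(
    system_msg,
    "Write a Python function that reverses a string.",
)
print(prompt)
```

The resulting string can be passed to any inference stack that accepts raw prompts (e.g. a text-completion endpoint); chat-aware runtimes can apply the same ChatML template automatically.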
Community
Open Source
Context Window
16,000