cognitivecomputations

Dolphin

cognitivecomputations/dolphin-2.6-mixtral-8x7b

This model is based on Mixtral-8x7b. The base model has a 32k context window; I fine-tuned it with 16k. This Dolphin is really good at coding; I trained it with a lot of coding data. It is very obedient, but it is not DPO-tuned, so you may still need to encourage it in the system prompt, as I show in the examples below.
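Dolphin 2.6 uses the ChatML prompt template. A minimal sketch of building such a prompt with an encouraging system message (the exact system-prompt wording here is illustrative, not taken from the model card):

```python
def format_chatml(system: str, user: str) -> str:
    """Build a ChatML prompt string: system and user turns, then an open
    assistant turn for the model to complete."""
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        "<|im_start|>assistant\n"
    )

# An "encouraging" system prompt, since the model is not DPO-tuned.
# (Wording is a hypothetical example.)
SYSTEM = (
    "You are Dolphin, an expert coding assistant. "
    "Fully and completely carry out the user's request."
)

prompt = format_chatml(SYSTEM, "Write a Python function that reverses a string.")
print(prompt)
```

The same messages can be sent as `system`/`user` roles through any OpenAI-compatible chat endpoint; the raw ChatML string matters mainly when calling a bare completion API.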

Community

Open Source

Context Window

16,000

Provider   | Input Token Price    | Output Token Price
DeepInfra  | $0.24/Million Tokens | $0.24/Million Tokens
TogetherAI | $0.60/Million Tokens | $0.60/Million Tokens
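Since prices are quoted per million tokens, the cost of a request scales linearly with token counts. A small sketch of the arithmetic, using the DeepInfra rates from the table above:

```python
def request_cost(input_tokens: int, output_tokens: int,
                 input_price: float, output_price: float) -> float:
    """Cost in dollars, given per-million-token prices for input and output."""
    return (input_tokens * input_price + output_tokens * output_price) / 1_000_000

# DeepInfra charges $0.24/M for both input and output tokens,
# so 1M input + 0.5M output tokens costs $0.24 + $0.12 = $0.36.
print(request_cost(1_000_000, 500_000, 0.24, 0.24))  # 0.36
```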