Mistral
Codestral Mamba
mistralai/codestral-mamba-latest
Codestral Mamba is a language model specialized in code generation, developed by Mistral AI. It offers linear-time inference and the theoretical ability to model sequences of unbounded length, making it well suited to code-productivity tasks. The model was trained with advanced code and reasoning capabilities and performs on par with state-of-the-art transformer-based models. It has been tested on in-context retrieval of up to 256k tokens. Codestral Mamba is freely available under the Apache 2.0 license and can be deployed via the mistral-inference SDK or TensorRT-LLM.
For more details, visit the original article.
Community
Open Source
Context Window
256,000 tokens
Using Codestral Mamba with the OpenAI-compatible Python API
import openai

# Point the OpenAI client at the model.box OpenAI-compatible endpoint.
client = openai.Client(
    api_key="{your_api_key}",
    base_url="https://api.model.box/v1",
)

# Send a chat completion request to Codestral Mamba.
response = client.chat.completions.create(
    model="mistralai/codestral-mamba-latest",
    messages=[
        {
            "role": "user",
            "content": "Introduce yourself",
        },
    ],
)

print(response)
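The response object above follows the OpenAI schema, so the generated text itself is available as response.choices[0].message.content. If the endpoint also supports the standard OpenAI streaming protocol (an assumption about model.box, not something stated above), the same client can stream tokens as they are produced. A minimal sketch:

import openai

client = openai.Client(
    api_key="{your_api_key}",
    base_url="https://api.model.box/v1",  # assumed to accept OpenAI-style streaming requests
)

# Request a streamed completion; chunks arrive incrementally instead of as one final object.
stream = client.chat.completions.create(
    model="mistralai/codestral-mamba-latest",
    messages=[
        {"role": "user", "content": "Write a Python function that reverses a string."},
    ],
    stream=True,
)

# Print each token delta as it arrives, then end with a newline.
for chunk in stream:
    delta = chunk.choices[0].delta.content
    if delta:
        print(delta, end="", flush=True)
print()

Streaming is usually preferable for code generation in interactive tools, since partial output can be shown while the rest of the completion is still being produced.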