Mistral AI API
## API Key

```python
# env variable
os.environ['MISTRAL_API_KEY']
```
## Sample Usage

```python
from litellm import completion
import os

os.environ['MISTRAL_API_KEY'] = ""

response = completion(
    model="mistral/mistral-tiny",
    messages=[
        {"role": "user", "content": "hello from litellm"}
    ],
)
print(response)
```
## Sample Usage - Streaming

```python
from litellm import completion
import os

os.environ['MISTRAL_API_KEY'] = ""

response = completion(
    model="mistral/mistral-tiny",
    messages=[
        {"role": "user", "content": "hello from litellm"}
    ],
    stream=True
)
for chunk in response:
    print(chunk)
```
## Supported Models

All models listed at https://docs.mistral.ai/platform/endpoints are supported. We actively maintain the list of models, pricing, token windows, etc. here.
| Model Name | Function Call |
|---|---|
| mistral-tiny | `completion(model="mistral/mistral-tiny", messages)` |
| mistral-small | `completion(model="mistral/mistral-small", messages)` |
| mistral-medium | `completion(model="mistral/mistral-medium", messages)` |