Cannot use solidrust/Codestral-22B-v0.1-hf-AWQ from Ollama API

We are using models from ISDM Chat through the Ollama API at https://chat.crocc.meso.umontpellier.fr/ollama/api.

Mixtral (`mixtral:8x7b-instruct-v0.1-q5_0`) works well but Codestral (`solidrust/Codestral-22B-v0.1-hf-AWQ`) doesn't.

Here is how we use them:

1. With Python:

```python
from langchain_community.chat_models import ChatOllama
from langchain_core.messages import HumanMessage
from langchain_core.output_parsers import StrOutputParser
from langchain.prompts import PromptTemplate

# LLM_MODEL = "mixtral:8x7b-instruct-v0.1-q5_0"
LLM_MODEL = "solidrust/Codestral-22B-v0.1-hf-AWQ"
LLM_JWT_BEARER = "...secret..."
LLM_API_URL = "https://chat.crocc.meso.umontpellier.fr/ollama"

llm = ChatOllama(
    model=LLM_MODEL,
    base_url=LLM_API_URL,
    headers={
        "Authorization": "Bearer " + LLM_JWT_BEARER,
        "Content-Type": "application/json",
    },
)
```
2. With cURL:

```shell
curl -X POST \
  -H "Authorization: Bearer ...secret..." \
  -H "Content-Type: application/json" \
  https://chat.crocc.meso.umontpellier.fr/ollama/api/generate \
  -d '{
    "model": "solidrust/Codestral-22B-v0.1-hf-AWQ",
    "prompt": "Pourquoi le ciel est-il bleu ?"
  }'
```
3. In both cases, the error reported is:

```
{"detail":"Model 'solidrust/Codestral-22B-v0.1-hf-AWQ' was not found"}
```
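For reference, the exact request the curl call above sends can be assembled and inspected in Python; this is a minimal sketch using only the standard library, with the endpoint and model name taken from the calls above (the token stays elided):

```python
import json
import urllib.request

LLM_API_URL = "https://chat.crocc.meso.umontpellier.fr/ollama"
LLM_MODEL = "solidrust/Codestral-22B-v0.1-hf-AWQ"

def build_generate_request(prompt: str, token: str) -> urllib.request.Request:
    """Build the same POST /api/generate request the curl command sends."""
    payload = json.dumps({"model": LLM_MODEL, "prompt": prompt}).encode("utf-8")
    return urllib.request.Request(
        LLM_API_URL + "/api/generate",
        data=payload,
        headers={
            "Authorization": "Bearer " + token,
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_generate_request("Pourquoi le ciel est-il bleu ?", "...secret...")
print(req.full_url)
```

Sending this request reproduces the same `"was not found"` error as the ChatOllama and curl variants, which suggests the problem is the model tag itself rather than either client.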

I couldn't find the right call for listing the models provided by the Ollama API. I used:

```shell
curl -H "Authorization: Bearer ...secret..." -H "Content-Type: application/json" https://chat.crocc.meso.umontpellier.fr/ollama/api/tag
```

The error is:

```
{"detail":"Open WebUI: Server Connection Error"}
```
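For what it's worth, the stock Ollama endpoint for listing models is `GET /api/tags` (plural), not `/api/tag`. Whether Open WebUI forwards that route with this bearer token is an assumption, but the call would look like this minimal standard-library sketch:

```python
import json
import urllib.request

LLM_API_URL = "https://chat.crocc.meso.umontpellier.fr/ollama"

def parse_model_names(payload: dict) -> list[str]:
    """Ollama's /api/tags response has the shape {"models": [{"name": ...}, ...]}."""
    return [m["name"] for m in payload.get("models", [])]

def list_models(token: str) -> list[str]:
    """GET /api/tags and return the names of the models the server exposes."""
    req = urllib.request.Request(
        LLM_API_URL + "/api/tags",  # note: 'tags', not 'tag'
        headers={"Authorization": "Bearer " + token},
    )
    with urllib.request.urlopen(req) as resp:
        return parse_model_names(json.load(resp))
```

If `solidrust/Codestral-22B-v0.1-hf-AWQ` is absent from the returned names, the model simply isn't registered under that tag on the server, which would explain the `"was not found"` error above.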