ConfiChat (used with Ollama) always gives me an error when I try to run a distilled DeepSeek R1 model: `Error requesting chat completion: 404`. Other LLMs work fine with ConfiChat, and Ollama can run the DeepSeek models in the console, just not through ConfiChat.