
Deepseek R1 Distilled models not working #12

@Silvicultor

Description

Confichat (used with Ollama) always gives me an error when I try to run a distilled Deepseek R1 model:

`Error requesting chat completion: 404`

Other LLMs work fine in Confichat, and Ollama can run the Deepseek models from the console; the failure only happens when requesting them through Confichat.
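
For reference, the same chat request can be reproduced outside Confichat against Ollama's native `/api/chat` endpoint. The sketch below assumes the default Ollama port (11434) and uses `deepseek-r1:7b` as a stand-in tag; substitute whatever `ollama list` actually reports. Ollama responds with 404 when the requested model tag is not installed, so a mismatch between the tag Confichat sends and the tag that is installed would produce exactly this error:

```python
import requests

# Default local Ollama endpoint; /api/chat is Ollama's native chat API.
OLLAMA_URL = "http://localhost:11434/api/chat"

# Stand-in tag for illustration -- substitute the exact tag reported
# by `ollama list` (distilled R1 models are tagged like deepseek-r1:7b,
# deepseek-r1:14b, etc.).
MODEL = "deepseek-r1:7b"

resp = requests.post(
    OLLAMA_URL,
    json={
        "model": MODEL,
        "messages": [{"role": "user", "content": "Hello"}],
        "stream": False,
    },
    timeout=120,
)

# Ollama answers 404 with a "model not found" error body when the
# requested tag is not installed; 200 means the tag resolves fine.
print(resp.status_code)
print(resp.text)
```

If this script returns 200 while Confichat still reports 404, the problem is most likely in the model name (or endpoint path) that Confichat sends to Ollama.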

Metadata

Labels

bug (Something isn't working)
