
Ollama 0.5.9 run deepseek-r1:671b #9105

@MartinDong

Description


What is the issue?

Hello, I'm encountering some issues while deploying the deepseek-r1:671b model with ollama run. Could you help me figure out how to resolve them?
How can I load and run the 671B DeepSeek-R1 model using Ollama's Docker container? My machine runs Linux with the following hardware configuration (a sketch of a possible Docker invocation follows the list):
CPU: 384 cores
Memory: 2304 GB
GPU: 8 × NVIDIA H20
Data Disk: 1 × 1500 GiB SSD cloud disk
Storage: 1 × 2048 GiB SSD cloud disk, 1 × 500 GiB SSD cloud disk
Ollama version: ollama/ollama:0.5.9
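
For reference, this is the kind of command I mean, a minimal sketch assuming the NVIDIA Container Toolkit is installed on the host; the host path `/data/ollama` is a placeholder for a directory on one of the large SSDs (the 671B model download is several hundred GB, so the model store needs that much free space):

```sh
# Start the Ollama 0.5.9 container with all 8 H20 GPUs visible.
# /data/ollama is a placeholder; point it at a disk with enough free space.
docker run -d \
  --gpus=all \
  -v /data/ollama:/root/.ollama \
  -p 11434:11434 \
  --name ollama \
  ollama/ollama:0.5.9

# Pull and run the model inside the running container.
docker exec -it ollama ollama run deepseek-r1:671b
```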


[ollama7.1.txt](https://github.com/user-attachments/files/18800808/ollama7.1.txt)
[ollama7.txt](https://github.com/user-attachments/files/18800809/ollama7.txt)

Relevant log output

See the attached ollama7.txt and ollama7.1.txt above.

OS

Linux

GPU

Nvidia

CPU

Intel

Ollama version

0.5.9
