Please provide build instructions #7

@McBane87

Description

Building for Linux (Docker):

Please have a look at the Dockerfile provided in #7 (comment).
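
As a rough sketch, building and running an image from that Dockerfile might look like this (the image tag, port mapping, and /dev/dri device pass-through are assumptions; adjust them to your setup):

# Assumes the Dockerfile from the linked comment is saved in the current directory
docker build -t ollama-vulkan:local .

# Expose the default ollama port and pass the GPU render node into the container
docker run --rm -it -p 11434:11434 --device /dev/dri ollama-vulkan:local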

Building for Linux (Natively):

Linux distributions can differ quite a bit, but as a starting point, have a look at the Dockerfile mentioned above.
You can extract the instructions/ideas for building natively on Linux from it, as sketched below.
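
Purely as an illustration, on a recent Debian/Ubuntu system the build dependencies would look roughly like this (package names vary between distributions and releases, so treat the list as an assumption rather than a definitive set):

# Toolchain, CMake, Go, the Vulkan headers/loader, the GLSL compiler used by the
# ggml Vulkan backend, and vulkaninfo for testing -- package names are a best guess
sudo apt-get install build-essential cmake golang-go libvulkan-dev glslc vulkan-tools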

Building for Windows:

Please see #7 (comment)

Prebuilt stuff

Docker Images

  • ghcr.io/wilgnne/ollama-vulkan built by @wilgnne (Outdated)
  • docker.io/grinco/ollama-amd-apu:vulkan built by @grinco
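
For example, one of the prebuilt images can be pulled and started like this (the /dev/dri device mapping assumes an AMD GPU/APU exposed through the kernel render node; adjust as needed):

docker pull docker.io/grinco/ollama-amd-apu:vulkan
docker run --rm -it -p 11434:11434 --device /dev/dri docker.io/grinco/ollama-amd-apu:vulkan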

Windows

Older Versions

  • Vulkan build for v0.6.1:
  • Vulkan build for v0.6.2:
  • Vulkan build for v0.6.3:
  • Vulkan build for v0.6.4:
  • Vulkan build for v0.6.5:
  • Vulkan build for v0.6.6:
  • Vulkan build for v0.6.8:
  • Vulkan build for v0.8.0:
  • Vulkan build for v0.9.0:
  • Vulkan build for v0.9.3:


Original Post:
Thanks for enabling the issues in your fork.
As already mentioned in ollama#5059 (comment), I'm kind of stuck building ollama with Vulkan support because I don't know how to build it correctly. Could you please give a hint on how to build it, or, even better, create general build instructions?

In the meantime I also saw that you already merged the latest ollama commits, along with the build instructions given in ollama#5059 (comment). I tried that yesterday as well by patching the original ollama repo with your PR and adding the mentioned cmake changes, which should be the same as the merge you did recently.

git clone https://github.com/ollama/ollama.git "/tmp/ollama-vulkan-git"
cd "/tmp/ollama-vulkan-git"
git config user.name "Builder" && git config user.email "builder@local"

# Merge PR of pufferffish
pr=5059;  git fetch origin pull/$pr/head:pr-$pr && git merge --no-edit pr-$pr

# Add pufferffish repo as additional "upstream"
git remote add upstream https://github.com/pufferffish/ollama-vulkan.git

# Additionally merge a PR from pufferffish repo ("upstream")
pr=5; git fetch upstream pull/$pr/head:pr-$pr && git merge --no-edit pr-$pr

# Patch the cmake adjustments for Vulkan
patch -p1 < /tmp/patches/01-cmake.patch
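
As an optional sanity check (not part of the original steps), you can confirm that both pull requests were actually merged:

# The two merge commits should appear near the top of the history
git log --oneline --merges -n 5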

Additionally, I had to fix a bug after the merge:

--- a/discover/gpu.go	2025-02-02 14:41:15.178673705 +0000
+++ b/discover/gpu.go	2025-02-02 14:41:19.541625469 +0000
@@ -425,7 +425,7 @@
 				gpuInfo.ID = C.GoString(&memInfo.gpu_id[0])
 				gpuInfo.Compute = fmt.Sprintf("%d.%d", memInfo.major, memInfo.minor)
 				gpuInfo.MinimumMemory = 0
-				gpuInfo.DependencyPath = depPaths
+				gpuInfo.DependencyPath = []string{LibOllamaPath}
 				gpuInfo.Name = C.GoString(&memInfo.gpu_name[0])
 				gpuInfo.DriverMajor = int(memInfo.major)
 				gpuInfo.DriverMinor = int(memInfo.minor)

Also, the comment said to use cmake --preset Vulkan; cmake --build --preset Vulkan, but this doesn't work because there is no "Vulkan" preset in CMakePresets.json. So I added a preset, which was nothing more than a guess, because I don't know what the preset should look like:

--- a/CMakePresets.json	2025-02-02 12:31:21.293594051 +0000
+++ b/CMakePresets.json	2025-02-02 13:09:40.207427326 +0000
@@ -58,6 +58,10 @@
       "cacheVariables": {
         "AMDGPU_TARGETS": "gfx900;gfx940;gfx941;gfx942;gfx1010;gfx1012;gfx1030;gfx1100;gfx1101;gfx1102"
       }
+    },
+    {
+      "name": "Vulkan",
+      "inherits": [ "Default" ]
     }
   ],
   "buildPresets": [
@@ -105,6 +109,11 @@
       "name": "ROCm 6",
       "inherits": [ "ROCm" ],
       "configurePreset": "ROCm 6"
-    }
+    },
+    {
+      "name": "Vulkan",
+      "targets": [ "ggml-vulkan" ],
+      "configurePreset": "Vulkan"
+      }
   ]
 }
\ No newline at end of file

After that I was able to build using:

make -f Makefile.sync clean sync

# Build for CPU
cmake --preset CPU
cmake --build --parallel --preset CPU
cmake --install build --component CPU --strip

# Build for Vulkan
cmake --preset Vulkan
cmake --build --parallel --preset Vulkan
cmake --install build --component Vulkan --strip

# Build Ollama binary
go build -trimpath -buildmode=pie -o dist/bin/ollama

Then I get this result:

dist/linux-amd64/lib/ollama
dist/linux-amd64/lib/ollama/libggml-base.so
dist/linux-amd64/lib/ollama/libggml-cpu-alderlake.so
dist/linux-amd64/lib/ollama/libggml-cpu-haswell.so
dist/linux-amd64/lib/ollama/libggml-cpu-icelake.so
dist/linux-amd64/lib/ollama/libggml-cpu-sandybridge.so
dist/linux-amd64/lib/ollama/libggml-cpu-skylakex.so
dist/linux-amd64/lib/ollama/libggml-cpu-sapphirerapids.so
dist/linux-amd64/lib/ollama/vulkan
dist/linux-amd64/lib/ollama/vulkan/libvulkan.so.1
dist/linux-amd64/lib/ollama/vulkan/libggml-vulkan.so
dist/linux-amd64/lib/ollama/vulkan/libvulkan.so.1.3.296
dist/linux-amd64/bin/ollama

But I still don't see any hint that it is working. When I try to use the ollama binary I just built, it only uses the CPU runner.
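
Two quick checks that can help narrow this down (a sketch; vulkan-tools provides vulkaninfo, and the binary path below assumes the go build output location, so adjust it if your layout differs):

# Confirm the Vulkan loader can see a physical device at all
vulkaninfo --summary

# Run the freshly built server with debug logging and watch the GPU discovery
# output; OLLAMA_DEBUG=1 is a standard ollama environment variable
OLLAMA_DEBUG=1 ./dist/bin/ollama serve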

After that, I tried to build without your patch: just the plain ollama repo with the cmake patch and the cmake preset patch, because it felt weird that none of the files used for building my guessed "Vulkan" preset are present in your patch. It built the same files as with your patch. That leads me to the conclusion that my preset patch is wrong and something different needs to be put there. Again, I was confused about how to build it correctly.
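
For anyone picking this up: a quick way to see which presets are actually defined, plus a hedged guess at what a working Vulkan configure step probably needs on top of a bare preset (GGML_VULKAN is the option name used by upstream ggml/llama.cpp; whether this fork wires Vulkan through the same cache variable is an assumption):

# List the configure presets that really exist in CMakePresets.json
cmake --list-presets

# A preset that only inherits "Default" does not enable the Vulkan backend by
# itself; the configure step most likely needs the Vulkan option switched on as
# well, e.g. something along these lines (the flag name is an assumption for this fork):
cmake --preset Vulkan -DGGML_VULKAN=ON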
