sync with latest upstream llama.cpp #187
Walkthrough: The subproject commit reference for llama.cpp was updated to the latest upstream revision.
Actionable comments posted: 1
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (2)
src/multi-thread/wllama.wasm
is excluded by !**/*.wasm
src/single-thread/wllama.wasm
is excluded by !**/*.wasm
📒 Files selected for processing (1)
llama.cpp
(1 hunks)
@@ -1 +1 @@
-Subproject commit 745aa5319b9930068aff5e87cf5e9eef7227339b
+Subproject commit bdca38376f7e8dd928defe01ce6a16218a64b040
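The one-line diff above is the entire change a submodule pointer bump produces: the superproject records a new commit hash for the submodule. A minimal sketch of how such a bump is made, demonstrated against throwaway local repositories (the paths and commit messages here are hypothetical; the mechanics — check out the new commit inside the submodule, then `git add` and commit in the superproject — are standard git):

```shell
#!/bin/sh
# Demonstrate a submodule pointer bump using disposable local repos.
set -e
tmp=$(mktemp -d)
cd "$tmp"

# Stand-in "upstream" repo with two commits (old and new revisions).
git init -q upstream
git -C upstream -c user.email=a@b -c user.name=t commit -q --allow-empty -m "first"
old=$(git -C upstream rev-parse HEAD)
git -C upstream -c user.email=a@b -c user.name=t commit -q --allow-empty -m "second"
new=$(git -C upstream rev-parse HEAD)

# Superproject that pins the submodule at the old commit.
git init -q super
cd super
git -c protocol.file.allow=always submodule add -q "$tmp/upstream" dep
git -C dep checkout -q "$old"
git -c user.email=a@b -c user.name=t commit -qam "pin dep at old commit"

# The pointer bump itself: move the submodule to the new commit,
# then record the new gitlink in the superproject.
git -C dep fetch -q origin
git -C dep checkout -q "$new"
git add dep
git -c user.email=a@b -c user.name=t commit -qm "sync dep with upstream"

# The resulting diff has the same shape as the one in this PR.
git diff HEAD~1 HEAD -- dep | grep "Subproject commit"
```

The final `git diff` prints a `-Subproject commit <old>` / `+Subproject commit <new>` pair, mirroring the hunk in this review.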
💡 Verification agent
🧩 Analysis chain
Confirm new submodule revision is reachable
The pointer update is fine, but please verify that commit bdca38376f7e8dd928defe01ce6a16218a64b040 exists on the upstream llama.cpp repo and that CI runs git submodule update --init --recursive after the change.
🏁 Script executed:
#!/bin/bash
# Ensure the commit is present in the remote repository
git ls-remote https://github.com/ggerganov/llama.cpp.git \
| grep -q bdca38376f7e8dd928defe01ce6a16218a64b040 \
&& echo "Commit found ✅" \
|| { echo "❌ Commit not found"; exit 1; }
Length of output: 148
🏁 Script executed:
#!/bin/bash
# Verify CI invokes `git submodule update --init --recursive`
rg -n "git submodule update --init --recursive" -A2
# Fallback: any mention of submodule update in CI scripts
rg -n "submodule update" -A2
Length of output: 198
Ensure CI updates submodules after pointer bump
Commit bdca38376f7e8dd928defe01ce6a16218a64b040 was found on the upstream repo ✅. However, I didn't see any CI step running git submodule update --init --recursive. Please add a step in your CI configuration (e.g. .github/workflows/ci.yml) such as:
- name: Initialize llama.cpp submodule
run: git submodule update --init --recursive
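An alternative, assuming the workflow uses GitHub Actions' standard checkout action, is to have the checkout step fetch submodules directly instead of adding a separate step (a config sketch, not taken from this repository's actual workflow):

```yaml
# Alternative: let the checkout action initialize submodules itself.
- name: Checkout with submodules
  uses: actions/checkout@v4
  with:
    submodules: recursive
```

Either approach ensures the llama.cpp sources pinned by the new pointer are present before any build or test step runs.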
🤖 Prompt for AI Agents
In the CI configuration file (e.g., .github/workflows/ci.yml), add a new job
step to run 'git submodule update --init --recursive' to ensure submodules are
properly initialized and updated after pointer bumps. This step should be placed
early in the workflow to guarantee the submodules are ready before any build or
test steps execute.