sync with upstream llama.cpp source code #171
Conversation
Walkthrough: This update includes three main changes: the subproject commit for `llama.cpp` is advanced to a newer upstream revision, the package version in `package.json` is bumped from 2.3.0 to 2.3.1, and the CDN URLs in `src/wasm-from-cdn.ts` are updated to reference the new version.
Actionable comments posted: 0
🧹 Nitpick comments (1)
src/wasm-from-cdn.ts (1)
1-2: Automate version injection in generated file
This file is auto-generated (`// Do not edit this file directly`), so manually bumping the version here will be error-prone. Consider updating `scripts/generate_wasm_from_cdn.js` to read the version from `package.json` (e.g., `const { version } = require('../package.json')`) and interpolate it into the URLs. This will future-proof your release process.
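To make this concrete, below is a minimal sketch of what such a generator could look like, written in TypeScript for illustration. It assumes the script runs from the repository root; the export shape of the emitted module and the jsDelivr CDN host are assumptions, not taken from the project's actual script.

```ts
// Hypothetical generator sketch (not the project's actual script).
// Reads the release version from package.json at generation time and
// emits src/wasm-from-cdn.ts with the version baked into the CDN URLs.
import * as fs from 'node:fs';

const pkg = JSON.parse(fs.readFileSync('package.json', 'utf8'));
const version: string = pkg.version;

// The generated file contains the version as a literal string, so it
// never needs to require('../package.json') at runtime.
const output = `// Do not edit this file directly
// This file is auto-generated

const WasmFromCDN = {
  'single-thread/wllama.wasm':
    'https://cdn.jsdelivr.net/npm/@wllama/wllama@${version}/src/single-thread/wllama.wasm',
  'multi-thread/wllama.wasm':
    'https://cdn.jsdelivr.net/npm/@wllama/wllama@${version}/src/multi-thread/wllama.wasm',
};

export default WasmFromCDN;
`;

fs.writeFileSync('src/wasm-from-cdn.ts', output);
console.log(`Generated src/wasm-from-cdn.ts for version ${version}`);
```

The version is read once at generation time and written into the output as a literal, so the published file carries no runtime dependency on `package.json`.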
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (2)
`src/multi-thread/wllama.wasm` is excluded by `!**/*.wasm`
`src/single-thread/wllama.wasm` is excluded by `!**/*.wasm`
📒 Files selected for processing (3)
`llama.cpp` (1 hunks)
`package.json` (1 hunks)
`src/wasm-from-cdn.ts` (1 hunks)
🔇 Additional comments (3)
package.json (1)
3-3: Version bump approved. The update from `"2.3.0"` to `"2.3.1"` correctly aligns with the upstream `llama.cpp` sync and the updated CDN URLs in the WebAssembly loader.

llama.cpp (1)
1-1: ✅ Verification successful
Verify submodule commit update exists upstream
Ensure that the new submodule reference (`b9154ecff93ff54dc554411eb844a2a654be49f2`) actually exists in the remote llama.cpp repository and that the submodule is initialized and updated correctly.
🏁 Script executed:
```bash
#!/bin/bash
# Verify the new submodule commit exists in the remote repository.
remote_url=$(git config --file .gitmodules submodule.llama.cpp.url)
echo "Checking remote URL: $remote_url"
git ls-remote "$remote_url" | grep b9154ecff93ff54dc554411eb844a2a654be49f2 \
  && echo "Commit found: OK" \
  || echo "Error: Commit not found in remote"
```

Length of output: 488
Submodule commit verified
The submodule URL in `.gitmodules` (https://github.com/ggerganov/llama.cpp) and the new commit `b9154ecff93ff54dc554411eb844a2a654be49f2` have been confirmed upstream.
• Ensure you initialize and sync the submodule locally: `git submodule update --init --recursive`

src/wasm-from-cdn.ts (1)
5-6: Version bump correct
The CDN URLs now point to `@wllama/wllama@2.3.1` for both single-thread and multi-thread binaries, matching the new `package.json` version.
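For reference, a generated `src/wasm-from-cdn.ts` pinned to version 2.3.1 would look roughly like the sketch below; the export shape and CDN host are the same assumptions as in the generator sketch above, not copied from the repository.

```ts
// Do not edit this file directly
// Hypothetical shape of the generated module: build flavor -> pinned CDN URL.
const WasmFromCDN = {
  'single-thread/wllama.wasm':
    'https://cdn.jsdelivr.net/npm/@wllama/wllama@2.3.1/src/single-thread/wllama.wasm',
  'multi-thread/wllama.wasm':
    'https://cdn.jsdelivr.net/npm/@wllama/wllama@2.3.1/src/multi-thread/wllama.wasm',
};

export default WasmFromCDN;
```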
@coderabbitai generate_wasm_from_cdn.ts is a generated script that gets stored on a CDN, so it is impossible to `require('../package.json')`. Remember this.
@ngxson I understand! I've noted that generate_wasm_from_cdn.ts is a generated file stored on a CDN and therefore cannot `require('../package.json')`.
✏️ Learnings added