ENH: Support xllamacpp #2997


Merged (14 commits, Mar 9, 2025)

Conversation

@codingl2k1 (Contributor) commented on Mar 4, 2025

Disabled by default; it can be enabled by setting the environment variable USE_XLLAMACPP=1.
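A minimal sketch of how an opt-in flag like this is typically read, assuming the convention that the feature is active only when the variable is exactly "1" (the helper and backend names here are illustrative, not Xinference's actual internals):

```python
import os


def use_xllamacpp() -> bool:
    """Return True if the opt-in env flag USE_XLLAMACPP is set to "1"."""
    return os.environ.get("USE_XLLAMACPP", "0") == "1"


def select_backend() -> str:
    """Pick the llama.cpp binding based on the flag (hypothetical helper)."""
    return "xllamacpp" if use_xllamacpp() else "llama-cpp-python"
```

Reading the variable at call time (rather than once at import) means the flag can be toggled in tests or between server restarts without re-importing the module.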

@XprobeBot XprobeBot added the enhancement New feature or request label Mar 4, 2025
@XprobeBot XprobeBot added this to the v1.x milestone Mar 4, 2025
@codingl2k1 codingl2k1 marked this pull request as ready for review March 6, 2025 20:03
@qinxuye (Contributor) left a comment


Really excellent work, I left some comments.

@qinxuye (Contributor) commented on Mar 7, 2025

I found that the lock still existed for xllamacpp, so I added a commit to release it.

@qinxuye qinxuye force-pushed the feat/concurrent_llama_cpp_server branch from 39f23db to f54db47 Compare March 8, 2025 12:54
@qinxuye qinxuye force-pushed the feat/concurrent_llama_cpp_server branch from 2cd5059 to 7eea8f8 Compare March 9, 2025 03:06
@qinxuye (Contributor) left a comment


LGTM

@qinxuye qinxuye merged commit 5d6ec93 into xorbitsai:main Mar 9, 2025
11 of 13 checks passed
Labels
enhancement (New feature or request)
Projects
None yet
3 participants