Conversation

@McPatate (Member) commented on Feb 13, 2024

Now that we let the user specify every parameter sent to the backend, the hardcoded `return_full_text` parameter we previously relied on for the Hugging Face backend is left unset, which results in showing part or all of the prompt to the user instead of the completion.

This PR adds `return_full_text: false` to the request body when `backend == "huggingface" | "tgi"`.
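The fix can be sketched as follows. This is a hypothetical Python illustration, not the actual llm-ls code (llm-ls is written in Rust); `build_request_body` and the backend names are assumptions made for the example. The idea is simply that user-supplied parameters are forwarded as-is, and `return_full_text` is forced to `false` for the two backends that would otherwise echo the prompt:

```python
def build_request_body(backend: str, user_params: dict) -> dict:
    """Merge user-specified generation parameters into the request body,
    forcing return_full_text off for backends that echo the prompt."""
    body = dict(user_params)  # copy so the caller's dict is untouched
    if backend in ("huggingface", "tgi"):
        # Without this, these backends default to returning the prompt
        # (or part of it) alongside the generated completion.
        body["return_full_text"] = False
    return body
```

With this in place, a request such as `build_request_body("tgi", {"temperature": 0.2})` always carries `return_full_text: False`, while other backends (e.g. a local OpenAI-compatible server) are left untouched.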

Closes #76

@McPatate merged commit fe1f6aa into main on Feb 13, 2024, and deleted the fix/missing_return_full_text branch on February 13, 2024 at 10:02.
Linked issue closed by this pull request: codellama unusable with llm-ls 0.5.1