Bug: Lab generate crashing when trying to chat in parallel #346

@sroecker

Description

This appears to be a bug or limitation of the llama-cpp-python backend; the same crash with concurrent requests is tracked upstream:
abetlen/llama-cpp-python#257
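A common workaround when a backend cannot handle concurrent calls is to serialize access with a lock. The sketch below is a minimal illustration of that pattern, not InstructLab's actual fix; `fake_model` is a hypothetical stand-in for a llama-cpp-python `Llama` instance, which is not safe to call from multiple threads at once.

```python
import threading
from concurrent.futures import ThreadPoolExecutor

class SerializedBackend:
    """Wraps a non-thread-safe completion backend so that concurrent
    callers are serialized with a lock instead of crashing it."""

    def __init__(self, backend):
        self._backend = backend
        self._lock = threading.Lock()

    def complete(self, prompt):
        # Only one thread may touch the underlying backend at a time.
        with self._lock:
            return self._backend(prompt)

# Hypothetical stand-in for a llama-cpp-python model; the real object
# would be something like Llama(model_path=...) called with a prompt.
def fake_model(prompt):
    return f"echo: {prompt}"

safe = SerializedBackend(fake_model)

# Four "parallel chats" now queue up on the lock instead of racing.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(safe.complete, ["a", "b", "c", "d"]))
print(results)
```

This trades throughput for safety: requests are answered one at a time, which matches how a single llama.cpp model instance is typically used.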


Labels

bug (Something isn't working), documentation (Improvements or additions to documentation)
