
Add defaultrole to LLM pipeline #841

@davidmezzetti

Description


Currently, the LLM pipeline assumes that string prompts already have all chat template tokens applied.

This change will add an option to set the defaultrole at inference time.

Options for defaultrole:

  • prompt (default): applies no chat formatting and passes the input string to the model as-is
  • user: wraps the input in a chat message with the user role

See this discussion for more: 8bd4d78#r150476159
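A minimal sketch of how this could look in practice, assuming the option is exposed as a defaultrole keyword argument on the pipeline call (the parameter placement, call signature, and model path below are illustrative, not a confirmed API):

```python
from txtai import LLM

# Placeholder model path for an instruction-tuned model
llm = LLM("your-org/your-instruct-model")

# defaultrole="prompt" (current behavior, default): the string is passed to the
# model as-is, so any chat template tokens must already be applied by the caller
llm("... raw prompt with chat template tokens already applied ...")

# defaultrole="user" (assumed from this issue): the pipeline wraps the string in
# a chat message with the user role and applies the model's chat template
llm("What is the capital of France?", defaultrole="user")
```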
