Support prompt fragments and variable dependencies with recursive resolving #15196
Conversation
Adds a new AI variable that resolves registered prompt templates by their ID. The variable resolves to the fully resolved prompt template, i.e. variables and functions in the template are already resolved in the returned value. In the chat, a prompt template with ID myprompt is referenced as `#prompt:myprompt`. Also adds prompt template ID autocompletion for the chat window that triggers on entering the variable separator `:`.
- Different icon for custom prompts
- Sort custom prompts first
- Add detail text specifying whether a prompt is built-in or custom
- Internationalize texts
Functions must not be immediately resolved in prompt fragments because function objects are lost this way. Instead, they are left in place and resolved when the final chat message or prompt containing a prompt fragment is resolved. For prompts, all variables must be resolved before resolving functions, so that functions in resolved variables can be resolved as well.
- Resolve functions after variables when getting a prompt
- Extend PromptService with a getPromptFragment method that resolves variables but not functions
- ChatRequestParser can now handle chat and prompt function formats
- Add a unit test for the prompt service covering resolution of a function within a variable
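The ordering constraint above (variables first, then functions) can be sketched as follows. This is a minimal illustration only: the `{{variable}}` and `~{function}` placeholder syntax and all function names are assumptions, not Theia's actual PromptService API.

```typescript
// Minimal sketch: resolve variables first, then functions.
type VariableResolver = (name: string) => string;
type FunctionResolver = (id: string) => string;

// Step 1: resolve {{variable}} placeholders. Text produced by a variable may
// itself contain ~{function} placeholders, which are intentionally left in.
function resolveVariables(template: string, resolve: VariableResolver): string {
    return template.replace(/\{\{([\w-]+)\}\}/g, (_match, name) => resolve(name));
}

// Step 2: resolve ~{function} placeholders only after all variables are
// resolved, so functions introduced by resolved variables are handled too.
function resolveFunctions(template: string, resolve: FunctionResolver): string {
    return template.replace(/~\{([\w-]+)\}/g, (_match, id) => resolve(id));
}

// Resolving functions after variables mirrors the order described above.
function getPrompt(template: string, vars: VariableResolver, fns: FunctionResolver): string {
    return resolveFunctions(resolveVariables(template, vars), fns);
}
```

If functions were resolved in the fragment first, a function placeholder introduced by a variable would never be seen; the two-pass order avoids that.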
- Move variable resolution from the chat service to the chat request parser
- Resolve functions in resolved variable texts
- Add a unit test for this
- Add interface AIVariableResolverWithVariableDependencies for variable resolvers that want to resolve dependencies. The variable service hands a resolve method to these resolvers.
- The AI variable service now recursively resolves all variables and provides cycle detection. If a cycle is found, only the resolution of the recursive branch is stopped.
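The recursive resolution with per-branch cycle detection described above can be sketched like this. All names are illustrative assumptions; Theia's actual interfaces (AIVariableResolverWithVariableDependencies etc.) differ in detail.

```typescript
// A resolver receives a callback to resolve its own variable dependencies,
// mirroring the "service hands in a resolve method" design described above.
type Resolver = (
    name: string,
    resolveDependency: (dep: string) => string | undefined
) => string;

function createVariableService(resolvers: Map<string, Resolver>) {
    function doResolve(name: string, inProgress: Set<string>): string | undefined {
        // Cycle detected: stop resolving only this recursive branch;
        // resolution of all other branches continues normally.
        if (inProgress.has(name)) {
            return undefined;
        }
        const resolver = resolvers.get(name);
        if (!resolver) {
            return undefined;
        }
        inProgress.add(name);
        try {
            return resolver(name, dep => doResolve(dep, inProgress));
        } finally {
            // Remove from the in-progress set so the same variable may be
            // resolved again on a sibling (non-cyclic) branch.
            inProgress.delete(name);
        }
    }
    return { resolve: (name: string) => doResolve(name, new Set<string>()) };
}
```

A resolver that sees `undefined` from its dependency callback knows that branch was cut off by cycle detection and can substitute a fallback.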
This works very well for me, thanks!
@planger Could you have a look at the implementation and architecture, please?
Three comments:
1. Message vs. system prompt
The prompt is not added to the context as indicated in the video. From my POV this is expected. At the moment, adding a prompt to a chat is basically like pasting a string into a message. If you want to add something to the system prompt, you need to edit the agent prompt or create a custom agent. The question is whether we also want to allow editing the system prompt dynamically with variables. This could be done by adding a second variable that works exactly like the current one but returns a context variable with a contextValue. @planger WDYT, is this a valuable use case? If so, I would not necessarily add it to this PR, but create a follow-up. However, if we want to add this later, we should think about the naming now.
Maybe:
- `#prompt` for the message
- `#prompt-system` for adding it to the system prompt

or

- `#prompt-msg` | `#prompt-inline`
- `#prompt-system` | `#prompt-global`

2. Auto-completion in the chat
The auto-completion is a bit awkward if you auto-complete the word "#prompt", as it is then completed to "#prompt:". You then have to press Ctrl+Space again to get auto-completion for the ID. For the file variable, the quick picker is used in this case. @planger What would you expect to happen for this variable?

3. Auto-completion in the prompt editor
We should create a follow-up ticket for this please.
Hi @JonasHelming,
Thank you for this excellent work @lucas-koehler!
The architecture and code look great to me. It might be worth letting someone else weigh in on this PR on the code level too, as it is a pretty substantial change.
2. Auto-completion in the chat
The auto-completion is a bit awkward if you auto-complete the word "#prompt", as it is then completed to "#prompt:". You then have to press Ctrl+Space again to get auto-completion for the ID. For the file variable, the quick picker is used in this case. @planger What would you expect to happen for this variable?
Yes, I'd agree it would be nicer to continue the suggestions after the user completed `#prompt:`, either via quick picks or via code completion items. If we prefer code completion items over quick picks, the prompt completion item would need to provide the 'editor.action.triggerSuggest' command in the completion item's `command` property, similar to here. This would then trigger the suggestions, equivalent to the user typing Ctrl+Space after the `:`.
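For illustration, a completion item along these lines would carry the command. The `editor.action.triggerSuggest` command ID is the standard Monaco/VS Code one; the surrounding provider wiring and the labels here are assumptions, not code from this PR.

```typescript
// Sketch of a completion item that re-triggers suggestions after inserting
// '#prompt:', so the prompt-id completions appear without a manual Ctrl+Space.
const promptCompletionItem = {
    label: '#prompt',
    insertText: '#prompt:',
    // Executed after the completion is accepted: reopens the suggest widget.
    command: {
        id: 'editor.action.triggerSuggest',
        title: 'Trigger prompt id suggestions'
    }
};
```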
// Resolve all variables and add them to the variable parts.
// Parse resolved variable texts again for tool requests.
// These are not added to parts as they are not visible in the initial chat message.
// However, add they need to be added to the result to be considered by the executing agent.
Suggested change:
- // However, add they need to be added to the result to be considered by the executing agent.
+ // However, they need to be added to the result to be considered by the executing agent.
To me, the current behavior is as expected. Allowing content to be added to the system message via prompt fragments inserted into the user input doesn't feel like a much-needed use case to me at the moment.
@lucas-koehler OK, then we have clarity. I would be in favor of not using the quick pick, but keeping the auto-completion in the chat input window and fixing the ":" case described above.
135e3a6 fixes the wording suggestion by @planger and automatically triggers argument completion after entering the prompt variable. @planger I used the variable argument picker to trigger the argument autocompletion. However, I only trigger your suggested command there and do not actually pick an argument myself. Please have a brief look at whether this solution is okay for you.

prompt-template-autocomplete-new.webm
Code looks good to me.
Wonderful! @planger If you could check Lucas comment, we might have a wrap!
Thank you @lucas-koehler! Looks good to me and works well.
I've created #15226 to track the removal of the rather awkward workaround of triggering editor suggestions instead of a picker. I think we should make this a simpler option for variable providers.
@lucas-koehler Let's merge!
@planger Thank you! That is definitely a good idea to improve eventually. For future reference, I also added it to the list of follow-ups in the PR description.
…e-theia#15196)

fixes eclipse-theia#14899
fixes eclipse-theia#15000

* Add prompt template AI variable

Adds a new AI variable that resolves registered prompt templates by their ID. The variable resolves to the fully resolved prompt template, i.e. variables and functions in the template are already resolved in the returned value. In the chat, a prompt template with ID myprompt is referenced as `#prompt:myprompt`. Adds prompt template ID autocompletion for the chat window that triggers on entering the variable separator `:` and automatically after the `#prompt` variable has been autocompleted.

* Resolve functions after variables in prompt service

Functions must not be immediately resolved in prompt fragments because function objects are lost this way. Instead, they are left in place and resolved when the final chat message or prompt containing a prompt fragment is resolved. For prompts, all variables must be resolved before resolving functions, so that functions in resolved variables can be resolved as well.
- Resolve functions after variables when getting a prompt
- Extend PromptService with a getPromptFragment method that resolves variables but not functions
- ChatRequestParser can now handle chat and prompt function formats
- Add a unit test for the prompt service covering resolution of a function within a variable

* Resolve functions in resolved variables in chat request parser

- Move variable resolution from the chat service to the chat request parser
- Resolve functions in resolved variable texts
- Add a unit test for this

* Support explicit AI variable dependencies and recursive resolution

- Add interface AIVariableResolverWithVariableDependencies for variable resolvers that want to resolve dependencies. The variable service hands a resolve method to these resolvers.
- The AI variable service now recursively resolves all variables and provides cycle detection. If a cycle is found, only the resolution of the recursive branch is stopped.
What it does
fixes #14899
fixes #15000
supersedes #14985
Summary: Support custom prompt fragments via the variable `prompt` with a prompt ID as argument. Prompt fragments may reference each other recursively while the variable service ensures safe cycle handling.

Prompt fragments allow defining parts of prompts in a reusable way. You can then embed these prompts via a variable in the chat or in our prompt templates. This also allows having a set of prompts available in the chat without always defining a custom agent.

This PR defines a Theia AI variable `prompt` with an argument containing the prompt fragment's ID. Corresponding auto-completion for available custom and built-in prompt fragments is implemented for the chat UI, too.

Proper usage of prompt fragments, which can reference other variables and prompt fragments, requires two additional features implemented in this PR: resolving functions only after all variables are resolved, and recursive variable resolution with cycle detection.
prompt-template-autocomplete.webm
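To make the fragment idea concrete, a hypothetical fragment could be saved as golang.prompttemplate in the configured templates folder; the file name and wording here are made up for illustration:

```
You are an expert Go developer. Always generate idiomatic, well-documented Go code.
```

In the chat it would then be referenced as `#prompt:golang`, or embedded in another prompt template as `{{prompt:golang}}`.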
How to test
- Set `Ai-features › Prompt Templates: Prompt Templates Folder` to point to a folder of your choosing.
- Put `.prompttemplate` files in the specified folder.
- Use `#prompt:golang` and tell it to generate mergesort. You'll see it generates it in Go.
- Use `@Universal`, which usually does not have access to the workspace. Ask it to explain the contents of a file (use a relative path) and use `#prompt:workspace-functions`. Now the agent should read the file. Try again without the variable and see that it will not read the file.
- Reference `{{prompt:workspace-functions}}` in a prompt template. Observe that your Universal agent can now access files even when you don't specify the variable in the chat.

Follow-ups
Breaking changes
Attribution
Review checklist
Reminder for reviewers