
Conversation

lucas-koehler
Contributor

@lucas-koehler lucas-koehler commented Mar 14, 2025

What it does

fixes #14899
fixes #15000
supersedes #14985

Summary: Support custom prompt fragments via a new variable prompt that takes the prompt id as its argument. Prompt fragments may reference each other recursively, while the variable service ensures safe cycle handling.


Prompt fragments allow defining parts of prompts in a reusable way. These fragments can then be embedded via a variable in the chat or in our prompt templates. This also makes a set of prompts available in the chat without always defining a custom agent.

This PR defines a Theia AI variable prompt whose argument is the prompt fragment's id. It also adds corresponding auto completion for the available custom and built-in prompt fragments to the chat UI.

Proper usage of prompt fragments - which can reference other variables and prompt fragments - requires two additional features implemented in this PR:

  1. Resolving tool functions in resolved variable text. Both the prompt service and the chat request parser have been extended to provide this.
  2. Recursive resolving of variables, including cycle detection. Prompt fragments may compose other prompt fragments and thus need to be resolved recursively. Furthermore, cycles are detected and resolving is aborted for the affected branch to avoid crashing Theia in an endless loop in case users configure a cycle.
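The cycle handling in point 2 can be sketched roughly as follows. This is a simplified illustration with hypothetical names; `resolveFragment` and the flat string map stand in for the actual prompt service and variable service logic:

```typescript
// Simplified sketch of recursive prompt fragment resolution with cycle
// detection. Function and type names are illustrative, not Theia's actual API.
type FragmentMap = Map<string, string>;

function resolveFragment(
    id: string,
    fragments: FragmentMap,
    resolving: Set<string> = new Set()
): string {
    if (resolving.has(id)) {
        // Cycle detected: abort resolving only this recursive branch
        // instead of looping forever.
        return '';
    }
    const template = fragments.get(id);
    if (template === undefined) {
        return '';
    }
    resolving.add(id);
    // Recursively replace every {{prompt:<id>}} reference.
    const result = template.replace(/\{\{prompt:([\w-]+)\}\}/g,
        (_match, refId) => resolveFragment(refId, fragments, resolving));
    resolving.delete(id);
    return result;
}
```

Because the `resolving` set only tracks the current branch, diamond-shaped references (two fragments embedding the same third fragment) still resolve fully; only true cycles are cut off.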
prompt-template-autocomplete.webm

How to test

  • Configure setting Ai-features › Prompt Templates: Prompt Templates Folder to point to a folder of your choosing
  • Create one or multiple .prompttemplate files in the specified folder.
    • For recursion testing, you need at least two referencing each other
    • Find example prompt templates here: prompttemplates.zip. Add them to your prompt templates folder
  • Open the AI chat and test using templates there (see video above)
    • For instance, use #prompt:golang and ask it to generate mergesort. You'll see that it generates it in Go.
    • Use the agent @Universal, which usually does not have access to the workspace. Ask it to explain the contents of a file (use a relative path) and use #prompt:workspace-functions. Now the agent should read the file. Try again without the variable and see that it does not read the file.
  • Open the prompt editor for an agent (e.g. Universal again) and reference prompt templates there via {{prompt:workspace-functions}}. Observe that your Universal agent can now access files even when you don't specify the variable in the chat.
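As an illustration, a hypothetical golang.prompttemplate file could look like the following. The content is invented for this example; only the {{prompt:<id>}} reference syntax is from this PR:

```
You are a helpful coding assistant.
Always answer with code in the Go programming language.
{{prompt:workspace-functions}}
```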

Follow-ups

Breaking changes

  • This PR introduces breaking changes and requires careful review. If yes, the breaking changes section in the changelog has been updated.

Attribution

Review checklist

Reminder for reviewers

Adds a new AI variable that resolves registered prompt templates based on their ID.
The variable resolves to the resolved prompt template. I.e. variables and functions in the template are already resolved in the returned value.

In the chat a prompt template with ID myprompt is referenced as #prompt:myprompt

Adds prompt template id autocompletion for the chat window that triggers on entering the variable separator `:`
- different icon for custom prompts
- sort custom prompts first
- add detail text specifying whether a prompt is built-in or custom
- internationalize texts
Functions must not be immediately resolved in prompt fragments because function objects are lost this way.
Instead, they are left in and resolved when the final chat message or prompt containing a prompt fragment is resolved.
For prompts, all variables must be resolved before resolving functions to allow resolving functions in resolved variables.

- Resolve functions after variables when getting a prompt
- Extend PromptService with getPromptFragment method that resolves variables but not functions
- ChatRequestParser can now handle chat and prompt function formats
- Add unit test for prompt service testing resolving a function within a variable
- Move variable resolution from chat service to chat request parser
- Resolve functions in resolved variable texts
- Add unit test for this
- Add interface AIVariableResolverWithVariableDependencies for variable resolvers that want to resolve dependencies.
  The variable service hands in a resolve method to these resolvers.
- The AI variable service now recursively resolves all variables while providing cycle detection.
  If a cycle is found, the resolution of only the recursive branch is stopped.
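The resolution order described in these commits can be illustrated with a minimal two-pass sketch. Names and the exact reference syntax are simplified assumptions; the real prompt service works on registered variables and tool functions rather than plain string maps:

```typescript
// Two-pass resolution sketch: variables are expanded first, functions
// (tool references) afterwards, so that a function appearing inside a
// resolved variable's text is still picked up. Syntax is simplified.
function resolvePrompt(
    template: string,
    variables: Map<string, string>,
    functions: Map<string, string>
): string {
    // Pass 1: resolve variables of the form {{name}}.
    const withVariables = template.replace(/\{\{([\w-]+)\}\}/g,
        (match, name) => variables.get(name) ?? match);
    // Pass 2: resolve functions of the form ~{id} — done last, so that
    // functions introduced by variable resolution are handled too.
    return withVariables.replace(/~\{([\w-]+)\}/g,
        (match, id) => functions.get(id) ?? match);
}
```

If the passes were swapped, a function reference contained in a variable's resolved text would never be seen by the function pass, which is exactly the bug the order change avoids.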
@lucas-koehler lucas-koehler added the theia-ai issues related to TheiaAI label Mar 14, 2025
@github-project-automation github-project-automation bot moved this to Waiting on reviewers in PR Backlog Mar 14, 2025
Contributor

@JonasHelming JonasHelming left a comment


This works very well for me, thanks!

@planger Could you have a look at the implementation and architecture, please?

Three comments:

  1. Message vs System prompt
    The prompt is not added to the context as indicated in the video. From my POV this is expected. At the moment, adding a prompt to a chat is basically like pasting a String in a message. If you want to add something to the system prompt, you need to edit the agent prompt or create a custom agent. The question is whether we also want to allow editing the system prompt dynamically with variables. This could be done by adding a second variable that works exactly like the current one, but returns a context variable with a contextValue. @planger WDYT, is this a valuable use case? If so, I would not necessarily add it to this PR, but create a follow-up. However, if we want to add this later, we should now think about the naming.
    Maybe
    #prompt for the message
    #prompt-system for adding it to system
    or
    #prompt-msg | #prompt-inline
    #prompt-system | #prompt-global

  2. Auto-completion in the chat
    The auto completion is a bit weird if you auto-complete the word "#prompt" as it is now auto completed to "#prompt:". You have to press CTRL+SPACE again then to get auto completion for the ID. For the file variable, the quick picker is used in this case. @planger What would you expect for this variable to happen?

  3. Auto-completion in the prompt editor
    We should create a follow-up ticket for this please.

@lucas-koehler
Contributor Author

lucas-koehler commented Mar 17, 2025

Hi @JonasHelming ,
thanks for the feedback!

  1. Yes, the prompt fragment is not added to the context. This is intentional, as discussed. The video was done before the addition to the context was removed again. Sorry for the confusion
  2. -- (waiting for comment from @planger )
  3. I created issue Support variable auto completion in AI prompt editor #15202 for this

Contributor

@planger planger left a comment


Thank you for this excellent work @lucas-koehler!

The architecture and code look great to me. It might be worth letting someone else weigh in on this PR at code level too, as it is a pretty substantial change.

2. Auto-completion in the chat
The auto completion is a bit weird if you auto-complete the word "#prompt" as it is now auto completed to "#prompt:". You have to press CTRL+SPACE again then to get auto completion for the ID. For the file variable, the quick picker is used in this case. @planger What would you expect for this variable to happen?

Yes, I'd agree it would be nicer to continue the suggestions after the user completed #prompt: --- either via quick picks or via code completion items. If we prefer code completion items over quick picks, the prompt completion item would need to provide the 'editor.action.triggerSuggest' command in the completion item's `command` property, similar to here.

This would then trigger the suggestions, equivalent to the user pressing Ctrl+Space after the `:`.
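A sketch of what such a completion item could look like. Minimal local interfaces stand in for the Monaco completion types here; only the 'editor.action.triggerSuggest' command id is from the comment above, the rest is illustrative:

```typescript
// Minimal stand-ins for the Monaco completion types — a sketch, not the
// actual monaco-editor import.
interface Command { id: string; title: string }
interface CompletionItemSketch {
    label: string;
    insertText: string;
    command?: Command;
}

// Completing '#prompt' inserts '#prompt:' and immediately re-opens the
// suggest widget, equivalent to pressing Ctrl+Space after the ':'.
const promptVariableItem: CompletionItemSketch = {
    label: '#prompt',
    insertText: '#prompt:',
    command: { id: 'editor.action.triggerSuggest', title: 'Trigger Suggest' }
};
```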

// Resolve all variables and add them to the variable parts.
// Parse resolved variable texts again for tool requests.
// These are not added to parts as they are not visible in the initial chat message.
// However, add they need to be added to the result to be considered by the executing agent.
Contributor


Suggested change
// However, add they need to be added to the result to be considered by the executing agent.
// However, they need to be added to the result to be considered by the executing agent.

@planger
Contributor

planger commented Mar 17, 2025

  1. Message vs System prompt
    The prompt is not added to the context as indicated in the video. From my POV this is expected. At the moment, adding a prompt to a chat is basically like pasting a String in a message. If you want to add something to the system prompt, you need to edit the agent prompt or create a custom agent. The question is whether we also want to allow editing the system prompt dynamically with variables. This could be done by adding a second variable that works exactly like the current one, but returns a context variable with a contextValue. @planger WDYT, is this a valuable use case? If so, I would not necessarily add it to this PR, but create a follow-up. However, if we want to add this later, we should now think about the naming.

To me, the current behavior is as expected. Allowing content to be added to the system message via prompt fragments inserted into the user input doesn't feel like a much-needed use case to me at the moment.

@JonasHelming
Contributor

@lucas-koehler OK, then we have clarity. I would be in favor of not using the quick pick item but keeping the auto completion in the chat input window, and fixing the ":" case described above.

@JonasHelming JonasHelming requested a review from eneufeld March 18, 2025 08:11
@JonasHelming JonasHelming mentioned this pull request Mar 18, 2025
25 tasks
@lucas-koehler
Contributor Author

135e3a6 applies the wording suggestion by @planger and automatically triggers argument completion after entering the prompt variable.

@planger I used the variable argument picker to trigger the argument autocompletion. However, I only trigger your suggested command there and do not actually pick an argument myself. Please have a brief look at whether this solution is okay for you. Using the command property of a completion does not work that easily here because the base variable names are provided by the default completion provider, which already registers said argument picker command.

prompt-template-autocomplete-new.webm

Contributor

@eneufeld eneufeld left a comment


Code looks good to me.

@github-project-automation github-project-automation bot moved this from Waiting on reviewers to Needs merge in PR Backlog Mar 18, 2025
Contributor

@JonasHelming JonasHelming left a comment


Wonderful! @planger If you could check Lucas comment, we might have a wrap!

Contributor

@planger planger left a comment


Thank you @lucas-koehler! Looks good to me and works well.

I've created #15226 to track the removal of the rather awkward workaround to trigger editor suggestions instead of a picker. I think we should make this a simpler option for variable providers.

@JonasHelming
Contributor

@lucas-koehler Let's merge!

@lucas-koehler
Contributor Author

lucas-koehler commented Mar 19, 2025

Thank you @lucas-koehler! Looks good to me and works well.

I've created #15226 to track the removal of the rather awkward workaround to trigger editor suggestions instead of a picker. I think we should make this a simpler option for variable providers.

@planger Thank you! That is definitely a good idea to improve eventually. For future reference, I also added it to the list of follow-ups in the PR description.

@lucas-koehler lucas-koehler merged commit e647b18 into master Mar 19, 2025
9 of 11 checks passed
@github-project-automation github-project-automation bot moved this from Needs merge to Done in PR Backlog Mar 19, 2025
@github-actions github-actions bot added this to the 1.60.0 milestone Mar 19, 2025
@lucas-koehler lucas-koehler deleted the issues/14899-reference-prompt-fragments branch March 19, 2025 09:11
laemmleint pushed a commit to mvtecsoftware/theia that referenced this pull request Aug 18, 2025
…e-theia#15196)
