Partial Ollama support #49
Conversation
Pull Request Overview
This PR adds partial support for Ollama and improves tool argument parsing to better handle invalid inputs for tool calls.
- Updated tests and configuration to include CONFIG_VERSION and new provider details.
- Added new LLMProviderConfig entries for Ollama support (currently disabled) and updated provider lookup.
- Enhanced tool argument parsing logic in conversation handling to repair minor JSON formatting issues.
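The kind of repair described above can be sketched as follows. This is an illustrative example of best-effort parsing for slightly malformed tool-call arguments, not the actual helper added in this PR; the function name and the specific fixes (trailing commas, single-quoted strings) are assumptions.

```python
import json
import re


def repair_tool_args(raw: str) -> dict:
    """Best-effort parse of tool-call arguments that may be slightly
    malformed JSON. Returns a dict, or {} if the input is unrecoverable.
    Hypothetical sketch; not the helper used in this PR."""
    try:
        parsed = json.loads(raw)
        return parsed if isinstance(parsed, dict) else {}
    except json.JSONDecodeError:
        pass
    # Common model mistakes: trailing commas and single-quoted strings.
    fixed = re.sub(r",\s*([}\]])", r"\1", raw)  # drop trailing commas
    fixed = fixed.replace("'", '"')             # naive quote normalization
    try:
        parsed = json.loads(fixed)
        return parsed if isinstance(parsed, dict) else {}
    except json.JSONDecodeError:
        return {}
```

The quote normalization is deliberately naive (it would corrupt values containing apostrophes); a production version would need a more careful pass, but this shows the general repair-then-reparse shape.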
Reviewed Changes
Copilot reviewed 6 out of 7 changed files in this pull request and generated 1 comment.
| File | Description |
|---|---|
| e2e_tests/test_basic_conversation.py | Updated test configuration to include CONFIG_VERSION. |
| e2e_tests/conftest.py | Modified allowed hosts, updated default model strings, and added CONFIG_VERSION. |
| docs/supported-providers.md | Revised documentation to list supported providers, including Ollama. |
| custom_components/custom_conversation/providers.py | Introduced new provider configurations for Ollama; enabled entries commented out. |
| custom_components/custom_conversation/conversation.py | Added argument parsing utility functions and updated model formatting for LLM completion. |
Files not reviewed (1)
- .vscode/settings.json: Language not supported
Codecov Report
✅ All tests successful. No failed tests found.
Additional details and impacted files

```diff
@@ Coverage Diff @@
##             main      #49      +/-   ##
==========================================
- Coverage   47.90%   47.44%   -0.47%
==========================================
  Files          10       10
  Lines        1075     1096      +21
  Branches      163      166       +3
==========================================
+ Hits          515      520       +5
- Misses        530      546      +16
  Partials       30       30
```

☔ View full report in Codecov by Sentry.
This PR adds partial support for Ollama. Native support is currently disabled until these two upstream litellm issues are resolved:
BerriAI/litellm#6135
BerriAI/litellm#9602
It does, however, include tool argument parsing fixes that improve tool-calling reliability when using Ollama through its OpenAI-compatible API.
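The distinction between native Ollama support and the OpenAI-compatible route comes down to how the litellm model string is built: litellm routes requests on the model prefix, so `ollama/<model>` uses its native Ollama integration, while `openai/<model>` with a custom `api_base` goes through Ollama's OpenAI-compatible endpoint. A minimal sketch of that formatting, assuming hypothetical provider keys (the real provider configuration in `providers.py` may differ):

```python
def format_model(provider_key: str, model: str) -> str:
    """Build a litellm model string for a given provider.
    Hypothetical sketch; provider keys are assumptions, but the
    "ollama/" and "openai/" prefixes are litellm's routing convention."""
    if provider_key == "ollama":
        # Native integration (currently disabled, per the linked issues).
        return f"ollama/{model}"
    if provider_key == "ollama_openai":
        # OpenAI-compatible route; pair with api_base such as
        # http://localhost:11434/v1 when calling litellm.completion().
        return f"openai/{model}"
    return model
```

Until the upstream issues are fixed, only the `openai/` route would be exercised, which is why the tool-argument repairs matter for Ollama users today.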