Add support for OpenAI API compatible models #452
Conversation
Signed-off-by: Hung-Han (Henry) Chen <chenhungh@gmail.com>
Everything works fine! However, there are some "too easy" prompt injections.
```diff
@@ -4,6 +4,7 @@ import { trimSuffix } from "$lib/utils/trimSuffix";
 import { trimPrefix } from "$lib/utils/trimPrefix";
 import { PUBLIC_SEP_TOKEN } from "$lib/constants/publicSepToken";
 import { AwsClient } from "aws4fetch";
+import OpenAI from "openai";
```
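For context, the appeal of an OpenAI-compatible backend is that one request shape works against many servers. The sketch below is illustrative only (it is not code from this PR, and `buildChatRequest` is a hypothetical helper): it shows the minimal body that any OpenAI-style `/chat/completions` endpoint accepts, whether served by api.openai.com, llama.cpp, or a LiteLLM proxy.

```typescript
// Illustrative only (not code from this PR): the minimal request body that any
// OpenAI-compatible /chat/completions endpoint accepts.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

// Hypothetical helper: assembles the request body; `stream` asks the server to
// send incremental tokens, which is what a chat UI typically wants.
function buildChatRequest(model: string, messages: ChatMessage[], stream = true) {
  return { model, messages, stream };
}

// The same body works against any compatible base URL, which is why a single
// OpenAI-style backend in chat-ui can cover many serving projects.
const body = buildChatRequest("gpt-3.5-turbo", [{ role: "user", content: "Hello" }]);
```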
In my PR I tried to avoid using the openai package, since it was only used for a few types and the REST API works well. In your implementation, adding the package makes more sense, I think.
I've been trying this, and with this config:
inference seems to get stuck. Once I shut down the node server and restart it, it runs inference on the last text issued by the user. If I wait long enough, I get this error:
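For anyone trying to reproduce a setup like this: chat-ui reads its model list from a `MODELS` variable in `.env.local`. A hypothetical entry pointing at a locally hosted OpenAI-compatible server might look like the following (the key names and exact schema here are illustrative assumptions, not necessarily what this PR implements):

```
MODELS=`[
  {
    "name": "local-openai-compatible",
    "endpoints": [{ "url": "http://localhost:8000/v1" }]
  }
]`
```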
I tried the same settings, but I can't reproduce it; there seem to be exceptions when calling
@nsarrazin @philschmid @mishig25 PTAL.
Sorry, I haven't been able to replicate it.
Trying to use the LiteLLM proxy and a Together.ai model:
The LiteLLM endpoint (http://localhost:8081/chat/completions) never gets called, and I see this error message in the browser console:
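One common cause of "the endpoint never gets called" reports is ambiguity over whether the configured URL should already include the `/chat/completions` path. A small hypothetical normalizer (`completionsUrl` is my own illustration, not part of this PR) makes either form work:

```typescript
// Hypothetical debugging aid (not part of this PR): accept either a bare base
// URL or a full /chat/completions URL and normalize both to the full form.
function completionsUrl(base: string): string {
  const trimmed = base
    .replace(/\/+$/, "")                 // drop trailing slashes
    .replace(/\/chat\/completions$/, ""); // drop the path if already present
  return trimmed + "/chat/completions";
}
```

With this, `http://localhost:8081` and `http://localhost:8081/chat/completions/` resolve to the same target, which rules out one configuration pitfall when diagnosing a proxy that is never hit.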
Is there anything blocking this PR from being merged?
I have not received any comments from the maintainers for over a month; it seems they have no interest, or there are conflicts of interest in supporting OpenAI-style APIs. For those interested in using chat-ui with an OpenAI-style API, please follow the forked version, which tries to stay in sync with upstream: https://github.com/ialacol/chat-ui/tree/main
Hmm, I don't think they have any conflict of interest ^^. I think they just lack the time to review it, or maybe forgot. This looks fine to me; it's okay to merge.
Hey guys! No conflicts of interest; as far as I know, we're pretty happy to make chat-ui backend-agnostic. I just wanted to take the time to refactor things a bit so that we can add more backends easily in the future, in a way that scales. I wanted to use dynamic imports so that not everyone needs to install the package.

I just didn't have the time to design a common backend API I was happy with, but it seems to be a pressing issue for some, so I'll have a look soon 😁
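The dynamic-import idea mentioned above can be sketched roughly as follows (this is an assumed illustration of the approach, not the maintainer's actual design; `loadBackend` is a hypothetical helper): the SDK for a backend is only loaded when a configured model asks for it, so users of other backends never need it installed.

```typescript
// Sketch of the dynamic-import idea (assumption, not the actual chat-ui code):
// resolve a backend's SDK lazily, at the moment a model configured for that
// backend is used.
async function loadBackend(pkg: string): Promise<unknown | null> {
  try {
    // import() returns a promise, so the dependency is only resolved on demand.
    return await import(pkg);
  } catch {
    // Package not installed: the caller can surface a configuration error
    // instead of forcing every user to install every backend's SDK.
    return null;
  }
}
```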
Hi everyone! Quick update on the support for OpenAI-type endpoints. I finished my refactoring in PR #541 and tested it using the OpenAI API; it worked well. I'll be testing it more thoroughly and updating the docs, but feel free to try it as well with your locally hosted APIs and let me know if it works 😄 Thanks @chenhunghan for the great work on this!
Many popular open-source projects offer OpenAI API compatible endpoints, for example:

This PR adds OpenAI API endpoint compatibility; if merged, chat-ui can be used as the chat user interface for any of the projects mentioned above. The official OpenAI endpoints are also supported, so users can quickly evaluate GPT-3/GPT-4 and/or any OSS models (GGUF/GPTQ...) hosted by, for example, llama.cpp.

The difference between this PR and #443 is