
Conversation

simcop2387
Contributor

This should let any other compatible AI servers, proxies, or routers work mostly natively.

I've added the needed variables to the .env.sample file along with adding the organization and project fields to the setup so that it can be used for tracking/accounting later on if users want to do that.
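For illustration, the added `.env.sample` entries might look roughly like this (the variable names below are assumptions for the sketch, not necessarily the exact ones in the PR):

```env
# Point the OpenAI client at a compatible third-party server, proxy, or router
OPENAI_BASE_URL=http://localhost:11434/v1

# Optional: attribute usage to an organization/project for billing and accounting
OPENAI_ORGANIZATION=org-example
OPENAI_PROJECT=proj-example
```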

I understand that LW is open source but also a paid service; I'm not sure what you need from me as far as licensing goes to include this, so I'll leave the following until I'm told otherwise:

  1. I don't consider this significant enough work to merit copyright in the first place, but I understand that neither you nor I can really make that call.
  2. I am willing to release any claims as needed (i.e. sign a CLA or other agreements).

Incidentally, I'm about to test this myself with Qwen3-4B locally after it finishes building. Let me know if there are any changes you'd like me to make; I'm only mildly familiar with TypeScript, so I might not have done everything idiomatically.

@CLAassistant

CLAassistant commented May 2, 2025

CLA assistant check
All committers have signed the CLA.

@simcop2387
Contributor Author

CLA should be signed, and I've confirmed that it's working for me with Qwen3-4B at 32k context at ~100 tokens/sec :) It doesn't always decide to create tags, but that's down to the model. Might be worth some customization of the prompt in the future.

@simcop2387 simcop2387 changed the base branch from main to dev May 2, 2025 13:10
@simcop2387
Contributor Author

Set to the proper branch and merged from dev to clear the conflicts.

@daniel31x13
Member

Hey, thanks for the PR! I noticed there's a process.env.OPENAI_NAME environment variable; does that need to be changed for third-party providers?

I'll be making some changes to make the config.compatibility dynamic based on the config.baseURL and merge this afterwards :)
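As a minimal sketch of what "dynamic based on the baseURL" could mean (the helper name and URL check here are assumptions for illustration, not Linkwarden's actual code):

```typescript
// Sketch: pick the provider compatibility mode from the configured base URL.
// The official OpenAI endpoint supports "strict" mode; third-party
// OpenAI-compatible servers (Ollama, vLLM, LocalAI, ...) generally need
// the looser "compatible" mode.
function resolveCompatibility(baseURL?: string): "strict" | "compatible" {
  if (!baseURL || baseURL.startsWith("https://api.openai.com")) {
    return "strict";
  }
  return "compatible";
}
```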

@daniel31x13
Member

Also, wouldn't it be better to remove the OPENAI_PROJECT and OPENAI_ORGANIZATION environment variables to keep setup minimal? Would that cause any potential issues?

@daniel31x13 daniel31x13 merged commit 1da1f17 into linkwarden:dev May 3, 2025
3 checks passed
@simcop2387
Contributor Author

@daniel31x13

As far as I could tell, what I added for the OPENAI_NAME variable seems to be something related to the library itself, concerning compatibility. I'm not entirely sure what it does in the end because the docs aren't all that helpful there.

https://ai-sdk.dev/providers/ai-sdk-providers/openai

Setting the compatibility mode dynamically based on whether the base URL has changed is likely to work fine, I'd think, given what the docs do describe.

The project and organization parameters are entirely optional; I added them while I was there because they can be useful even with the official API, where they get used for more fine-grained billing and usage logging if desired.
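To show what I mean by "optional", here's a rough sketch of how those fields could be threaded from the environment into the provider settings (the helper and the exact shape are assumptions for illustration, not the PR's actual code):

```typescript
// Sketch: build provider settings from optional env vars. Missing
// organization/project simply stay undefined and are omitted from requests.
interface ProviderSettings {
  apiKey: string;
  baseURL?: string;       // only set when using a third-party server
  organization?: string;  // optional: billing attribution
  project?: string;       // optional: per-project usage stats
}

function settingsFromEnv(env: Record<string, string | undefined>): ProviderSettings {
  return {
    apiKey: env.OPENAI_API_KEY ?? "",
    baseURL: env.OPENAI_BASE_URL,
    organization: env.OPENAI_ORGANIZATION,
    project: env.OPENAI_PROJECT,
  };
}
```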

@daniel31x13
Member

I already used the @ai-sdk/openai-compatible package instead which now directly handles everything. Thanks for your help!
