
Conversation

@chaunceyjiang (Collaborator) commented Apr 22, 2025

MinIO is an object storage service compatible with the S3 protocol.
It is commonly used by setting the environment variables AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, and AWS_ENDPOINT_URL.

For example: https://github.com/run-ai/runai-model-streamer/blob/master/cpp/s3/s3.h#L31

Add S3 environment variables for better support of MinIO.
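
For illustration only (a rough sketch, not the actual diff in this PR): MinIO reuses the standard AWS variable names, so the three variables above are enough to point an S3 client at a MinIO endpoint. The bucket and prefix below are purely illustrative.

# Hypothetical sketch: wire the standard AWS environment variables into a
# boto3 S3 client. MinIO accepts the same credentials and endpoint URL.
import os
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url=os.environ.get("AWS_ENDPOINT_URL"),  # e.g. http://minio:9000
    aws_access_key_id=os.environ.get("AWS_ACCESS_KEY_ID"),
    aws_secret_access_key=os.environ.get("AWS_SECRET_ACCESS_KEY"),
)
s3.list_objects_v2(Bucket="llm", Prefix="meta-llama/")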

Signed-off-by: chaunceyjiang <chaunceyjiang@gmail.com>

👋 Hi! Thank you for contributing to the vLLM project.

💬 Join our developer Slack at https://slack.vllm.ai to discuss your PR in #pr-reviews, coordinate on features in #feat- channels, or join special interest groups in #sig- channels.

Just a reminder: PRs do not trigger a full CI run by default. Instead, only the fastcheck CI runs, covering a small and essential subset of CI tests to quickly catch errors. You can run other CI tests on top of those by going to your fastcheck build in the Buildkite UI (linked in the PR checks section) and unblocking them. If you do not have permission to unblock, ping simon-mo or khluu to add you to our Buildkite org.

Once the PR is approved and ready to go, your PR reviewer(s) can run CI to test the changes comprehensively before merging.

To run full CI, PR reviewers can either add the ready label to the PR or enable auto-merge.

🚀

@chaunceyjiang (Collaborator, Author)

test

# export AWS_ENDPOINT_URL=http://10.20.2.192:9000
# export AWS_SECRET_ACCESS_KEY=xxxxx
# export AWS_ACCESS_KEY_ID=xx

# vllm serve s3://llm/meta-llama/Meta-Llama-3-8B-Instruct --load-format runai_streamer
INFO 04-22 11:13:25 [__init__.py:239] Automatically detected platform cuda
....

@chaunceyjiang (Collaborator, Author)

/cc @DarkLight1337 PTAL.

@DarkLight1337 (Member)

Is this just a code convention? Or is there official documentation for this?

@DarkLight1337 (Member) left a comment

In any case this is reasonable

@DarkLight1337 DarkLight1337 enabled auto-merge (squash) April 22, 2025 12:42
@github-actions github-actions bot added the ready label (ONLY add when PR is ready to merge/full CI is needed) Apr 22, 2025
@DarkLight1337 DarkLight1337 merged commit 68d4c33 into vllm-project:main Apr 22, 2025
59 of 62 checks passed
@vrdn-23 (Contributor) commented Apr 22, 2025

I'm a little confused about why this change is necessary. From my understanding, boto3 should pick up all of these variables automatically without us having to do this.

Is there any reason why this doesn't work as-is?
cc @chaunceyjiang @DarkLight1337

@chaunceyjiang (Collaborator, Author)

@vrdn-23 Indeed.

I double-checked the documentation at https://boto3.amazonaws.com/v1/documentation/api/latest/guide/configuration.html, and you're right—Boto3 does support those environment variables by default.

I had always assumed it would only read from ~/.aws/config and wouldn't read the environment variables by default. 😂
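
For reference, this can be checked with a client that is given no explicit credentials at all (assuming the three variables are exported; AWS_ENDPOINT_URL is only honored by reasonably recent botocore releases):

# boto3 resolves the access key, secret key, and (on recent botocore
# versions) the endpoint URL from the environment without any arguments.
import boto3

s3 = boto3.client("s3")
print([b["Name"] for b in s3.list_buckets()["Buckets"]])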

@chaunceyjiang (Collaborator, Author)

Is there any reason why this doesn't work as-is?

Yesterday, I set up a MinIO instance myself to test #16926. Then I modified ~/.aws/config for testing.

The minio-py client I commonly use supports those environment variables by default, so in vLLM I explicitly added the variables as well. 😂

@chaunceyjiang (Collaborator, Author)

@DarkLight1337 @vrdn-23 For this PR, should we keep the current behavior—explicitly reading the environment variables—or should we revert to the previous behavior—implicitly reading the variables?

@vrdn-23 (Contributor) commented Apr 23, 2025

I would err on the side of reverting, just because I believe boto3 has its own order for resolving AWS credentials (it checks parameters passed explicitly to the client first, then environment variables, then the shared credentials/config files, and so on). Explicitly passing the environment variables through the client-side arguments will probably lead to unintended consequences in the future if that order ever changes.
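
A small sketch of that concern (illustrative only; both clients use boto3's public parameters): explicitly passed arguments sit at the top of boto3's credential chain, so forwarding the environment variables as arguments pins today's precedence into the code.

import os
import boto3

# Follows boto3's own resolution chain (explicit args, env vars,
# ~/.aws/credentials, ~/.aws/config, IAM role, ...) as boto3 defines it.
implicit = boto3.client("s3")

# Bypasses the rest of the chain: explicit arguments take precedence, so this
# freezes whatever happens to be in the environment at construction time.
explicit = boto3.client(
    "s3",
    aws_access_key_id=os.environ.get("AWS_ACCESS_KEY_ID"),
    aws_secret_access_key=os.environ.get("AWS_SECRET_ACCESS_KEY"),
)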

@DarkLight1337 (Member)

Let's just revert this then

@chaunceyjiang (Collaborator, Author)

I would err on the side of reverting, just because I believe boto3 has its own order for resolving AWS credentials (it checks parameters passed explicitly to the client first, then environment variables, then the shared credentials/config files, and so on). Explicitly passing the environment variables through the client-side arguments will probably lead to unintended consequences in the future if that order ever changes.

ok.

frieda-huang pushed a commit to frieda-huang/vllm that referenced this pull request Apr 23, 2025
jikunshang pushed a commit to jikunshang/vllm that referenced this pull request Apr 29, 2025
lk-chen pushed a commit to lk-chen/vllm that referenced this pull request Apr 29, 2025
adobrzyn pushed a commit to HabanaAI/vllm-fork that referenced this pull request Apr 30, 2025
RichardoMrMu pushed a commit to RichardoMrMu/vllm that referenced this pull request May 12, 2025
minpeter pushed a commit to minpeter/vllm that referenced this pull request Jun 24, 2025