[SFT VLM] Added support for Molmo models via standalone script sft_vlm_molmo
#2236
Conversation
The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.
Hi @sergiopaniego, thanks for implementing this. Could you run
We are discussing internally how feasible it is to harmonize this script with the other VLM training scripts; I will let you know when we have a conclusion.
Updated! Any updates on the harmonization discussion? I'm happy to make any modifications needed! 😊
@sergiopaniego so is this working in theory? It's also OOM'ing for me: it needs 50 GB and my A100 only has around 40 GB. Is there a lever I can pull to decrease the memory? Why does it need so much considering it is doing LoRA? Is it possible to set this up to train on multiple GPUs?
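For context on the memory question, a rough back-of-the-envelope estimate may help. This is an illustrative sketch only (the ~7B parameter count is an assumption about the Molmo checkpoint, not taken from the script): even with LoRA, the frozen base weights must still sit in GPU memory in full precision, which is why 4-bit quantization (QLoRA-style loading) is the usual lever for fitting on a 40 GB card.

```python
def model_memory_gb(n_params: float, bytes_per_param: float) -> float:
    """Approximate GPU memory needed just to hold the base weights.

    Ignores activations, gradients, and optimizer state, which add more
    on top (especially with long vision token sequences).
    """
    return n_params * bytes_per_param / 1024**3


n = 7e9  # assumed ~7B parameters (illustrative figure for a Molmo-7B-class model)

bf16_gb = model_memory_gb(n, 2)    # bf16/fp16: 2 bytes per weight
int4_gb = model_memory_gb(n, 0.5)  # 4-bit quantized: ~0.5 bytes per weight

print(f"bf16 weights: ~{bf16_gb:.0f} GB, 4-bit weights: ~{int4_gb:.0f} GB")
```

So the bf16 weights alone are on the order of 13 GB, and the remaining headroom is quickly consumed by activations for image tokens; quantizing the frozen base to 4-bit while training only the LoRA adapters is the standard way to shrink this.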
Sorry for the late response @mshuffett. It still needs some polishing. While testing it, it seems like something is still missing from the artifacts for the model shared. You can see more details about it in the README. For example, since the
In case anybody is looking for an updated script, I have some resources 😄 Since the transformers PR is close to being merged, these are the resources:
There has been no activity on this branch for several months, so I'm closing this PR. Please feel free to open a new PR if there is new activity.
What does this PR do?
Fixes #2136.
This PR adds a standalone script that supports Molmo models. It may benefit from being generalized for compatibility with sft_vlm.py.
This notebook has a reproducible version, both running the script and using the code directly.
Before submitting
- Did you read the contributor guideline, Pull Request section?
- Was this discussed/approved via a GitHub issue? Please add a link to it if that's the case.
- Did you make sure to update the documentation with your changes? Here are the documentation guidelines.
Who can review?
@lewtun @edbeeching @qgallouedec