
Conversation

@borzunov borzunov commented Aug 5, 2023

See the "Running on AMD GPUs" tutorial to learn how to use this branch.

This branch supports bitsandbytes < 0.40.0, so that AMD GPU owners can use the bitsandbytes-rocm ports from bitsandbytes-foundation/bitsandbytes#47

To make this possible, we remove the features that depend on bitsandbytes >= 0.40.0 (NF4 quantization and LoRA adapters). In the case of LoRA adapters, the dependency is indirect: they rely on the peft library, which itself requires bitsandbytes >= 0.40.0.
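The gating described above could be sketched roughly as follows. This is a hypothetical illustration, not code from the branch: the function name `supports_nf4` and the manual version parsing are assumptions made for the example.

```python
def parse_version(version: str) -> tuple:
    # Turn a version string like "0.39.1" into a comparable tuple (0, 39, 1).
    # Strips any non-numeric suffix (e.g. "0.40.0.post1" -> (0, 40, 0)).
    parts = []
    for piece in version.split(".")[:3]:
        digits = "".join(ch for ch in piece if ch.isdigit())
        if not digits:
            break
        parts.append(int(digits))
    return tuple(parts)


def supports_nf4(bnb_version: str) -> bool:
    # NF4 quantization and peft-based LoRA adapters require
    # bitsandbytes >= 0.40.0; on this branch (bitsandbytes < 0.40.0,
    # e.g. a bitsandbytes-rocm port) both features are disabled.
    return parse_version(bnb_version) >= (0, 40, 0)
```

With a ROCm port pinned below 0.40.0, `supports_nf4("0.39.1")` is `False`, so the server would fall back to the quantization modes that older bitsandbytes provides.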

@borzunov borzunov changed the title [don't merge] Branch for AMD GPUs (removes LoRA adapter support) [don't merge] Branch for AMD GPUs (with older bitsandbytes) Aug 5, 2023
@PaulConyngham commented
would be awesome to see this integrated


borzunov commented Aug 6, 2023

Hi @PaulConyngham,

You already can run a server on AMD GPUs by following this tutorial: https://github.com/bigscience-workshop/petals/wiki/Running-on-AMD-GPU

We hope someone ports a recent bitsandbytes version to AMD/ROCm soon, at which point this branch won't be needed anymore (you'd be able to run the main Petals version there instead).

@borzunov commented

This branch is not needed anymore: the Petals main branch now runs on AMD GPUs (even with NF4): https://github.com/bigscience-workshop/petals/wiki/Running-on-AMD-GPU

This is thanks to this ROCm-compatible bitsandbytes fork: https://github.com/arlo-phoenix/bitsandbytes-rocm-5.6

@borzunov borzunov closed this Aug 10, 2023