Allow batch size 1 by default #241

@Dolyfin

Description

Is your feature request related to a problem? Please describe.
Fine-tuning on an RTX 3060 12GB with 20 s clip length. Training only fits in VRAM with a batch size of 1. I'm using gradient accumulation of 16 to counteract this, and it now fits at 10.9 GB of VRAM usage.

Describe the solution you'd like
Lower the minimum allowed batch size value in finetune.py to 1; a sketch of the change is below.
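
A minimal sketch of what that change could look like, assuming finetune.py exposes batch size through a Gradio slider (the widget name and the other values here are illustrative assumptions, not the project's actual code):

```python
import gradio as gr

# Hypothetical slider definition; the point is only the `minimum` value.
batch_size = gr.Slider(
    minimum=1,    # previously a higher floor; 1 lets low-VRAM cards fine-tune
    maximum=512,
    step=1,
    value=4,
    label="Batch size",
    info="With a small batch size (e.g. 1), increase gradient accumulation "
         "so the effective batch size stays large enough for stable training.",
)
```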

Additional context
I'd rather not have to edit the values after every install/update, and this would help more people with limited VRAM fine-tune. It would also be worth adding a note in the 'info - batch size' text to turn up gradient accumulation when the batch size is low, for optimal training.
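
For reference, the relationship that info text would describe is the standard one between micro-batch size and gradient accumulation (numbers taken from the description above; this is illustrative, not code from finetune.py):

```python
batch_size = 1          # largest micro-batch that fits in 12 GB VRAM at 20 s clips
grad_accum_steps = 16   # accumulate gradients over 16 micro-batches per optimizer step
effective_batch_size = batch_size * grad_accum_steps  # -> 16 samples per weight update
```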
