Status: Open
Labels: enhancement, help wanted (issues suitable for, and inviting external contributions)
Issue Description
Better support for mixed precision training would be extremely helpful, at least for SVI. I can manually cast data to float16 or bfloat16, but I am unable to leverage PyTorch's automatic mixed precision (AMP) training, because AMP requires using the GradScaler class in the optimization loop to scale gradients in a mixed-precision-aware manner. See the documentation for more info: https://pytorch.org/docs/stable/amp.html
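To illustrate why gradient scaling matters in the first place, here is a minimal NumPy sketch (not Pyro or PyTorch code; the `2**16` scale factor is an arbitrary illustrative choice, whereas GradScaler adjusts its scale dynamically):

```python
import numpy as np

# A small gradient value underflows to zero in float16...
tiny_grad = 1e-8
assert np.float16(tiny_grad) == 0.0

# ...but scaling the loss (and hence the gradients) by a large
# factor keeps it representable in float16.
scale = 2.0 ** 16
scaled = np.float16(tiny_grad * scale)
assert scaled != 0.0

# Unscaling in float32 before the optimizer step recovers the
# gradient to good accuracy -- this is what GradScaler automates.
recovered = np.float32(scaled) / scale
assert abs(recovered - tiny_grad) / tiny_grad < 0.01
```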
It would be nice to have support for using this class within Pyro's optimizers to enable AMP.
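For reference, the loop that would need to be accommodated is the standard AMP recipe from the linked docs. A minimal sketch in plain PyTorch (a CPU/bfloat16 variant so it runs without a GPU; on CUDA one would pass `device_type="cuda"`, `dtype=torch.float16`, and construct the GradScaler with `enabled=True`):

```python
import torch

model = torch.nn.Linear(4, 1)
opt = torch.optim.SGD(model.parameters(), lr=0.1)
# bfloat16 on CPU doesn't need loss scaling, so the scaler is disabled
# here; the loop structure is identical to the CUDA float16 case.
scaler = torch.cuda.amp.GradScaler(enabled=False)
x, y = torch.randn(8, 4), torch.randn(8, 1)

for _ in range(3):
    opt.zero_grad()
    with torch.autocast(device_type="cpu", dtype=torch.bfloat16):
        loss = torch.nn.functional.mse_loss(model(x), y)
    scaler.scale(loss).backward()  # scale the loss before backward
    scaler.step(opt)               # unscales grads; skips step on inf/NaN
    scaler.update()                # adjusts the scale factor
```

The friction point is that Pyro's optimizer wrappers call the underlying optimizer's `step()` internally during SVI, so there is currently no place to interpose `scaler.step(optimizer)` and `scaler.update()` as the AMP recipe requires.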