Conversation

@eb8680 eb8680 commented Dec 20, 2022

Resolves #3166

PyTorch recently renamed the base class of the learning rate schedulers in torch.optim from _LRScheduler to LRScheduler. pyro.optim uses that class to identify and wrap schedulers, so the rename caused wrapper creation to fail silently.

This PR adds a backwards-compatible check to the wrapping logic to handle newer PyTorch releases.
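The PR's actual diff isn't shown in this thread; a minimal sketch of such a backwards-compatible lookup, using stand-in modules rather than real PyTorch releases (the module objects below are illustrative, not torch itself), might look like:

```python
import types

# Stand-in for torch.optim.lr_scheduler on an older PyTorch release,
# where only the private name `_LRScheduler` exists.
old_module = types.SimpleNamespace(_LRScheduler=type("_LRScheduler", (), {}))

# Stand-in for a newer release, where the public `LRScheduler` name exists.
new_module = types.SimpleNamespace(LRScheduler=type("LRScheduler", (), {}))

def scheduler_base(mod):
    # Prefer the new public name; fall back to the old private one,
    # so the same check works across PyTorch versions.
    return getattr(mod, "LRScheduler", None) or getattr(mod, "_LRScheduler")

print(scheduler_base(old_module).__name__)  # _LRScheduler
print(scheduler_base(new_module).__name__)  # LRScheduler
```

The `getattr`-with-fallback pattern keeps a single code path working on both sides of the rename instead of pinning a PyTorch version.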

fritzo previously approved these changes Dec 20, 2022
Thanks for fixing this!

fritzo commented Dec 20, 2022

Could you run `make format` and try again? It looks like black wants to simplify the code now that one line is shorter.

@fritzo fritzo merged commit 19e32df into dev Jan 8, 2023

simonangerbauer commented Mar 15, 2023

Hi @fritzo,
any idea when this will be released?
PyTorch 2.0.0 is out, so this change is now needed to keep the LRScheduler wrappers working.

Successfully merging this pull request may close these issues.

bug with OneCycleLR on Apple Silicone