[feature request] support double backward for cudnn RNNs #5261

Description

@ssnl

Currently, only the pure-autograd RNN implementation supports double backwards. We should make the cuDNN RNN path support it as well.

cc @ezyang @ssnl @albanD @zou3519 @gqchen @csarofeen @ptrblck
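
For context, a minimal sketch of the kind of double-backward call this is about. The shapes, sizes, and the particular loss used here are illustrative assumptions, not part of the issue: with torch.backends.cudnn.enabled = False the pure-autograd RNN kernels handle the second derivative, while the cuDNN path is what this request asks to support.

```python
import torch
import torch.nn as nn

def double_backward_demo(use_cudnn: bool) -> None:
    # Toggle between the cuDNN kernels and the pure-autograd RNN implementation.
    torch.backends.cudnn.enabled = use_cudnn

    rnn = nn.LSTM(input_size=4, hidden_size=8).cuda()
    x = torch.randn(5, 3, 4, device="cuda", requires_grad=True)

    out, _ = rnn(x)
    loss = out.sum()

    # First backward, keeping the graph so the gradient itself can be differentiated.
    (grad_x,) = torch.autograd.grad(loss, x, create_graph=True)

    # Second backward: differentiating a function of grad_x needs double backward
    # support in the underlying RNN kernel.
    grad_x.pow(2).sum().backward()

if __name__ == "__main__":
    double_backward_demo(use_cudnn=False)  # autograd RNN path: double backward works
    double_backward_demo(use_cudnn=True)   # cuDNN path: what this issue asks to support
```
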

Labels

actionable
enhancement (Not as big of a feature, but technically not a bug. Should be easy to fix)
fixathon
module: autograd (Related to torch.autograd, and the autograd engine in general)
module: cudnn (Related to torch.backends.cudnn, and CuDNN support)
module: double backwards (Problem is related to the double backwards definition on an operator)
module: rnn (Issues related to RNN support (LSTM, GRU, etc.))
triaged (This issue has been looked at by a team member, and triaged and prioritized into an appropriate module)
