AMD LayerNorm Seg Fault in PyTorch #437

@xw285cornell

Description

❓ The question

Hi, I'm from the PyTorch team, and I recently became aware that layer norm needed some customization here because it would segfault without a bias (see `class AMDLayerNorm(LayerNormBase):`). I wonder if this has already been resolved? I'm running the repro below on current PyTorch, and it seems to run just fine:

import torch

# Repro: LayerNorm with bias=False on a ROCm build should not segfault.
assert torch.version.hip is not None  # require a ROCm/HIP build of PyTorch

input = torch.randn(10, 10, 10).cuda()
ln = torch.nn.LayerNorm([10, 10], bias=False).cuda()
ln(input).sum().backward()
print(ln.weight.grad)
assert ln.bias is None  # bias=False means no bias parameter is created
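
For context, here is a minimal sketch of the kind of workaround a custom layer norm class might use for this, assuming it simply materializes an all-zero bias so the backend kernel never receives None. The class name and approach are illustrative guesses, not the actual AMDLayerNorm implementation linked above:

import torch
import torch.nn as nn
import torch.nn.functional as F

class ZeroBiasLayerNorm(nn.LayerNorm):
    # Hypothetical workaround sketch (not the actual AMDLayerNorm):
    # substitute an all-zero bias so the kernel never sees bias=None.
    def forward(self, x: torch.Tensor) -> torch.Tensor:
        bias = self.bias
        if bias is None and self.weight is not None:
            bias = torch.zeros_like(self.weight)
        return F.layer_norm(x, self.normalized_shape, self.weight, bias, self.eps)

If the repro above no longer segfaults on current PyTorch/ROCm, a workaround like this would presumably be obsolete.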
