
BFloat16 on cuda: add triu/tril support #101932

@Maykeye

Description

🚀 The feature, motivation and pitch

Right now, if you try to use torch.triu on a bfloat16 CUDA tensor (I hit it while training a simple network with AMP), you'll get an error that triu is not supported:

In [8]: torch.__version__
Out[8]: '2.0.1+cu117'

In [9]: torch.arange(4).reshape(2,2).bfloat16().cuda().triu()
---------------------------------------------------------------------------
RuntimeError                              Traceback (most recent call last)
Cell In[9], line 1
----> 1 torch.arange(4).reshape(2,2).bfloat16().cuda().triu()

RuntimeError: "triu_tril_cuda_template" not implemented for 'BFloat16'

It would be nice to have it.

Alternatives

It can be replaced with multiplication by a precomputed mask built with torch.ones(n, n).triu():

   self.register_buffer("triu0", torch.ones(n, n).triu()) # __init__
   ...
   y = x * self.triu0 # forward()
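
For reference, here is a minimal self-contained sketch of that mask workaround (the module name TriuMask and the size n are illustrative; the mask is cast to the input dtype so the result stays bfloat16):

    import torch
    import torch.nn as nn

    class TriuMask(nn.Module):
        """Upper-triangular masking by multiplication; avoids the
        unimplemented bfloat16 triu/tril CUDA kernel."""
        def __init__(self, n):
            super().__init__()
            # Precompute the mask once; as a buffer it follows .cuda()/.to().
            self.register_buffer("triu0", torch.ones(n, n).triu())

        def forward(self, x):
            # Cast the mask to x's dtype so the output stays bfloat16,
            # then zero the lower triangle, matching x.triu() for (n, n) inputs.
            return x * self.triu0.to(x.dtype)

    m = TriuMask(2).cuda()
    x = torch.arange(4).reshape(2, 2).bfloat16().cuda()
    y = m(x)  # same values as x.triu(), but works for bfloat16 on CUDA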
   

Additional context

No response
