Closed
Description
As reported on slack:
Can somebody explain the behavior of `flip` to me? I would expect that `t.flip([a, b])` is a shorthand for `t.flip(a).flip(b)`, but this is not the case (see below). Does the tuple variant flip along a diagonal or what?
```python
In [12]: t = torch.arange(3*4*5).reshape(1, 3, 4, 5)

In [13]: t.flip([1, 3]).shape
Out[13]: torch.Size([5, 4, 3, 1])

In [14]: t.flip(1).flip(3).shape
Out[14]: torch.Size([1, 3, 4, 5])
```
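For comparison, NumPy's `np.flip` implements the semantics the reporter expects: a tuple of axes is equivalent to chaining single-axis flips, and the shape is always preserved. A minimal sketch of that expected behavior:

```python
import numpy as np

# Same tensor shape as in the report above.
t = np.arange(3 * 4 * 5).reshape(1, 3, 4, 5)

# np.flip with a tuple of axes flips along each listed axis;
# the result matches chained single-axis flips and keeps the shape.
flipped_both = np.flip(t, axis=(1, 3))
flipped_chained = np.flip(np.flip(t, axis=1), axis=3)

assert flipped_both.shape == (1, 3, 4, 5)
assert np.array_equal(flipped_both, flipped_chained)
```

Under these semantics, `t.flip([1, 3]).shape` should equal the input shape `(1, 3, 4, 5)`, not `(5, 4, 3, 1)`.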
ngimel says:
Looks like a bug. FWIW, `flip` on a CUDA tensor does not change the shape. There's some logic about permuting in the CPU `flip` that I don't quite get; maybe it is to blame.
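One supporting observation (an illustration, not a statement from the thread): the buggy output shape `(5, 4, 3, 1)` is exactly the input shape with all dimensions reversed, i.e. what a full transpose produces rather than any flip, which is consistent with the permutation logic being at fault:

```python
import numpy as np

t = np.arange(3 * 4 * 5).reshape(1, 3, 4, 5)

# A no-argument transpose reverses all axes: (1, 3, 4, 5) -> (5, 4, 3, 1),
# matching the shape reported for the buggy CPU t.flip([1, 3]).
assert t.transpose().shape == (5, 4, 3, 1)
```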
https://github.com/pytorch/pytorch/blob/master/aten/src/ATen/native/TensorTransformations.cpp#L33-L54