
Releases: Dao-AILab/flash-attention

v2.8.3 (14 Aug 17:12)
Bump to v2.8.3

v2.8.2 (24 Jul 05:45)
Bump to v2.8.2

v2.8.1 (09 Jul 18:34)
Bump to v2.8.1

v2.8.0.post2 (14 Jun 15:40)
[CI] Build with NVCC_THREADS=2 to avoid OOM

v2.8.0.post1 (14 Jun 12:53)
[CI] Compile with ubuntu-22.04 instead of ubuntu-20.04

v2.8.0 (14 Jun 05:39)
Bump to v2.8.0

v2.7.4.post1 (29 Jan 21:43)
Drop PyTorch 2.1

v2.7.4 (29 Jan 21:34)
Bump to v2.7.4

v2.7.3 (10 Jan 18:01, commit 89c5a7d)
Change version to 2.7.3 (#1437)

Signed-off-by: Kirthi Shankar Sivamani <ksivamani@nvidia.com>

v2.7.2.post1 (07 Dec 18:43)
[CI] Use MAX_JOBS=1 with nvcc 12.3, don't need OLD_GENERATOR_PATH
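The MAX_JOBS and NVCC_THREADS settings mentioned in the CI notes above also apply when building flash-attention from source yourself: MAX_JOBS caps the number of parallel extension-build jobs (read by PyTorch's C++ extension builder), and NVCC_THREADS controls per-compilation nvcc parallelism. A sketch of how one might tune them to trade compile time against memory use (the specific values here are illustrative, not recommendations from the release notes):

```shell
# Fewer parallel jobs -> lower peak RAM during compilation.
# Raise MAX_JOBS on machines with plenty of memory; lower it to avoid OOM.
export MAX_JOBS=4

# Number of threads nvcc may use per compilation unit
# (the CI above uses NVCC_THREADS=2 to avoid OOM).
export NVCC_THREADS=2

# Build and install from PyPI source; --no-build-isolation lets the build
# see the already-installed torch, as the flash-attention README suggests.
pip install flash-attn --no-build-isolation
```

If a build is killed mid-compilation with no clear error, an out-of-memory kill during parallel nvcc invocations is a common cause, and lowering these values is the usual first remedy.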