jp6/cu128/: flash-attn versions


Because this project isn't in the mirror_whitelist, no releases from root/pypi are included.

The latest version on this index is 2.7.4.post1.

Flash Attention: Fast and Memory-Efficient Exact Attention

Index        Version        Documentation
jp6/cu128    2.7.4.post1
jp6/cu128    2.7.2.post1
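
As a quick sanity check after installing one of the wheels listed above (for example with pip and this index configured as an extra index URL; the exact URL is not shown on this page), the following minimal Python sketch reports the installed version and runs a small attention call through flash_attn_func, the public entry point of flash-attn 2.x. It assumes a CUDA-capable device and an already installed flash-attn wheel.

    # Minimal sanity check for a flash-attn wheel installed from this index.
    # Assumes flash-attn is already installed and a CUDA GPU is available.
    from importlib.metadata import version

    import torch
    from flash_attn import flash_attn_func

    print("flash-attn version:", version("flash-attn"))  # e.g. 2.7.4.post1

    # Tiny half-precision attention call; tensors are (batch, seqlen, nheads, headdim).
    q = torch.randn(1, 128, 8, 64, dtype=torch.float16, device="cuda")
    k = torch.randn(1, 128, 8, 64, dtype=torch.float16, device="cuda")
    v = torch.randn(1, 128, 8, 64, dtype=torch.float16, device="cuda")

    out = flash_attn_func(q, k, v, causal=True)
    print("output shape:", tuple(out.shape))  # (1, 128, 8, 64)

If the import succeeds and the call returns a tensor of the same shape as q, the wheel matches the installed CUDA/JetPack stack.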