🐛 Describe the bug
In PyTorch <2.2.0, SDPA (`scaled_dot_product_attention`) supports Flash Attention v1 on Windows. In PyTorch >=2.2.0, it does not support any Flash Attention backend on Windows.
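A minimal sketch to check this (hypothetical repro, not from the original report; run on the affected Windows machine). Flash Attention is a CUDA backend, so the flash path is only exercised on a CUDA device; on CPU the call falls back to the math implementation:

```python
import torch
import torch.nn.functional as F

q = torch.randn(1, 8, 16, 64)
k = torch.randn(1, 8, 16, 64)
v = torch.randn(1, 8, 16, 64)

# Baseline call succeeds everywhere via the math backend.
out = F.scaled_dot_product_attention(q, k, v)
print(out.shape)  # torch.Size([1, 8, 16, 64])

if torch.cuda.is_available():
    qc, kc, vc = (t.to("cuda", torch.float16) for t in (q, k, v))
    # Restrict SDPA to the Flash Attention backend only. On PyTorch >=2.2.0
    # on Windows this is expected to raise, since no flash kernel is built.
    with torch.backends.cuda.sdp_kernel(
        enable_flash=True, enable_math=False, enable_mem_efficient=False
    ):
        out_c = F.scaled_dot_product_attention(qc, kc, vc)
        print(out_c.shape)
```

On a Windows build where the regression applies, the CUDA branch fails with a runtime error stating that no available kernel matches the enabled backends.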
Versions
This is a report of a regression between 2.1.2 and 2.2.0 (and later).
cc @peterjc123 @mszhanyi @skyline75489 @nbcsm @vladimir-aubrecht @iremyux @Blackhex @cristianPanaite