2.2.0+ regresses SDPA performance on Windows · Issue #125070 · pytorch/pytorch · GitHub
2.2.0+ regresses SDPA performance on Windows  #125070
Open
@Xemorr

Description


🐛 Describe the bug

In PyTorch <2.2.0, SDPA (scaled_dot_product_attention) supports Flash Attention v1 on Windows. In PyTorch >=2.2.0, it does not support any Flash Attention backend on Windows.
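
For reference, a minimal sketch of how one might verify which SDPA backend a given build can dispatch to. This assumes a CUDA-capable Windows machine and fp16 inputs, neither of which is stated in the report; it uses the torch.backends.cuda.sdp_kernel context manager to restrict dispatch to the flash backend only.

```python
import torch
import torch.nn.functional as F

# Report whether this build exposes a Flash Attention SDPA kernel at all.
print("flash_sdp_enabled:", torch.backends.cuda.flash_sdp_enabled())

# Shapes and dtype chosen for illustration only.
q = torch.randn(1, 8, 1024, 64, device="cuda", dtype=torch.float16)
k = torch.randn_like(q)
v = torch.randn_like(q)

# Restrict dispatch to the flash backend; if no flash kernel was compiled
# into the build (as reported for 2.2.0+ Windows wheels), this raises.
with torch.backends.cuda.sdp_kernel(
    enable_flash=True, enable_math=False, enable_mem_efficient=False
):
    try:
        out = F.scaled_dot_product_attention(q, k, v)
        print("Flash Attention kernel ran:", out.shape)
    except RuntimeError as err:
        print("Flash Attention unavailable:", err)
```

Running this on 2.1.2 versus 2.2.0+ on the same Windows machine should make the regression visible: the first prints an output shape, the second falls into the error branch.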

Versions

This is a report of a regression between 2.1.2 and 2.2.0+

cc @peterjc123 @mszhanyi @skyline75489 @nbcsm @vladimir-aubrecht @iremyux @Blackhex @cristianPanaite

Metadata

Assignees

Labels

module: sdpa (all things related to torch.nn.functional.scaled_dot_product_attention)
module: windows (Windows support for PyTorch)
triaged (this issue has been looked at by a team member, and triaged and prioritized into an appropriate module)

Type

No type

Projects

Status

Blocked

Milestone

No milestone

Relationships

None yet

Development

No branches or pull requests
