make sure amax for numerical stability is detached
fix axis for torch max used for numerical stability when deriving key prime
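
The two commits above both touch the numerical stabilization of the FAVOR+ softmax kernel: the max subtracted before exponentiation should be detached, since it exists only to keep `exp()` from overflowing and no gradient should flow through it, and for keys that max must be taken over the right axes. A minimal sketch of the idea, with a hypothetical function name rather than the repo's exact kernel code:

```python
import torch

def softmax_kernel_sketch(data, projection_matrix, is_query, eps=1e-4):
    # random-feature projection: (..., n, d) x (m, d) -> (..., n, m)
    data_dash = torch.einsum('...nd,md->...nm', data, projection_matrix)

    # exp(-|x|^2 / 2) term of the positive softmax kernel estimator
    diag_data = data.pow(2).sum(dim=-1, keepdim=True) / 2

    if is_query:
        # queries: stabilize each position independently (max over features)
        amax = data_dash.amax(dim=-1, keepdim=True)
    else:
        # keys: stabilize over the sequence axis as well -- the axis fix
        amax = data_dash.amax(dim=(-2, -1), keepdim=True)

    # detach so the max acts as a constant shift in the backward pass;
    # it is only there for numerical stability
    amax = amax.detach()

    return torch.exp(data_dash - diag_data - amax) + eps
```

The subtracted max only rescales the resulting q′/k′ features by a positive constant, which cancels in the attention normalization, so the forward result is unchanged while `exp()` stays in range.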
update example
remove pytorch-fast-transformers from dependencies
add a handy method for ProjectionUpdater
expose projection update logic for easy use in the alphafold2 project
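
These two commits factor the feature-redraw logic out of the model into a reusable wrapper, so downstream projects (the alphafold2 repo in particular) can control when the random projection matrices are resampled. A hedged usage sketch, assuming the `ProjectionUpdater` name from the commit message together with a `feature_redraw_interval` argument and a `redraw_projections` method; treat the exact signatures as assumptions:

```python
import torch
from performer_pytorch import Performer, ProjectionUpdater

model = Performer(dim=512, depth=6, heads=8, causal=False)

# wrap the model; the updater owns the schedule on which the random
# feature projection matrices get resampled
proj_updater = ProjectionUpdater(model, feature_redraw_interval=1000)

x = torch.randn(1, 1024, 512)
out = model(x)  # (1, 1024, 512)

# the exposed method: force a redraw on demand, e.g. from an
# external training loop such as alphafold2's
proj_updater.redraw_projections()
```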
expose cross attention in a ready-to-use Performer for the alphafold2 repo
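
With cross attention exposed, a Performer can attend over an external sequence, which is what the alphafold2 project needs. A sketch assuming the `cross_attend` flag and `context` keyword shown in the repo's README:

```python
import torch
from performer_pytorch import Performer

# interleave cross-attention layers with the usual self-attention layers
model = Performer(dim=512, depth=2, heads=8, causal=False, cross_attend=True)

x = torch.randn(1, 1024, 512)       # main sequence
context = torch.randn(1, 256, 512)  # external sequence to attend over

out = model(x, context=context)     # (1, 1024, 512)
```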
allow users to turn off bias for attention out projections
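
Disabling the output-projection bias should be a one-flag change; `attn_out_bias` is the assumed keyword name here, and under the hood it amounts to constructing the out projection as `nn.Linear(..., bias=False)`:

```python
import torch
from performer_pytorch import SelfAttention

# attn_out_bias=False (assumed kwarg name) drops the bias term on the
# attention output projection
attn = SelfAttention(dim=512, heads=8, causal=False, attn_out_bias=False)

x = torch.randn(1, 1024, 512)
out = attn(x)  # (1, 1024, 512)
```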