Why ERF is using relu to cut off negative grad? · Issue #62 · DingXiaoH/RepLKNet-pytorch · GitHub
Why ERF is using relu to cut off negative grad? #62
Open
@hachreak

Description

Dear @DingXiaoH ,
thank you very much for your contribution and for publishing the code online.
It is also really helpful for my research.
I have been trying to use your code to generate the ERF image.

I have a question about the implementation:
Why is a ReLU applied inside the get_input_grad function?
Do the negative gradients not contribute to the final ERF image?
Would it be more correct to consider the absolute values of all gradients instead?
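
For context, here is a minimal sketch of the kind of gradient aggregation I mean (the function and variable names are placeholders, not the exact code from this repo): one branch cuts off negative input gradients with ReLU, the other keeps their absolute values.

```python
import torch
import torch.nn.functional as F

def get_input_grad_sketch(model, samples, use_abs=False):
    """Aggregate input gradients of the central output activation into a 2-D map."""
    samples.requires_grad_(True)
    outputs = model(samples)                        # assumed (N, C, H, W) feature map
    h, w = outputs.shape[2], outputs.shape[3]
    central = outputs[:, :, h // 2, w // 2].sum()   # scalar: sum of central activations
    grad = torch.autograd.grad(central, samples)[0]
    if use_abs:
        grad = grad.abs()       # keep the magnitude of negative contributions too
    else:
        grad = F.relu(grad)     # cut off negative gradients (the behaviour I am asking about)
    # Sum over batch and channel dimensions to get a 2-D contribution map for the ERF plot.
    return grad.sum(dim=(0, 1)).cpu().numpy()
```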

Thanks a lot for your reply! 😄
Leo
