The mean of fill_up_weights #41

Open
UpCoder opened this issue Jan 22, 2019 · 2 comments

UpCoder commented Jan 22, 2019

Hi, I am reading the code and wondering about the meaning of the function fill_up_weights.
The code is here.
It seems that fill_up_weights is used to initialize the weights of ConvTranspose2d, yet those weights apparently are not updated during training. So why freeze the weights of ConvTranspose2d?
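
For context: as the reply below confirms, fill_up_weights fills the ConvTranspose2d weight with a fixed per-channel bilinear upsampling kernel. A minimal self-contained sketch of such an initializer — the helper name make_bilinear_weights and the kernel/stride/channel values here are illustrative, not taken from the repo:

```python
import math
import torch
import torch.nn as nn

def make_bilinear_weights(kernel_size, channels):
    # Standard bilinear-kernel construction: one identical 2-D
    # triangular kernel per channel (shape: channels x 1 x k x k).
    f = math.ceil(kernel_size / 2)
    c = (2 * f - 1 - f % 2) / (2.0 * f)
    w = torch.zeros(channels, 1, kernel_size, kernel_size)
    for i in range(kernel_size):
        for j in range(kernel_size):
            w[:, 0, i, j] = (1 - abs(i / f - c)) * (1 - abs(j / f - c))
    return w

# Illustrative 2x upsampling: kernel 4, stride 2, padding 1, depthwise
# (groups=channels) so each channel is interpolated independently.
channels = 8
up = nn.ConvTranspose2d(channels, channels, kernel_size=4, stride=2,
                        padding=1, groups=channels, bias=False)
up.weight.data.copy_(make_bilinear_weights(4, channels))
up.weight.requires_grad = False  # frozen: a fixed interpolation filter
```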

fyu (Owner) commented Jan 22, 2019

The code is for bilinear upsampling. In early versions of PyTorch, upsampling was aligned to the corners, which actually causes pixel misalignment. Therefore, I made my own version of bilinear upsampling using "fill_up_weights" and "ConvTranspose2d". This has been resolved in current PyTorch. The PyTorch documentation also has some explanation: https://pytorch.org/docs/stable/nn.html?highlight=bilinear#torch.nn.Upsample.
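
Current PyTorch exposes that alignment choice through the align_corners argument of torch.nn.functional.interpolate (and nn.Upsample). A small sketch showing that the two conventions really produce different results:

```python
import torch
import torch.nn.functional as F

x = torch.arange(16, dtype=torch.float32).reshape(1, 1, 4, 4)

# align_corners=True pins corner pixels (the old behavior);
# align_corners=False treats pixels as unit areas, avoiding the
# half-pixel misalignment described above.
y_old = F.interpolate(x, scale_factor=2, mode='bilinear', align_corners=True)
y_new = F.interpolate(x, scale_factor=2, mode='bilinear', align_corners=False)
print((y_old - y_new).abs().max())  # nonzero: the conventions disagree
```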

UpCoder (Author) commented Jan 22, 2019

So the weight named up.weight is fixed and never learned? At inference time, could the ConvTranspose2d operation be replaced with a resize operation using a bilinear kernel?
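
One way to probe this is to compare a bilinear-initialized depthwise ConvTranspose2d against F.interpolate directly. The snippet below is a sketch under illustrative kernel/stride assumptions, not code from the repo; interior pixels should agree, while borders can differ because the transposed convolution zero-pads where interpolate clamps at the edge:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

channels, k, stride, pad = 8, 4, 2, 1
# 1-D bilinear kernel [0.25, 0.75, 0.75, 0.25] for 2x upsampling,
# outer-producted into a 2-D kernel and copied to every channel.
k1d = torch.tensor([0.25, 0.75, 0.75, 0.25])
w = torch.einsum('i,j->ij', k1d, k1d).expand(channels, 1, k, k).clone()

up = nn.ConvTranspose2d(channels, channels, k, stride=stride, padding=pad,
                        groups=channels, bias=False)
up.weight.data.copy_(w)

x = torch.randn(1, channels, 16, 16)
with torch.no_grad():
    diff = (up(x) - F.interpolate(x, scale_factor=2, mode='bilinear',
                                  align_corners=False)).abs().max()
print(diff)  # interior pixels agree; borders may differ due to padding
```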
