Model finetuning · Issue #477 · pytorch/opacus · GitHub

Model finetuning #477


Closed
timudk opened this issue Aug 24, 2022 · 2 comments
Labels
good first issue Good for newcomers

Comments

@timudk
timudk commented Aug 24, 2022

🚀 Feature

It would be nice to be able to fine-tune only a subset of the model parameters with Opacus. Currently this is not possible, because there is a check that the module's parameters exactly match the optimizer's parameters (the `# compare module parameter with optimizer parameters` check in the source).
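A minimal sketch of the use case being described, in plain PyTorch (the model and layer choices are hypothetical, not from Opacus itself): freezing part of a model and building the optimizer over only the trainable subset. A check that compares *all* module parameters against the optimizer's parameters would reject this setup, even though it is a standard fine-tuning pattern.

```python
import torch
import torch.nn as nn

# Hypothetical fine-tuning setup: freeze the first layer, train the rest.
model = nn.Sequential(nn.Linear(8, 8), nn.ReLU(), nn.Linear(8, 2))
for p in model[0].parameters():
    p.requires_grad = False

# The optimizer only sees the trainable subset of parameters.
trainable = [p for p in model.parameters() if p.requires_grad]
optimizer = torch.optim.SGD(trainable, lr=0.1)

# A strict check comparing ALL module parameters against the optimizer's
# parameters fails here: the frozen layer's parameters are in the module
# but deliberately absent from the optimizer.
module_params = set(id(p) for p in model.parameters())
optim_params = set(id(p) for g in optimizer.param_groups for p in g["params"])
print(module_params == optim_params)  # → False
```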

@timudk
Author
timudk commented Aug 26, 2022

I am actually curious what the purpose of this check is in the first place.

@karthikprasad karthikprasad added the good first issue Good for newcomers label Aug 29, 2022
@karthikprasad
Contributor

Hi @timudk, the check was motivated by the issue described in #432.
Your use case makes sense, though. I think checking that the *trainable* parameters, rather than all parameters, match between the model and the optimizer would address this issue.
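The relaxed check being proposed could be sketched like this (a hypothetical helper, not Opacus's actual implementation): compare only the parameters with `requires_grad=True` against the optimizer's parameters, so that frozen layers no longer trip the check.

```python
import torch
import torch.nn as nn

def trainable_params_match(module, optimizer):
    """Hypothetical relaxed check: compare only trainable parameters."""
    trainable = set(id(p) for p in module.parameters() if p.requires_grad)
    optim = set(id(p) for g in optimizer.param_groups for p in g["params"])
    return trainable == optim

# Freeze the first layer, then build the optimizer over the trainable subset.
model = nn.Sequential(nn.Linear(4, 4), nn.Linear(4, 2))
for p in model[0].parameters():
    p.requires_grad = False
optimizer = torch.optim.SGD(
    [p for p in model.parameters() if p.requires_grad], lr=0.1
)

print(trainable_params_match(model, optimizer))  # → True
```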
