Some QOL config/saving improvements by francoishernandez · Pull Request #134 · eole-nlp/eole · GitHub

Some QOL config/saving improvements #134

Merged
francoishernandez merged 1 commit into main from some_qol_config_improvements on Oct 25, 2024

Conversation

francoishernandez (Member)
  1. Previously, the transform artifacts were saved only in the "root" model_path, so they were not accessible by default from within the step_# checkpoints. This PR applies the same logic to the transform artifacts as to all other items: save the file in the step_dir and create a symlink to the latest version in the root dir (see the first sketch after this list).
  2. To prevent validation issues when finetuning a model with an "inference" key in its config.json, this key used to be popped prior to instantiating the TrainConfig, which prevented transparent downstream use of these inference settings. This PR adds an explicit inference field to TrainConfig, so the settings are properly retained throughout. It will also make it possible to switch the validation/inference codepath to the "proper" settings rather than relying on defaults (see the second sketch after this list).
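Below is a minimal sketch of the step_dir/symlink logic described in item 1. The helper name and its arguments are hypothetical, not eole's actual API; only the behaviour (write the artifact under the step_# directory, then refresh a root-level symlink pointing at the latest copy) reflects the PR description.

```python
# Hypothetical sketch, not eole's real code: save a transform artifact inside the
# step checkpoint directory and keep a symlink in the root model_path pointing
# at the latest copy, mirroring how the other checkpoint items are handled.
import os


def save_artifact_with_symlink(artifact_bytes: bytes, model_path: str, step: int, name: str) -> str:
    """Write `name` under <model_path>/step_<step>/ and refresh the root symlink."""
    step_dir = os.path.join(model_path, f"step_{step}")
    os.makedirs(step_dir, exist_ok=True)

    artifact_path = os.path.join(step_dir, name)
    with open(artifact_path, "wb") as f:
        f.write(artifact_bytes)

    # Replace any existing symlink/file in the root dir so it always targets the latest step.
    link_path = os.path.join(model_path, name)
    if os.path.islink(link_path) or os.path.exists(link_path):
        os.remove(link_path)
    os.symlink(os.path.relpath(artifact_path, model_path), link_path)
    return artifact_path
```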

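And a sketch for item 2: instead of popping the "inference" key before validation, declare it as an explicit optional field so it survives TrainConfig instantiation. A pydantic-style model is assumed here for illustration; the field names inside InferenceConfig are made up and TrainConfig is reduced to a stub.

```python
# Illustrative sketch only; apart from the "inference" field itself, the class
# contents below are assumptions, not eole's real config definitions.
from typing import Optional
from pydantic import BaseModel


class InferenceConfig(BaseModel):
    # Hypothetical subset of the inference settings carried in config.json.
    beam_size: int = 5
    max_length: int = 256


class TrainConfig(BaseModel):
    model_path: str
    # Explicit field: the "inference" section of config.json is retained
    # through validation instead of being popped beforehand.
    inference: Optional[InferenceConfig] = None


# The key loaded from config.json now validates and stays accessible downstream.
cfg = TrainConfig(model_path="my_model", inference={"beam_size": 4})
assert cfg.inference is not None and cfg.inference.beam_size == 4
```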
francoishernandez merged commit 0ec1088 into main on Oct 25, 2024
4 checks passed
francoishernandez added a commit that referenced this pull request on Dec 4, 2024
francoishernandez deleted the some_qol_config_improvements branch on February 7, 2025 at 08:57