333 add torch compile uni test by fromm-m · Pull Request #344 · Modalities/modalities · GitHub

333 add torch compile uni test #344


Merged
merged 6 commits into from
Apr 14, 2025

Conversation

fromm-m
Member
@fromm-m fromm-m commented Apr 10, 2025

What does this PR do?

This PR adds a test for torch.compile and slightly adjusts get_compiled_model to skip module classes that are not found in the model.

General Changes

  • ..

Breaking Changes

  • ..

Checklist before submitting final PR

  • My PR is minimal and addresses one issue in isolation
  • I have merged the latest version of the target branch into this feature branch
  • I have reviewed my own code w.r.t. correct implementation, missing type hints, proper documentation, etc.
  • I have run a sample config for model training
  • I have checked that all tests run through (python tests/tests.py)
  • I have updated the internal changelog (CHANGELOG_DEV.md)

@fromm-m fromm-m requested review from le1nux and mali-git April 10, 2025 15:49
@fromm-m fromm-m linked an issue Apr 10, 2025 that may be closed by this pull request
@fromm-m fromm-m changed the base branch from main to fsdp2_min_integration April 10, 2025 15:51
@fromm-m fromm-m self-assigned this Apr 10, 2025
Member
@le1nux le1nux left a comment


Nice work! Left a few minor remarks.

fromm-m and others added 2 commits April 11, 2025 12:15
Co-authored-by: Max Lübbering <2804731+le1nux@users.noreply.github.com>
@fromm-m fromm-m requested a review from le1nux April 11, 2025 11:32
Member
@le1nux le1nux left a comment


Left two minor comments.

@@ -50,7 +51,7 @@ def gpt2_model():
attention_norm_config=norm_config,
ffn_norm_config=norm_config,
lm_head_norm_config=norm_config,
use_weight_tying=False,
use_weight_tying=True,
Member


does torch.compile support weight tying now?

Member Author


The tests run through. As I understand it, torch.compile supports weight tying (and other such techniques); they are just not as performant because they introduce graph breaks.
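A minimal sketch of a weight-tied model passing through torch.compile (the `TinyTiedLM` model and all sizes are hypothetical; `backend="eager"` is a debug backend that only traces, so the example runs on CPU without a codegen toolchain):

```python
import torch
import torch.nn as nn


class TinyTiedLM(nn.Module):
    """Toy LM whose output head shares its weight with the embedding."""

    def __init__(self, vocab_size: int = 32, hidden: int = 16):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, hidden)
        self.lm_head = nn.Linear(hidden, vocab_size, bias=False)
        self.lm_head.weight = self.embedding.weight  # weight tying

    def forward(self, token_ids: torch.Tensor) -> torch.Tensor:
        return self.lm_head(self.embedding(token_ids))


model = TinyTiedLM()
compiled = torch.compile(model, backend="eager")
logits = compiled(torch.randint(0, 32, (2, 8)))
print(logits.shape)  # torch.Size([2, 8, 32])
```

Any graph breaks the tied parameter causes can be inspected with `torch._dynamo.explain`, which reports break reasons without running the full backend.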

Co-authored-by: Max Lübbering <2804731+le1nux@users.noreply.github.com>
@le1nux le1nux self-requested a review April 14, 2025 23:53
@le1nux le1nux merged commit ef2f8c2 into fsdp2_min_integration Apr 14, 2025
4 checks passed
@le1nux le1nux deleted the 333-add-torch-compile-uni-test branch April 14, 2025 23:53
Successfully merging this pull request may close these issues.

Add torch compile uni test
2 participants