8000 add head_dim setting when diff from hidden // heads by vince62s · Pull Request #78 · eole-nlp/eole · GitHub


Merged: 3 commits, Aug 30, 2024

Conversation

vince62s (Contributor) commented Jul 22, 2024

This is needed for MistralNemo.

Also, these papers describe a 6/6-layer transformer with 16 heads, 1024 hidden size, and 8192 FFN size, BUT with a head dim of 128 (which differs from 1024 // 16 = 64):
(I had a hard time figuring out why their transformer has 551M params.)
https://arxiv.org/pdf/2311.05350
https://arxiv.org/pdf/2310.06707

@vince62s vince62s merged commit 7a4757f into eole-nlp:main Aug 30, 2024
2 checks passed
@vince62s vince62s deleted the headdim branch January 13, 2025 08:38