Request to support the llama-2-7B base model · Issue #27 · wangzhaode/llm-export · GitHub
Open
chenxijun1029 opened this issue Jan 31, 2024 · 0 comments
chenxijun1029 commented Jan 31, 2024

When converting the llama-2-7B base model with type set to Llama-2-7b-chat-ms, the following error is raised:
Traceback (most recent call last):
  File "llm_export.py", line 1187, in <module>
    llm_exporter = llm_models[model_type](args)
  File "llm_export.py", line 815, in __init__
    super().__init__(args)
  File "llm_export.py", line 102, in __init__
    self.load_model(args.path)
  File "llm_export.py", line 819, in load_model
    model_path, trust_remote_code=True)
  File "/home/env/anaconda3/envs/nio/lib/python3.7/site-packages/transformers/models/auto/tokenization_auto.py", line 635, in from_pretrained
    pretrained_model_name_or_path, trust_remote_code=trust_remote_code, **kwargs
  File "/home/env/anaconda3/envs/nio/lib/python3.7/site-packages/transformers/models/auto/configuration_auto.py", line 917, in from_pretrained
    config_class = CONFIG_MAPPING[config_dict["model_type"]]
  File "/home/env/anaconda3/envs/nio/lib/python3.7/site-packages/transformers/models/auto/configuration_auto.py", line 623, in __getitem__
    raise KeyError(key)
KeyError: 'llama'
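The `KeyError: 'llama'` suggests the installed transformers (constrained here by a Python 3.7 environment) predates LLaMA support: `AutoConfig.from_pretrained` reads `model_type` from the model's config.json and looks it up in a registry of supported architectures, which has no "llama" entry in older releases. A minimal sketch of that dispatch, using a simplified hypothetical registry rather than the real transformers mapping:

```python
# Hypothetical, simplified version of the model_type -> config-class registry
# that transformers keeps in configuration_auto.py. The real mapping in an
# up-to-date release includes "llama"; older releases do not.
CONFIG_MAPPING = {
    "gpt2": "GPT2Config",
    "bert": "BertConfig",
    # no "llama" entry, as in pre-LLaMA transformers releases
}

def resolve_config(config_dict):
    """Mimic AutoConfig dispatch: look up config.json's model_type."""
    model_type = config_dict["model_type"]
    try:
        return CONFIG_MAPPING[model_type]
    except KeyError:
        # Mirrors the KeyError: 'llama' seen in the traceback above.
        raise KeyError(model_type)

# resolve_config({"model_type": "llama"}) raises KeyError: 'llama'
```

If this is the cause, upgrading transformers to a release that registers the "llama" model type would make the lookup succeed; the exact minimum version is not stated here.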

@wangzhaode wangzhaode self-assigned this Feb 1, 2024
2 participants