Converting the llama-2-7B base model with type set to Llama-2-7b-chat-ms raises the following error:
```
Traceback (most recent call last):
  File "llm_export.py", line 1187, in <module>
    llm_exporter = llm_models[model_type](args)
  File "llm_export.py", line 815, in __init__
    super().__init__(args)
  File "llm_export.py", line 102, in __init__
    self.load_model(args.path)
  File "llm_export.py", line 819, in load_model
    model_path, trust_remote_code=True)
  File "/home/env/anaconda3/envs/nio/lib/python3.7/site-packages/transformers/models/auto/tokenization_auto.py", line 635, in from_pretrained
    pretrained_model_name_or_path, trust_remote_code=trust_remote_code, **kwargs
  File "/home/env/anaconda3/envs/nio/lib/python3.7/site-packages/transformers/models/auto/configuration_auto.py", line 917, in from_pretrained
    config_class = CONFIG_MAPPING[config_dict["model_type"]]
  File "/home/env/anaconda3/envs/nio/lib/python3.7/site-packages/transformers/models/auto/configuration_auto.py", line 623, in __getitem__
    raise KeyError(key)
KeyError: 'llama'
```
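A likely cause (not confirmed in the original report): `KeyError: 'llama'` from `CONFIG_MAPPING` usually means the installed transformers library predates LLaMA support, which landed in transformers 4.28.0. Recent transformers releases also require Python >= 3.8, so the Python 3.7 environment visible in the traceback paths would cap the installable version below that. As a sketch, `supports_llama` below is a hypothetical helper for checking whether a given transformers version string is new enough:

```python
# Hypothetical check: KeyError 'llama' suggests transformers predates
# LLaMA support, which was added in transformers 4.28.0.

def supports_llama(version: str) -> bool:
    """Return True if a transformers version string is >= 4.28 (LLaMA support)."""
    major, minor = (int(part) for part in version.split(".")[:2])
    return (major, minor) >= (4, 28)

if __name__ == "__main__":
    import transformers
    # If this prints False, upgrading transformers (and Python to >= 3.8)
    # should resolve the KeyError.
    print(supports_llama(transformers.__version__))
```

If the check fails, `pip install -U transformers` inside a Python >= 3.8 environment is the usual fix.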