Error when loading a model from Hugging Face

1 vote
1 answer
34 views
Asked 2025-04-14 15:55

I am getting the error below when trying to run Autoformer from Hugging Face. I have the latest version of transformers installed.

KeyError                                  Traceback (most recent call last)
Cell In[2], line 6
  3 # Load model directly
  4 from transformers import AutoTokenizer, AutoformerForPrediction
  6 tokenizer = AutoTokenizer.from_pretrained("huggingface/autoformer-tourism-monthly")
  7 model = AutoformerForPrediction.from_pretrained("huggingface/autoformer-tourism-monthly")

File ~/anaconda3/lib/python3.11/site-packages/transformers/models/auto/tokenization_auto.py:841, in AutoTokenizer.from_pretrained(cls, pretrained_model_name_or_path, *inputs, **kwargs)
839 model_type = config_class_to_model_type(type(config).__name__)
840 if model_type is not None:
841     tokenizer_class_py, tokenizer_class_fast = TOKENIZER_MAPPING[type(config)]
842     if tokenizer_class_fast and (use_fast or tokenizer_class_py is None):
843         return tokenizer_class_fast.from_pretrained(pretrained_model_name_or_path, *inputs, **kwargs)

File ~/anaconda3/lib/python3.11/site-packages/transformers/models/auto/auto_factory.py:740, in _LazyAutoMapping.__getitem__(self, key)
738         model_name = self._model_mapping[mtype]
739         return self._load_attr_from_module(mtype, model_name)
740 raise KeyError(key)

KeyError: <class 'transformers.models.autoformer.configuration_autoformer.AutoformerConfig'>

The code I am running is:

# Load model directly
from transformers import AutoTokenizer, AutoformerForPrediction

tokenizer = AutoTokenizer.from_pretrained("huggingface/autoformer-tourism-monthly")
model = AutoformerForPrediction.from_pretrained("huggingface/autoformer-tourism-monthly")

Here is the Hugging Face link.
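For reference, the traceback shows the KeyError is raised inside AutoTokenizer, when TOKENIZER_MAPPING is indexed with AutoformerConfig. Below is a minimal sketch that loads only the prediction model and skips the tokenizer entirely; this assumes a tokenizer is not actually needed for this time-series checkpoint, which is an assumption on my part rather than a confirmed fix.

# Sketch: load only the forecasting model, no tokenizer
# (assumes the tokenizer is not required for this checkpoint)
from transformers import AutoformerForPrediction

model = AutoformerForPrediction.from_pretrained("huggingface/autoformer-tourism-monthly")
print(model.config.prediction_length)  # inspect the loaded configuration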

1 Answer

No answers yet.
