Here is all the code I was trying to run:
from transformers import AutoModelWithLMHead, AutoTokenizer
import torch
tokenizer = AutoTokenizer.from_pretrained("microsoft/DialoGPT-small")
model = AutoModelWithLMHead.from_pretrained("microsoft/DialoGPT-small")

I got this error:
---------------------------------------------------------------------------
ImportError Traceback (most recent call last)
<ipython-input-14-aad2e7a08a74> in <module>
----> 1 from transformers import AutoModelWithLMHead, AutoTokenizer
2 import torch
3
4 tokenizer = AutoTokenizer.from_pretrained("microsoft/DialoGPT-small")
5 model = AutoModelWithLMHead.from_pretrained("microsoft/DialoGPT-small")
ImportError: cannot import name 'AutoModelWithLMHead' from 'transformers' (c:\python38\lib\site-packages\transformers\__init__.py)

What should I do?
Posted on 2020-07-31 00:53:18
I solved it! Apparently, AutoModelWithLMHead was removed in my version of transformers.
Now you need to use AutoModelForCausalLM for causal language models, AutoModelForMaskedLM for masked language models, and AutoModelForSeq2SeqLM for encoder-decoder models.
So in my case the code looks like this:
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch
tokenizer = AutoTokenizer.from_pretrained("microsoft/DialoGPT-small")
model = AutoModelForCausalLM.from_pretrained("microsoft/DialoGPT-small")

https://stackoverflow.com/questions/63141267
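For completeness, the fixed code above can be extended into a minimal chat-generation sketch. The prompt text and the generation settings (`max_length`, `pad_token_id`) below are illustrative assumptions, not part of the original answer:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load DialoGPT with the renamed auto class
tokenizer = AutoTokenizer.from_pretrained("microsoft/DialoGPT-small")
model = AutoModelForCausalLM.from_pretrained("microsoft/DialoGPT-small")

# Encode a user message followed by the end-of-sequence token,
# which DialoGPT uses to separate conversation turns
input_ids = tokenizer.encode("Hello, how are you?" + tokenizer.eos_token,
                             return_tensors="pt")

# Generate a reply; pad_token_id is set explicitly to avoid a warning
output_ids = model.generate(input_ids, max_length=100,
                            pad_token_id=tokenizer.eos_token_id)

# Decode only the newly generated tokens (skip the prompt)
reply = tokenizer.decode(output_ids[0, input_ids.shape[-1]:],
                         skip_special_tokens=True)
print(reply)
```

The exact reply depends on the model's sampling/greedy decoding, so no particular output is guaranteed.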