ImportError: cannot import name 'AutoModelWithLMHead' from 'transformers'
This is all the code I'm trying to run:
from transformers import AutoModelWithLMHead, AutoTokenizer
import torch
tokenizer = AutoTokenizer.from_pretrained("microsoft/DialoGPT-small")
model = AutoModelWithLMHead.from_pretrained("microsoft/DialoGPT-small")
I get this error:
---------------------------------------------------------------------------
ImportError Traceback (most recent call last)
<ipython-input-14-aad2e7a08a74> in <module>
----> 1 from transformers import AutoModelWithLMHead, AutoTokenizer
2 import torch
3
4 tokenizer = AutoTokenizer.from_pretrained("microsoft/DialoGPT-small")
5 model = AutoModelWithLMHead.from_pretrained("microsoft/DialoGPT-small")
ImportError: cannot import name 'AutoModelWithLMHead' from 'transformers' (c:\python38\lib\site-packages\transformers\__init__.py)
What should I do?
I solved it! Apparently AutoModelWithLMHead was removed in my version of transformers.
You now need to use AutoModelForCausalLM for causal language models, AutoModelForMaskedLM for masked language models, and AutoModelForSeq2SeqLM for encoder-decoder models.
So in my case, the code looks like this:
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch
tokenizer = AutoTokenizer.from_pretrained("microsoft/DialoGPT-small")
model = AutoModelForCausalLM.from_pretrained("microsoft/DialoGPT-small")
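If your code has to run on both older and newer transformers versions, one option (my own sketch, not part of the transformers API) is a small fallback helper that tries the new class name first and falls back to the old one:

```python
import importlib

def import_first_available(module_name, *names):
    """Return the first attribute of `module_name` that exists,
    trying each name in order. Handy when a library renames a
    class between versions, as transformers did here."""
    mod = importlib.import_module(module_name)
    for name in names:
        if hasattr(mod, name):
            return getattr(mod, name)
    raise ImportError(f"none of {names} found in {module_name!r}")

# Hypothetical usage -- which name resolves depends on the
# installed transformers version:
# AutoModelLM = import_first_available(
#     "transformers", "AutoModelForCausalLM", "AutoModelWithLMHead")
# model = AutoModelLM.from_pretrained("microsoft/DialoGPT-small")
```

This keeps one code path working across versions, at the cost of hiding the deprecation; pinning a transformers version in your requirements is the simpler fix if you control the environment.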