AttributeError: 'str' object has no attribute 'shape' while encoding tensor using BertModel with PyTorch (Hugging Face)

I get AttributeError: 'str' object has no attribute 'shape' while encoding a tensor using BertModel with PyTorch (Hugging Face). Below is the code:

bert_model = BertModel.from_pretrained(r'downloads\bert-pretrained-model')
input_ids

The output is:

tensor([[  101,   156, 13329,  ...,     0,     0,     0],
        [  101,   156, 13329,  ...,     0,     0,     0],
        [  101,  1302,  1251,  ...,     0,     0,     0],
        ...,
        [  101, 25456,  1200,  ...,     0,     0,     0],
        [  101,   143,  9664,  ...,     0,     0,     0],
        [  101,  2586,  7340,  ...,     0,     0,     0]])
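
(For context, the encoding dict used below is the tokenizer output; a rough sketch of how it is typically built follows, where the input text and max_length are placeholders rather than the actual values from the notebook.)

from transformers import BertTokenizer

# sketch only: build the `encoding` dict with the tokenizer (placeholder values)
tokenizer = BertTokenizer.from_pretrained(r'downloads\bert-pretrained-model')
encoding = tokenizer(
  ["a sample review text"],   # placeholder input strings
  padding='max_length',
  truncation=True,
  max_length=128,             # placeholder sequence length
  return_tensors='pt',        # return PyTorch tensors
)
input_ids = encoding['input_ids']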

This is followed by:

last_hidden_state, pooled_output = bert_model(
  input_ids=encoding['input_ids'],
  attention_mask=encoding['attention_mask']
)

and then:

last_hidden_state.shape

The output is:

AttributeError                            Traceback (most recent call last)
<ipython-input-70-9628339f425d> in <module>
----> 1 last_hidden_state.shape

AttributeError: 'str' object has no attribute 'shape'

The full code is at https://colab.research.google.com/drive/1FY4WtqCi2CQ9RjHj4slZwtdMhwaWv2-2?usp=sharing

The problem is that the return type of the model call has changed since transformers version 3.x: the model now returns a dict-like output rather than a tuple, so tuple-unpacking it picks up its string keys instead of tensors. We therefore need to ask explicitly for a tuple of tensors.
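
A minimal sketch of what goes wrong, reusing the bert_model and encoding from the question:

# without return_dict=False, recent transformers versions return a dict-like ModelOutput
outputs = bert_model(
  input_ids=encoding['input_ids'],
  attention_mask=encoding['attention_mask']
)

# tuple-unpacking a dict-like object yields its keys, which are plain strings
first, second = outputs
print(type(first), first)   # <class 'str'> last_hidden_state
# so first.shape raises: AttributeError: 'str' object has no attribute 'shape'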

So we can pass an additional kwarg return_dict=False when calling bert_model() to get actual tensors corresponding to last_hidden_state and pooled_output:

last_hidden_state, pooled_output = bert_model(
  input_ids=encoding['input_ids'],
  attention_mask=encoding['attention_mask'],
  return_dict=False   # this is needed to get a tuple of tensors as the result
)
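
With return_dict=False the call returns a plain tuple of tensors, so the original shape check now works (the shapes below are illustrative; the hidden size of 768 assumes a BERT-base checkpoint):

last_hidden_state.shape   # e.g. torch.Size([batch_size, seq_len, 768])
pooled_output.shape       # e.g. torch.Size([batch_size, 768])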

If you prefer not to use the previous approach, you can instead do the following:

In [13]: bm = bert_model(
    ...:   encoding_sample['input_ids'],
    ...:   encoding_sample['attention_mask']
    ...: )

In [14]: bm.keys()
Out[14]: odict_keys(['last_hidden_state', 'pooler_output'])

# accessing last_hidden_state 
In [15]: bm['last_hidden_state']

In [16]: bm['last_hidden_state'].shape
Out[16]: torch.Size([1, 17, 768])
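
The returned ModelOutput also supports attribute access, so this is equivalent to the dict-style lookup above (the pooler_output shape is inferred from the hidden size of 768 shown there):

bm.last_hidden_state.shape   # torch.Size([1, 17, 768])
bm.pooler_output.shape       # torch.Size([1, 768])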