Pytorch expects each tensor to be equal size

When I run this line of code: `embedding_matrix = torch.stack(embeddings)`

I get this error:

RuntimeError: stack expects each tensor to be equal size, but got [7, 768] at entry 0 and [8, 768] at entry 1
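For reference, the shape mismatch alone is enough to trigger this; a minimal sketch with the shapes from the message (dummy tensors, not my real data):

    import torch

    a = torch.zeros(7, 768)   # embedding of a 7-token sentence
    b = torch.zeros(8, 768)   # embedding of an 8-token sentence
    torch.stack([a, b])       # RuntimeError: stack expects each tensor to be equal size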

I am trying to build embeddings with BERT as follows:

    embeddings = []
    for sent in sentences:  # outer loop over all sentences ("sentences" is assumed; not shown in the original snippet)
        split_sent = sent.split()
        tokens_embedding = []
        j = 0
        for full_token in split_sent:
            curr_token = ''
            for i, _ in enumerate(tokenized_sent[1:]):
                token = tokenized_sent[i + j]
                piece_embedding = bert_embedding[i + j]
                # exact match: the word was not split into word pieces
                if token == full_token and curr_token == '':
                    tokens_embedding.append(piece_embedding)
                    j += 1
                    break
        sent_embedding = torch.stack(tokens_embedding)  # shape: [num_tokens, 768]
        embeddings.append(sent_embedding)
    embedding_matrix = torch.stack(embeddings)  # fails when num_tokens differs across sentences

Does anyone know how I can fix this?

According to the PyTorch docs for the torch.stack() function, the input tensors must all have the same shape in order to be stacked. I don't know how you will be using the embedding_matrix, but you can either pad your tensors (i.e. append rows of zeros at the end up to a certain user-defined length; this is recommended if you will train with the stacked tensor, refer to this tutorial) to make them the same size, or simply use torch.cat(data, dim=0).
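A minimal sketch of both options, assuming `embeddings` is a list of `[num_tokens, 768]` tensors as in the question (the dummy tensors below stand in for real sentence embeddings):

    import torch
    from torch.nn.utils.rnn import pad_sequence

    embeddings = [torch.randn(7, 768), torch.randn(8, 768)]  # stand-ins for the sentence embeddings

    # Option 1: zero-pad every sentence to the longest one; pad_sequence also stacks
    padded = pad_sequence(embeddings, batch_first=True)  # shape: [2, 8, 768]

    # Option 2: concatenate along the token dimension instead of stacking
    flat = torch.cat(embeddings, dim=0)  # shape: [15, 768]

Note that pad_sequence pads to the longest sequence in the list; for a fixed user-defined length, you could instead pad each tensor with torch.nn.functional.pad before calling torch.stack.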