How to interpret the logit scores from a Hugging Face binary classification model and convert them to probabilities

I downloaded the model microsoft/Multilingual-MiniLM-L12-H384 (https://huggingface.co/microsoft/Multilingual-MiniLM-L12-H384/tree/main) and am using it with BertForSequenceClassification.

Loading the model:

https://huggingface.co/docs/transformers/model_doc/bert#:~:text=sentence%20was%20random-,BertForSequenceClassification,-class%20transformers.BertForSequenceClassification

Transformers version: '4.11.3'

I wrote the following code:

import numpy as np
import torch
import transformers as tr

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    # Take the class with the highest logit as the prediction
    predictions = np.argmax(logits, axis=-1)
    # Fraction of predictions that match the labels
    acc = np.sum(predictions == labels) / predictions.shape[0]
    return {"accuracy": acc}

# Use a GPU if one is available
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

model = tr.BertForSequenceClassification.from_pretrained("/home/pc/minilm_model", num_labels=2)
model.to(device)

print("hello")

training_args = tr.TrainingArguments(
    output_dir='/home/pc/proj/results2',          # output directory
    num_train_epochs=10,              # total number of training epochs
    per_device_train_batch_size=16,  # batch size per device during training
    per_device_eval_batch_size=32,   # batch size for evaluation
    learning_rate=2e-5,
    warmup_steps=1000,                # number of warmup steps for learning rate scheduler
    weight_decay=0.01,               # strength of weight decay
    logging_dir='./logs',            # directory for storing logs
    logging_steps=1000,
    evaluation_strategy="epoch",
    save_strategy="no"
)



trainer = tr.Trainer(
    model=model,                         # the instantiated Transformers model to be trained
    args=training_args,                  # training arguments, defined above
    train_dataset=train_data,         # training dataset
    eval_dataset=val_data,             # evaluation dataset
    compute_metrics=compute_metrics
)
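
Training is presumably started with something like the sketch below (the call itself was not included in the snippet, and trainer.save_model is only an assumption about how the weights could be written out, since save_strategy="no" disables automatic checkpointing):

trainer.train()

# With save_strategy="no" nothing is checkpointed automatically,
# so the fine-tuned weights have to be saved explicitly if they are needed:
trainer.save_model("/home/pc/proj/results2")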

After training the model, the output folder is empty.

Is it okay to pass classes=2 for binary classification?

The model's last layer is a simple linear layer that gives the logit values. How do I interpret them and get a probability score out of them? Is the logit score directly proportional to the probability?

model = tr.BertForSequenceClassification.from_pretrained("/home/pchhapolika/minilm_model",num_labels=2)

Is it okay to pass classes=2 for binary classification?

Yes.
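
A quick sanity check, as a sketch against the model loaded above: with num_labels=2, BertForSequenceClassification puts a plain linear head with two output units on top of the encoder, which can be inspected directly:

print(model.config.num_labels)   # 2
print(model.classifier)          # Linear(in_features=384, out_features=2, bias=True) for this MiniLM config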

The model's last layer is a simple linear layer that gives the logit values. How do I interpret them and get a probability score out of them? Is the logit score directly proportional to the probability?

There is a direct relationship between them:

probability = softmax(logits, axis=-1)

and, going the other way: logits = log(probability) + const

So the logits are not proportional to the probabilities, but the relationship is monotonic.
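
A minimal sketch of the conversion in PyTorch (inputs is assumed to be a tokenized batch; model is the two-label model from the question):

import torch
import torch.nn.functional as F

model.eval()
with torch.no_grad():
    outputs = model(**inputs)                     # inputs = tokenizer(texts, return_tensors="pt", padding=True)
    probs = F.softmax(outputs.logits, dim=-1)     # shape (batch_size, 2), each row sums to 1
    predicted_class = probs.argmax(dim=-1)        # same classes as argmax over the raw logits

Because softmax is monotonic, the argmax over the probabilities is the same as the argmax over the raw logits, which is why compute_metrics above can work on the logits directly.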