Is there a way to output a metric with several values in keras?

I am working on a U-Net architecture to perform segmentation over 10 classes. I want to compute the Dice coefficient for each class after every epoch.

The output of my network is a stack of segmentation masks, one per class, with shape (b_size, rows, cols, num_classes). On this output I compute the Dice coefficient of each class like this:

import tensorflow as tf

def dice_metric(ground_truth, prediction):
    # initialize list with dice scores for each class
    dice_score_list = list()
    # get lists of tensors with shape (rows, cols), one per class
    # (reshape_ground_truth and dice_score are my own helpers, not shown here)
    ground_truth_unstacked = reshape_ground_truth(ground_truth)
    prediction_unstacked = tf.unstack(prediction, axis=-1)
    for (ground_truth_map, prediction_map) in zip(ground_truth_unstacked, prediction_unstacked):
        # calculate dice score for every class
        dice_i = dice_score(ground_truth_map, prediction_map)
        dice_score_list.append(dice_i)
    # Keras only ever sees the mean over all classes
    return tf.reduce_mean(dice_score_list, axis=[0])

Is there any way to print the list of Dice scores instead of their mean, so that the output at every epoch is:

Epoch 107/200
- 13s - loss: 0.8896 - dice_metric: [dice_class_1, ... dice_class_10] - val_loss: 3.3417 - val_dice_metric: [val_dice_class_1, ... val_dice_class_10]

The Keras documentation on Custom Metrics only considers a single tensor value (i.e. "Custom metrics can be passed at the compilation step. The function would need to take (y_true, y_pred) as arguments and return a single tensor value.").
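For reference, this is the single-value contract the docs describe, using the mean_pred example from that page (model stands for any compiled Keras model):

import keras.backend as K

# a custom metric takes (y_true, y_pred) and returns a single tensor value
def mean_pred(y_true, y_pred):
    return K.mean(y_pred)

model.compile(optimizer='rmsprop',
              loss='binary_crossentropy',
              metrics=['accuracy', mean_pred])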

Is there any way/workaround to output a metric with several values?

For Keras to output all channels, you will need one metric per channel. You can create a wrapper that takes the index and returns only the desired class:

from keras import backend as K

#calculates dice considering an input with a single class
def dice_single(true, pred):
    true = K.batch_flatten(true)
    pred = K.batch_flatten(pred)
    pred = K.round(pred)

    intersection = K.sum(true * pred, axis=-1)
    true = K.sum(true, axis=-1)
    pred = K.sum(pred, axis=-1)

    return ((2 * intersection) + K.epsilon()) / (true + pred + K.epsilon())

def dice_for_class(index):
    def dice_inner(true,pred):

        #get only the desired class
        true = true[:,:,:,index]
        pred = pred[:,:,:,index]

        #return dice per class
        return dice_single(true,pred)
    return dice_inner

Then your metrics in the model would be `metrics = [dice_for_class(i) for i in range(10)]`, as sketched below.
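A minimal sketch of how that plugs into compilation (the optimizer and loss are placeholders, and model is assumed to be your U-net):

#one metric per class; each one gets its own column in the epoch logs
model.compile(optimizer='adam',
              loss='categorical_crossentropy',
              metrics=[dice_for_class(i) for i in range(10)])

Note that all ten metrics will be logged under the wrapper's name (dice_inner); if you want distinguishable names, you can set a different __name__ on each returned function.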


Hint: don't iterate unless it's absolutely necessary.

Example of Dice for ten classes without iterating:

def dice_metric(ground_truth, prediction):

    #for metrics, it's good to round predictions:
    prediction = K.round(prediction)

    #intersection and totals per class per batch (considers channels last)
    intersection = ground_truth * prediction
    intersection = K.sum(intersection, axis=[1,2])
    ground_truth = K.sum(ground_truth, axis=[1,2])
    prediction = K.sum(prediction, axis=[1,2])

    #dice per class per sample, shape (batch, num_classes)
    dice = ((2 * intersection) + K.epsilon()) / (ground_truth + prediction + K.epsilon())

    return dice
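To still get one value per class in the logs from this vectorized version, you can slice it per index, mirroring dice_for_class above. This is only a sketch (vect_dice_for_class is a hypothetical name, and it relies on Keras averaging whatever the metric returns over the batch, as it does for built-in metrics like accuracy):

def vect_dice_for_class(index):
    def dice_inner(true, pred):
        #column `index` of the (batch, num_classes) dice tensor
        return dice_metric(true, pred)[:, index]
    #optional: distinct name per class for the training log
    dice_inner.__name__ = 'dice_class_{}'.format(index)
    return dice_inner

#usage, same as before:
#model.compile(..., metrics=[vect_dice_for_class(i) for i in range(10)])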