Keras: Dice coefficient loss function is negative and increasing with epochs

Using either 1 - dice_coef or -dice_coef makes no difference for convergence, since the two losses differ only by a constant. However, 1 - dice_coef is more familiar for monitoring, because its values lie in the range [0, 1] rather than [-1, 0].
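A minimal NumPy sketch (a port of the question's `dice_coef` for illustration, not the Keras code itself) showing why the two formulations train identically: they differ only by the constant 1, so their gradients match, while `1 - dice_coef` stays in [0, 1]:

```python
import numpy as np

def dice_coef(y_true, y_pred, smooth=1.0):
    # NumPy port of the question's dice_coef, for illustration only
    intersection = np.sum(y_true * y_pred)
    return (2.0 * intersection + smooth) / (np.sum(y_true) + np.sum(y_pred) + smooth)

# Hypothetical ground-truth mask and predicted probabilities
y_true = np.array([1.0, 0.0, 1.0, 1.0])
y_pred = np.array([0.9, 0.1, 0.8, 0.7])

d = dice_coef(y_true, y_pred)
loss_neg = -d             # ranges over [-1, 0]
loss_one_minus = 1.0 - d  # ranges over [0, 1]

# The two losses differ only by the constant 1, so their gradients are identical
assert abs(loss_one_minus - loss_neg - 1.0) < 1e-12
```

Minimizing either loss maximizes the Dice coefficient; only the displayed value changes.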



Author: Deba

Updated on June 04, 2022

Comments

  • Deba, almost 2 years ago

    According to this Keras implementation of the Dice coefficient loss function, the loss is the negative of the computed Dice coefficient. Loss should decrease with epochs, and with this implementation I naturally always get a negative loss that decreases with epochs, i.e. it moves away from 0 toward negative infinity instead of approaching 0. If I use (1 - dice_coef) instead of (-dice_coef) as the loss, will that be wrong? Here's the full Keras implementation I was referring to: https://github.com/jocicmarko/ultrasound-nerve-segmentation/blob/master/train.py

    from keras import backend as K

    smooth = 1.

    def dice_coef(y_true, y_pred):
        # Flatten the masks and compute the soft Dice coefficient
        y_true_f = K.flatten(y_true)
        y_pred_f = K.flatten(y_pred)
        intersection = K.sum(y_true_f * y_pred_f)
        return (2. * intersection + smooth) / (K.sum(y_true_f) + K.sum(y_pred_f) + smooth)


    def dice_coef_loss(y_true, y_pred):
        return -dice_coef(y_true, y_pred)
    

    Here is a log from my experiment, though only for 2 epochs:

    Train on 2001 samples, validate on 501 samples
    Epoch 1/2
    Epoch 00001: loss improved from inf to -0.73789, saving model to unet.hdf5
     - 3229s - loss: -7.3789e-01 - dice_coef: 0.7379 - val_loss: -7.9304e-01 - val_dice_coef: 0.7930
    Epoch 2/2
    Epoch 00002: loss improved from -0.73789 to -0.81037, saving model to unet.hdf5
     - 3077s - loss: -8.1037e-01 - dice_coef: 0.8104 - val_loss: -8.2842e-01 - val_dice_coef: 0.8284
    predict test data
    9/9 [==============================] - 4s 429ms/step
    dict_keys(['val_dice_coef', 'loss', 'val_loss', 'dice_coef'])
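As a quick sanity check on the log above: the reported loss is exactly the negative of dice_coef at each epoch, so the 1 - dice_coef formulation would simply report each value shifted up by 1:

```python
# Dice coefficients copied from the two epoch lines in the log above
dice_per_epoch = [0.7379, 0.8104]

for d in dice_per_epoch:
    loss_neg = -d       # what the log reports
    loss_alt = 1.0 - d  # what 1 - dice_coef would report instead
    # Identical up to a constant offset of 1; the alternative stays in [0, 1]
    assert abs(loss_alt - loss_neg - 1.0) < 1e-12
    assert 0.0 <= loss_alt <= 1.0
```

Either way, an improving model drives dice_coef toward 1; only the sign and offset of the logged loss differ.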