How to understand loss, acc, val_loss and val_acc in Keras model fitting


Solution 1

Answering your questions:

  1. As described in the official Keras FAQ:

the training loss is the average of the losses over each batch of training data. Because your model is changing over time, the loss over the first batches of an epoch is generally higher than over the last batches. On the other hand, the testing loss for an epoch is computed using the model as it is at the end of the epoch, resulting in a lower loss.

  2. Training should be stopped when val_acc stops increasing, otherwise your model will probably overfit. You can use the EarlyStopping callback to stop training; see the minimal sketch after this list.

  3. Your model seems to achieve very good results. Keep up the good work.
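
For example, here is a minimal sketch of the EarlyStopping callback monitoring val_acc. The names model, x_train, y_train, x_test and y_test are placeholders for your own objects, and newer Keras versions log this metric as val_accuracy instead of val_acc:

    from tensorflow.keras.callbacks import EarlyStopping

    # Stop once val_acc has not improved for 3 consecutive epochs.
    # Newer Keras versions log this metric as 'val_accuracy' instead of 'val_acc'.
    early_stop = EarlyStopping(monitor='val_acc', mode='max', patience=3)

    model.fit(x_train, y_train,
              validation_data=(x_test, y_test),
              epochs=20,
              callbacks=[early_stop])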

Solution 2

  1. What are loss and val_loss?

In deep learning, the loss is the value that a neural network is trying to minimize: it's the distance between the ground truth and the predictions. In order to minimize this distance, the neural network learns by adjusting weights and biases in a manner that reduces the loss.

For instance, in regression tasks, you have a continuous target, e.g., height. What you want to minimize is the difference between your predictions, and the actual height. You can use mean_absolute_error as loss so the neural network knows this is what it needs to minimize.
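
For instance, a minimal regression sketch trained with mean_absolute_error; the layer sizes and random data are made up purely for illustration:

    import numpy as np
    from tensorflow.keras.models import Sequential
    from tensorflow.keras.layers import Dense

    # Toy data: 100 samples with 5 features and a continuous target (e.g. height).
    x = np.random.rand(100, 5)
    y = np.random.rand(100, 1)

    model = Sequential([
        Dense(16, activation='relu', input_shape=(5,)),
        Dense(1)  # linear output for a continuous target
    ])
    model.compile(optimizer='adam', loss='mean_absolute_error')
    model.fit(x, y, epochs=5, verbose=0)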

In classification, it's a little more complicated, but very similar. Predicted classes are based on probabilities, so the loss is also based on probabilities: the network is penalized for assigning a low probability to the actual class, and it learns to push that probability up. The loss is typically categorical_crossentropy.

loss and val_loss differ because the former is computed on the training set and the latter on the validation (test) set. As such, val_loss is a good indication of how the model performs on unseen data. You can create a validation set by passing validation_data=(x_test, y_test) or validation_split=0.2 to model.fit.
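
A minimal classification sketch with made-up data, using categorical_crossentropy and validation_split=0.2, which is where the loss, acc, val_loss and val_acc columns in the training log come from:

    import numpy as np
    from tensorflow.keras.models import Sequential
    from tensorflow.keras.layers import Dense
    from tensorflow.keras.utils import to_categorical

    # Toy data: 200 samples with 10 features and 3 one-hot encoded classes.
    x = np.random.rand(200, 10)
    y = to_categorical(np.random.randint(0, 3, 200))

    model = Sequential([
        Dense(32, activation='relu', input_shape=(10,)),
        Dense(3, activation='softmax')
    ])
    model.compile(optimizer='adam',
                  loss='categorical_crossentropy',
                  metrics=['accuracy'])

    # 20% of the data is held out as the validation set.
    history = model.fit(x, y, epochs=10, validation_split=0.2, verbose=0)
    print(history.history.keys())  # loss, acc, val_loss, val_acc (accuracy/val_accuracy in newer Keras)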

It's best to rely on val_loss to prevent overfitting. Overfitting is when the model fits the training data too closely: the loss keeps decreasing while val_loss stalls or increases.
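
Plotting the two curves is a quick way to spot this; the snippet below reuses the history object from the sketch above:

    import matplotlib.pyplot as plt

    # Overfitting shows up as loss still falling while val_loss flattens or rises.
    plt.plot(history.history['loss'], label='loss (training)')
    plt.plot(history.history['val_loss'], label='val_loss (validation)')
    plt.xlabel('epoch')
    plt.ylabel('loss')
    plt.legend()
    plt.show()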

In Keras, you can use the EarlyStopping callback to stop training when val_loss stops decreasing.
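
A sketch of that callback, reusing the model and data from the classification example above; the patience value is just an illustrative choice:

    from tensorflow.keras.callbacks import EarlyStopping

    # Stop when val_loss has not improved for 3 consecutive epochs and
    # roll the model back to the best weights seen during training.
    early_stop = EarlyStopping(monitor='val_loss', patience=3, restore_best_weights=True)

    model.fit(x, y, epochs=100, validation_split=0.2, callbacks=[early_stop], verbose=0)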

Read more about deep learning losses here: Loss and Loss Functions for Training Deep Learning Neural Networks.

  2. What are acc and val_acc?

Accuracy is a metric only for classification. It makes no sense on a task with a continuous target. It gives the percentage of instances that are correctly classified.
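
As a tiny illustration with made-up predictions, accuracy is just the fraction of instances whose predicted class (the argmax of the predicted probabilities) matches the true class:

    import numpy as np

    # Toy example: 4 instances, 3 classes.
    y_true = np.array([0, 2, 1, 1])
    y_prob = np.array([[0.8, 0.1, 0.1],
                       [0.2, 0.2, 0.6],
                       [0.3, 0.6, 0.1],
                       [0.7, 0.2, 0.1]])

    # The predicted class is the most probable class for each instance.
    accuracy = np.mean(np.argmax(y_prob, axis=1) == y_true)
    print(accuracy)  # 0.75 -> 3 out of 4 instances classified correctly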

Once again, acc is computed on the training data and val_acc on the validation data. It's best to rely on val_acc for a fair picture of model performance, because a large enough neural network can end up fitting the training data almost perfectly while still performing poorly on unseen data.

Author: Rocco

Updated on March 20, 2021

Comments

  • Rocco about 3 years

    I'm new to Keras and have some questions on how to understand my model results. Here is my result (for your convenience, I only paste the loss, acc, val_loss and val_acc after each epoch here):

    Train on 4160 samples, validate on 1040 samples as below:

    Epoch 1/20
    4160/4160 - loss: 3.3455 - acc: 0.1560 - val_loss: 1.6047 - val_acc: 0.4721
    
    Epoch 2/20
    4160/4160 - loss: 1.7639 - acc: 0.4274 - val_loss: 0.7060 - val_acc: 0.8019
    
    Epoch 3/20
    4160/4160 - loss: 1.0887 - acc: 0.5978 - val_loss: 0.3707 - val_acc: 0.9087
    
    Epoch 4/20
    4160/4160 - loss: 0.7736 - acc: 0.7067 - val_loss: 0.2619 - val_acc: 0.9442
    
    Epoch 5/20
    4160/4160 - loss: 0.5784 - acc: 0.7690 - val_loss: 0.2058 - val_acc: 0.9433
    
    Epoch 6/20
    4160/4160 - loss: 0.5000 - acc: 0.8065 - val_loss: 0.1557 - val_acc: 0.9750
    
    Epoch 7/20
    4160/4160 - loss: 0.4179 - acc: 0.8296 - val_loss: 0.1523 - val_acc: 0.9606
    
    Epoch 8/20
    4160/4160 - loss: 0.3758 - acc: 0.8495 - val_loss: 0.1063 - val_acc: 0.9712
    
    Epoch 9/20
    4160/4160 - loss: 0.3202 - acc: 0.8740 - val_loss: 0.1019 - val_acc: 0.9798
    
    Epoch 10/20
    4160/4160 - loss: 0.3028 - acc: 0.8788 - val_loss: 0.1074 - val_acc: 0.9644
    
    Epoch 11/20
    4160/4160 - loss: 0.2696 - acc: 0.8923 - val_loss: 0.0581 - val_acc: 0.9856
    
    Epoch 12/20
    4160/4160 - loss: 0.2738 - acc: 0.8894 - val_loss: 0.0713 - val_acc: 0.9837
    
    Epoch 13/20
    4160/4160 - loss: 0.2609 - acc: 0.8913 - val_loss: 0.0679 - val_acc: 0.9740
    
    Epoch 14/20
    4160/4160 - loss: 0.2556 - acc: 0.9022 - val_loss: 0.0599 - val_acc: 0.9769
    
    Epoch 15/20
    4160/4160 - loss: 0.2384 - acc: 0.9053 - val_loss: 0.0560 - val_acc: 0.9846
    
    Epoch 16/20
    4160/4160 - loss: 0.2305 - acc: 0.9079 - val_loss: 0.0502 - val_acc: 0.9865
    
    Epoch 17/20
    4160/4160 - loss: 0.2145 - acc: 0.9185 - val_loss: 0.0461 - val_acc: 0.9913
    
    Epoch 18/20
    4160/4160 - loss: 0.2046 - acc: 0.9183 - val_loss: 0.0524 - val_acc: 0.9750
    
    Epoch 19/20
    4160/4160 - loss: 0.2055 - acc: 0.9120 - val_loss: 0.0440 - val_acc: 0.9885
    
    Epoch 20/20
    4160/4160 - loss: 0.1890 - acc: 0.9236 - val_loss: 0.0501 - val_acc: 0.9827
    

    Here are my understandings:

    1. The two losses (both loss and val_loss) are decreasing and the two accuracies (acc and val_acc) are increasing. So this indicates the model is being trained well.

    2. The val_acc is the measure of how good the predictions of your model are. So for my case, it looks like the model was trained pretty well after 6 epochs, and the rest of the training was not necessary.

    My Questions are:

    1. The acc (the accuracy on the training set) is always smaller, actually much smaller, than val_acc. Is this normal? Why does this happen? In my mind, acc should usually be similar to, or better than, val_acc.

    2. After 20 epochs, the acc is still increasing. So should I use more epochs and stop when acc stops increasing? Or should I stop when val_acc stops increasing, regardless of the trend of acc?

    3. Are there any other thoughts on my results?

    Thanks!

  • SuperHanz98 over 4 years
    So I understand what loss is, but what does the number actually represent? Is it a percentage of something? A loss of 0.5, for example, what does that actually mean? How does the loss relate to that number 0.5?