Keras + TensorFlow Realtime training chart

Solution 1

There is a livelossplot Python package for live training loss plots in Jupyter Notebook for Keras (disclaimer: I am the author).

from livelossplot import PlotLossesKeras

model.fit(X_train, Y_train,
          epochs=10,
          validation_data=(X_test, Y_test),
          callbacks=[PlotLossesKeras()],
          verbose=0)

To see how it works, look at its source, especially this file: https://github.com/stared/livelossplot/blob/master/livelossplot/outputs/matplotlib_plot.py (it relies on from IPython.display import clear_output and clear_output(wait=True)).

A fair disclaimer: it does interfere with Keras output.
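If you would rather avoid an extra dependency, the same trick can be reproduced in a few lines of your own. The sketch below is only a minimal illustration, not the livelossplot implementation: a hypothetical LivePlot callback collects the loss values and redraws a matplotlib chart with clear_output(wait=True) after each epoch.

import matplotlib.pyplot as plt
from IPython.display import clear_output
from keras.callbacks import Callback

class LivePlot(Callback):
    # Minimal live chart: redraw the figure in place after every epoch.
    def on_train_begin(self, logs=None):
        self.losses, self.val_losses = [], []

    def on_epoch_end(self, epoch, logs=None):
        logs = logs or {}
        self.losses.append(logs.get('loss'))
        self.val_losses.append(logs.get('val_loss'))
        clear_output(wait=True)           # replace the previous chart
        plt.plot(self.losses, label='train loss')
        plt.plot(self.val_losses, label='val loss')
        plt.xlabel('epoch')
        plt.ylabel('loss')
        plt.legend(loc='upper left')
        plt.show()

model.fit(X_train, Y_train,
          epochs=10,
          validation_data=(X_test, Y_test),
          callbacks=[LivePlot()],
          verbose=0)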


Solution 2

Keras comes with a callback for TensorBoard.

You can easily add this callback to your training and then run tensorboard on top of the logged data.

from keras.callbacks import TensorBoard

callbacks = [TensorBoard(log_dir='./logs')]
result = model.fit(X, Y, ..., callbacks=callbacks)

And then on your shell:

tensorboard --logdir=./logs
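If you want the dashboard embedded in the notebook itself rather than in a separate browser tab, newer TensorBoard releases also ship Jupyter magics (assuming tensorboard is installed in the same environment as the notebook kernel):

%load_ext tensorboard
%tensorboard --logdir ./logs

The embedded dashboard periodically reloads the event files, so the curves update while model.fit is still running.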

If you need it in your notebook, you can also write your own callback to get metrics while training:

from keras.callbacks import Callback

class LogCallback(Callback):

    def on_epoch_end(self, epoch, logs=None):
        # the key is "accuracy" in recent Keras versions ("acc" in older ones)
        print(logs["accuracy"])

This prints the training accuracy at the end of each epoch. There is good documentation on writing callbacks on the official Keras site.
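For completeness, here is how such a callback would be attached to training; this is just a sketch reusing the fit arguments from the question below.

result = model.fit(X, Y,
                   validation_split=0.33,
                   epochs=150,
                   batch_size=10,
                   callbacks=[LogCallback()],
                   verbose=0)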


Author: Shlomi Schwartz, updated on September 15, 2022

Comments

  • Shlomi Schwartz
    Shlomi Schwartz over 1 year

    I have the following code running inside a Jupyter notebook:

    # Visualize training history
    from keras.models import Sequential
    from keras.layers import Dense
    import matplotlib.pyplot as plt
    import numpy
    # fix random seed for reproducibility
    seed = 7
    numpy.random.seed(seed)
    # load pima indians dataset
    dataset = numpy.loadtxt("pima-indians-diabetes.csv", delimiter=",")
    # split into input (X) and output (Y) variables
    X = dataset[:,0:8]
    Y = dataset[:,8]
    # create model
    model = Sequential()
    model.add(Dense(12, input_dim=8, kernel_initializer='uniform', activation='relu'))
    model.add(Dense(8, kernel_initializer='uniform', activation='relu'))
    model.add(Dense(1, kernel_initializer='uniform', activation='sigmoid'))
    # Compile model
    model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
    # Fit the model
    history = model.fit(X, Y, validation_split=0.33, epochs=150, batch_size=10, verbose=0)
    # list all data in history
    print(history.history.keys())
    # summarize history for accuracy
    plt.plot(history.history['acc'])
    plt.plot(history.history['val_acc'])
    plt.title('model accuracy')
    plt.ylabel('accuracy')
    plt.xlabel('epoch')
    plt.legend(['train', 'test'], loc='upper left')
    plt.show()
    # summarize history for loss
    plt.plot(history.history['loss'])
    plt.plot(history.history['val_loss'])
    plt.title('model loss')
    plt.ylabel('loss')
    plt.xlabel('epoch')
    plt.legend(['train', 'test'], loc='upper left')
    plt.show()
    

    The code collects the per-epoch history and then displays it after training has finished.


    Q: How can I make the chart change while training so I can see the changes in real time?

  • Shamoon
    Shamoon about 5 years
    Does this only work for Jupyter notebooks?
  • Lucas Farias
    Lucas Farias about 5 years
    But does TensorBoard show you live training data, or can you only see it after training the model?
  • Dinesh
    Dinesh almost 4 years
    I had a question after using the livelossplot library. I observed the following metrics: validation (min: 113.068, max: 1852.994, cur: 117.239). I clearly get the meaning of 'min' and 'max', but what does 'cur' actually mean?
  • Piotr Migdal
    Piotr Migdal almost 4 years
    Now it is not only for Jupyter.
  • glicerico
    glicerico over 3 years
    @LucasFarias you can see it real time with Tensorboard
  • jtlz2
    jtlz2 about 3 years
    @Dinesh "Current", surely?