What does nb_epoch in a neural network stand for?


Solution 1

Starting with Keras 2.0, the nb_epoch argument has been renamed to epochs everywhere.

Neural networks are trained iteratively, making multiple passes over the entire dataset. Each such pass over the entire dataset is referred to as an epoch.

There are two possible ways to choose an optimum number of epochs:

1) Set epochs to a large number, and stop training when validation accuracy or loss stops improving: so-called early stopping

from keras.callbacks import EarlyStopping

# Stop once val_loss has not improved for 4 consecutive epochs
early_stopping = EarlyStopping(monitor='val_loss', patience=4, mode='auto')

# Note: the old show_accuracy argument was removed in Keras 1.0;
# accuracy is now requested via metrics=['accuracy'] at compile time.
model.fit(X_train, Y_train,
          batch_size=128, epochs=500,
          verbose=1,
          validation_data=(X_test, Y_test),
          callbacks=[early_stopping])

2) Treat the number of epochs as a hyperparameter and select the best value based on a set of trials (runs) over a grid of epoch values
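The grid-search option above can be sketched as a loop over candidate epoch counts, keeping the one with the best validation score. Everything here is illustrative: `train_and_evaluate` is a stand-in for a real `model.fit(...)` / `model.evaluate(...)` run, and its fake loss curve just mimics a model that overfits past ~20 epochs.

```python
# Hypothetical sketch: treat the epoch count as a hyperparameter.
# `train_and_evaluate` stands in for an actual Keras training run that
# returns the final validation loss.
def train_and_evaluate(n_epochs):
    # Fake validation loss: improves up to ~20 epochs, then overfitting
    # makes it rise again. Replace with real training in practice.
    return abs(n_epochs - 20) / 20 + 0.1

def pick_best_epochs(grid):
    # Train once per candidate value and keep the lowest validation loss
    scores = {n: train_and_evaluate(n) for n in grid}
    return min(scores, key=scores.get)

best = pick_best_epochs([5, 10, 20, 40, 80])  # -> 20 with this fake curve
```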

Solution 2

It seems you might be using an old version of Keras; nb_epoch refers to the number of epochs and has been replaced by epochs.

If you look here you will see that it has been deprecated.

One epoch means that you have trained on the whole dataset (all records) once: if you have 384 records, one epoch means your model has been trained on all 384 of them. The batch size is the amount of data the model uses in a single iteration; with a batch size of 128, the model takes 128 records at once and does a single forward pass and backward pass (backpropagation). This is called one iteration. To break it down with this example: in the first iteration, your model takes 128 records [1st batch] out of the 384 and does a forward and backward pass. On the second batch, it takes records 129 to 256 and does another iteration. Then the 3rd batch, records 257 to 384, gives the 3rd iteration. At that point we say the model has completed one epoch. The number of epochs tells the model how many times to repeat this whole process before stopping.
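The arithmetic above generalizes: the number of iterations (weight updates) per epoch is the dataset size divided by the batch size, rounded up when the last batch is smaller. A minimal sketch:

```python
import math

# Iterations (weight updates) per epoch for a given dataset size and
# batch size; the final batch may be partial, hence the ceiling.
def iterations_per_epoch(n_records, batch_size):
    return math.ceil(n_records / batch_size)

iterations_per_epoch(384, 128)  # 3 batches: records 1-128, 129-256, 257-384
```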

There is no single correct way to choose the number of epochs; it is something done by experimenting. Usually, when the model stops learning (the loss is not going down anymore), you decrease the learning rate; if the loss does not go down after that and the results seem to be more or less what you expected, you stop at the epoch where the model stopped learning.
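The "decrease the learning rate when the loss stops going down" heuristic can be sketched in plain Python (Keras packages this idea as the ReduceLROnPlateau callback; the function and parameter names below are illustrative, not the Keras API):

```python
# Illustrative sketch: halve the learning rate when the loss has not
# improved by at least min_delta for `patience` consecutive epochs.
def step_learning_rate(lr, loss_history, patience=3, factor=0.5, min_delta=1e-4):
    if len(loss_history) <= patience:
        return lr  # not enough history yet to judge a plateau
    best_before = min(loss_history[:-patience])   # best loss before the window
    recent_best = min(loss_history[-patience:])   # best loss inside the window
    if recent_best > best_before - min_delta:     # no real improvement: plateau
        return lr * factor
    return lr

step_learning_rate(0.1, [1.0, 0.5, 0.5, 0.5, 0.5])  # plateau -> 0.05
```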

I hope it helps.

Solution 3

In neural networks, an epoch is equivalent to training the network on each data point once.

The number of epochs, nb_epoch, is hence how many times you re-use your data during training.
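This re-use of the data can be made concrete with a toy loop: the code below (purely illustrative, not Keras internals) just counts how many times each sample is seen, showing that every sample is used exactly once per epoch.

```python
# Toy illustration of "one epoch = one full pass over the data":
# count how many times each sample is visited across n_epochs.
def run_epochs(data, n_epochs):
    seen = {x: 0 for x in data}
    for _ in range(n_epochs):   # each outer iteration is one epoch
        for x in data:          # one full pass over the dataset
            seen[x] += 1        # a real loop would update weights here
    return seen

run_epochs(["a", "b", "c"], 4)  # every sample is seen exactly 4 times
```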

Author: Hajji Sofien

Updated on June 08, 2022

Comments

  • Hajji Sofien, almost 2 years ago

    I'm currently beginning to discover the Keras library for deep learning. It seems that in the training phase a certain number of epochs is chosen, but I don't know what assumption this choice is based on.

    In the MNIST dataset example the number of epochs chosen is 4:

    model.fit(X_train, Y_train,
              batch_size=128, nb_epoch=4,
              show_accuracy=True, verbose=1,
              validation_data=(X_test, Y_test))
    

    Could someone tell me why, and how we choose a correct number of epochs?