Batch_size in tensorflow? Understanding the concept


The batch size is the number of samples you feed into your network at once. For your encoder's input you specify that you enter an unspecified (None) number of samples with 41 values per sample.

The advantage of using None is that you can now train with batches of 100 samples at once (which is good for your gradient) and test with a batch of only one sample (a single sample for which you want a prediction).
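To make the open batch dimension concrete, here is a minimal NumPy sketch (not tflearn itself; the array names are just for illustration) showing that the same shape=[None, 41] input can accept a 100-sample training batch or a single prediction sample:

```python
import numpy as np

# A toy "dataset": 300 samples, each with 41 feature values.
data = np.random.rand(300, 41)

# With shape=[None, 41] the first (batch) dimension is left open,
# so the same network accepts any number of samples at once.
train_batch = data[:100]    # 100 samples per training step
single_sample = data[:1]    # one sample at prediction time

print(train_batch.shape)    # (100, 41)
print(single_sample.shape)  # (1, 41)
```

Either array matches the [None, 41] placeholder: only the first dimension differs.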

If you don't specify normalization per batch, there is no normalization per batch ;)

Hope I explained it well enough! If you have more questions feel free to ask them!

Author: WiLL_K

Updated on June 15, 2022

Comments

  • WiLL_K
    WiLL_K almost 2 years

My question is simple and straightforward. What does a batch size specify while training and predicting with a neural network? How can I visualize it so as to get a clear picture of how data is being fed to the network?

    Suppose I have an autoencoder

import tflearn

encoder = tflearn.input_data(shape=[None, 41])
encoder = tflearn.fully_connected(encoder, 41, activation='relu')
    

    and I am taking input from a CSV file with 41 features. As I understand it, when my batch size is 1 it will take each row of 41 features from the CSV file and feed it to the 41 neurons of the first layer.

    But when I increase the batch size to 100, how are the 41 features of each of the 100 samples going to be fed to this network?

    model.fit(test_set, test_labels_set, n_epoch=1, validation_set=(valid_set, valid_labels_set),
              run_id="auto_encoder", batch_size=100, show_metric=True, snapshot_epoch=False)
    

    Will there be any normalization of the batches, or some other operations on them?

    The number of epochs is the same in both cases.

  • WiLL_K
    WiLL_K about 7 years
    Thank you for the response, but my question is: if I take a batch size of 1, then the first row of 41 features goes into the NN, which trains and then backpropagates the weights for 1 epoch. As I increase the batch size to 100, will it take the 41 features, extract the necessary information, do this for 100 samples, and then backpropagate? Am I right here?
  • rmeertens
    rmeertens about 7 years
    It will always run over the whole test_set you put into the network. Let's say you have 300 samples... The difference between a batch size of 1 and 100 is that in the first case it backpropagates 300 times, and in the second case it does this 3 times. The second one is faster and more precise.
  • hYk
    hYk almost 6 years
    Is there any benefit to using a batch size?
  • akzhere
    akzhere over 4 years
    @rmeertens you clarified my doubts about the benefits of increasing the batch_size in terms of speed and precision, but are there any advantages to keeping the batch_size at a lower value?
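The arithmetic in rmeertens' 300-sample comment can be sketched in a few lines of Python (the function name is mine, just for illustration): the number of gradient updates per epoch is the number of samples divided by the batch size, rounded up.

```python
import math

def updates_per_epoch(n_samples, batch_size):
    """Number of backpropagation (gradient update) steps in one pass over the data."""
    return math.ceil(n_samples / batch_size)

# rmeertens' example: 300 samples
print(updates_per_epoch(300, 1))    # 300 updates per epoch
print(updates_per_epoch(300, 100))  # 3 updates per epoch
```

Same data either way; the batch size only changes how many samples each gradient update averages over, and therefore how many updates one epoch takes.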