What is the right batch normalization function in TensorFlow?


Solution 1

Just to add to the list, there are several more ways to do batch norm in TensorFlow:

  • tf.nn.batch_normalization is a low-level op. The caller is responsible for handling the mean and variance tensors themselves.
  • tf.nn.fused_batch_norm is another low-level op, similar to the previous one. The difference is that it's optimized for 4D input tensors, which is the usual case in convolutional neural networks, whereas tf.nn.batch_normalization accepts tensors of any rank greater than 1.
  • tf.layers.batch_normalization is a high-level wrapper over the previous ops. The biggest difference is that it takes care of creating and managing the running mean and variance tensors, and calls a fast fused op when possible. Usually, this should be your default choice.
  • tf.contrib.layers.batch_norm is the early implementation of batch norm, from before it graduated to the core API (i.e., tf.layers). Its use is not recommended because it may be dropped in future releases.
  • tf.nn.batch_norm_with_global_normalization is another deprecated op. It currently delegates the call to tf.nn.batch_normalization, but is likely to be dropped in the future.
  • Finally, there's also the Keras layer keras.layers.BatchNormalization, which in the case of the TensorFlow backend invokes tf.nn.batch_normalization.
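Under the hood, all of these ops compute the same transformation: normalize by per-feature batch statistics, then scale and shift by learned parameters. A minimal NumPy sketch of that formula (the function name and default epsilon here are illustrative, not TensorFlow's):

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-3):
    # Per-feature statistics over the batch dimension
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    # Normalize, then scale by gamma and shift by beta -- the same
    # formula tf.nn.batch_normalization applies
    x_hat = (x - mean) / np.sqrt(var + eps)
    return gamma * x_hat + beta

x = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])
y = batch_norm(x, gamma=np.ones(2), beta=np.zeros(2))
# Each output column now has (approximately) zero mean and unit variance
```

The low-level ops expect you to supply `mean` and `var` yourself; the high-level wrappers compute them from the batch (or from running averages at inference) for you.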

Solution 2

As shown in the docs, tf.contrib is a contribution module containing volatile or experimental code. When a function is complete, it is removed from this module. Right now there are two versions, in order to stay compatible with historical releases.

So tf.layers.batch_normalization is the recommended one.


Author: KimHee
Updated on December 29, 2020

Comments

  • KimHee over 3 years

    In TensorFlow 1.4, I found two functions that do batch normalization, and they look the same:

    1. tf.layers.batch_normalization (link)
    2. tf.contrib.layers.batch_norm (link)

    Which function should I use? Which one is more stable?

  • KimHee over 6 years
    Thanks, I accepted it. I just want to ask you about the mean and variance. How can I manage the mean and variance as you mentioned? Just set the is_training flag to False?
  • Maxim over 6 years
    By managing I mean the following: create variables of the right shape and accumulate the running mean/variance from the batches. It's a bit tedious, which is why it's easier to call the high-level function. In that case, you just need to set the training attribute.
  • Robert Lugg almost 6 years
    For future readers, this article nicely describes how to manually manage the mean/variance: r2rt.com/implementing-batch-normalization-in-tensorflow.html
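The bookkeeping Maxim describes, accumulating running statistics during training and reusing them at inference, can be sketched in NumPy as follows (the class name, momentum value, and epsilon are illustrative assumptions, not TensorFlow's internals):

```python
import numpy as np

class ManualBatchNorm:
    """Sketch of the running-statistics bookkeeping that high-level
    wrappers like tf.layers.batch_normalization automate."""

    def __init__(self, num_features, momentum=0.99, eps=1e-3):
        self.gamma = np.ones(num_features)   # learned scale
        self.beta = np.zeros(num_features)   # learned shift
        self.running_mean = np.zeros(num_features)
        self.running_var = np.ones(num_features)
        self.momentum = momentum
        self.eps = eps

    def __call__(self, x, training):
        if training:
            # Use batch statistics and fold them into the running averages
            mean, var = x.mean(axis=0), x.var(axis=0)
            self.running_mean = (self.momentum * self.running_mean
                                 + (1 - self.momentum) * mean)
            self.running_var = (self.momentum * self.running_var
                                + (1 - self.momentum) * var)
        else:
            # Inference: use only the accumulated running statistics
            mean, var = self.running_mean, self.running_var
        x_hat = (x - mean) / np.sqrt(var + self.eps)
        return self.gamma * x_hat + self.beta

rng = np.random.default_rng(0)
bn = ManualBatchNorm(2)
x = rng.normal(size=(8, 2))
y_train = bn(x, training=True)   # normalizes with batch stats, updates running stats
y_infer = bn(x, training=False)  # normalizes with running stats only
```

This is exactly the tedium the high-level wrapper hides: in training mode it uses batch statistics and updates the running averages, while in inference mode it reuses the accumulated averages, which is what flipping the training flag switches between.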