TensorFlow 2.0 dataset.__iter__() is only supported when eager execution is enabled


Solution 1

I fixed this by changing the train function to the following:

def train(model, dataset, optimizer):
    # enumerate yields (step, element); unpack the dataset element as a tuple
    for step, (x1, x2, y) in enumerate(dataset):
        with tf.GradientTape() as tape:
            left, right = model([x1, x2])
            loss = contrastive_loss(left, right, tf.cast(y, tf.float32))
        # compute gradients for this batch and apply them
        gradients = tape.gradient(loss, model.trainable_variables)
        optimizer.apply_gradients(zip(gradients, model.trainable_variables))

The two changes are removing the @tf.function decorator and fixing the enumeration so that the step counter is unpacked separately from the dataset element.
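As a minimal illustration of the enumeration fix, here is the same unpacking pattern on a plain Python list standing in for the dataset (the behavior of enumerate is the same):

```python
# Simulate a dataset that yields (x1, x2, y) tuples.
batches = [("img_a", "img_b", 0), ("img_c", "img_d", 1)]

# Correct: enumerate yields (counter, element) pairs, so the step
# counter must be unpacked separately from the element tuple.
steps = []
for step, (x1, x2, y) in enumerate(batches):
    steps.append((step, y))
print(steps)  # → [(0, 0), (1, 1)]

# Wrong: enumerate yields 2-tuples, so unpacking into three names fails.
try:
    for x1, x2, y in enumerate(batches):
        pass
except ValueError as e:
    print("unpacking error:", e)
```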

Solution 2

I fixed it by enabling eager execution after importing tensorflow:

import tensorflow as tf

tf.enable_eager_execution()

Reference: TensorFlow
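Note that `tf.enable_eager_execution()` is a TensorFlow 1.x API; in TensorFlow 2.x eager execution is on by default and the old call lives under the compatibility module. A version-safe sketch (assuming TensorFlow is installed) would be:

```python
import tensorflow as tf

# Eager execution is the default in TF 2.x; the explicit call is only
# needed in TF 1.x, where it is exposed under tf.compat.v1.
if not tf.executing_eagerly():
    tf.compat.v1.enable_eager_execution()

print(tf.executing_eagerly())  # expect True
```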

Solution 3

If you are using a Jupyter notebook, then after running

import tensorflow as tf

tf.enable_eager_execution()

you need to restart the kernel for the change to take effect.



Author: Steven Hickson

Updated on September 09, 2020

Comments

  • Steven Hickson
    Steven Hickson over 3 years

    I'm using the following custom training code in TensorFlow 2:

    def parse_function(filename, filename2):
        image = read_image(filename)
        def ret1(): return image, read_image(filename2), 0
        def ret2(): return image, preprocess(image), 1
        return tf.case({tf.less(tf.random.uniform([1])[0], tf.constant(0.5)): ret2}, default=ret1)
    
    dataset = tf.data.Dataset.from_tensor_slices((train,shuffled_train))
    dataset = dataset.shuffle(len(train))
    dataset = dataset.map(parse_function, num_parallel_calls=4)
    dataset = dataset.batch(1)
    dataset = dataset.prefetch(buffer_size=4)
    
    @tf.function
    def train(model, dataset, optimizer):
        for x1, x2, y in enumerate(dataset):
            with tf.GradientTape() as tape:
                left, right = model([x1, x2])
                loss = contrastive_loss(left, right, tf.cast(y, tf.float32))
            gradients = tape.gradient(loss, model.trainable_variables)
            optimizer.apply_gradients(zip(gradients, model.trainable_variables))
    
    siamese_net.compile(optimizer=tf.keras.optimizers.RMSprop(learning_rate=1e-3))
    train(siamese_net, dataset, tf.keras.optimizers.RMSprop(learning_rate=1e-3))
    

    This code gives me the error:

    dataset.__iter__() is only supported when eager execution is enabled.
    

    However, this is TensorFlow 2.0, so eager execution is enabled by default; tf.executing_eagerly() also returns True.

    • Sharky
      Sharky about 5 years
      I think you're using the wrong order in this line: for x1, x2, y in enumerate(dataset). The enumerate counter comes first, so in your case x1 would be the counter rather than a value from the dataset, and left, right = model([x1, x2]) would receive the wrong inputs.
    • Steven Hickson
      Steven Hickson about 5 years
      I'm not sure I understand your change. x1, x2, and y are two images and a label returned by the dataset. I used this as a reference: tensorflow.org/alpha/guide/keras/… I also added the parse_function
    • Sharky
      Sharky about 5 years
      Insert print(x1) right after for x1, x2, y in enumerate(dataset): you'll get 0 instead of the actual value from the dataset. In this case x1 is not a value, it's the enumerate counter.
    • Steven Hickson
      Steven Hickson about 5 years
      Okay two problems here. You are correct with the enumeration. It needs to be: for step, (x1, x2, y) in enumerate(dataset). Secondly, I have to remove the line @tf.function for some reason. I'm not sure why this can't be here since it's used a lot in the documentation examples I found but in this case it breaks the dataset iteration. It doesn't work at all with this line and just throws that error.
  • DecentGradient
    DecentGradient about 5 years
    Perhaps @tf.function would work without enumerating the dataset
  • Anshuman Kumar
    Anshuman Kumar over 4 years
    Note that you should enable it at the beginning of the program
  • Steven Hickson
    Steven Hickson almost 4 years
    As I mentioned in the post, this code was already executing eagerly as it was TF2 and tf.executing_eagerly() returned True. The documentation link you posted confirms this. It's possible this would help with another version of tensorflow though. My fix below ended up working for me.
  • 404pio
    404pio over 3 years
    @emrepun but what if I want to iterate over dataset not in eager mode?
  • emrepun
    emrepun over 3 years
    I believe the error indicates that such a thing is not possible without eager mode. Meaning, we cannot iterate with __iter__(), but perhaps there is another solution when eager mode is disabled; unfortunately I don't know one. @404pio
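DecentGradient's suggestion above (keeping @tf.function but iterating the dataset directly, without enumerate) matches how AutoGraph handles tf.data pipelines: a plain for-loop over a dataset is supported inside a traced function, while Python's enumerate on a dataset is not. A minimal sketch, assuming TensorFlow 2.x, with a toy dataset in place of the question's image pipeline:

```python
import tensorflow as tf

@tf.function
def sum_dataset(ds):
    # A direct for-loop over a tf.data.Dataset is converted by AutoGraph
    # into a graph-mode iteration; enumerate(ds) here would not be.
    total = tf.constant(0, dtype=tf.int64)
    for x in ds:
        total += x
    return total

print(int(sum_dataset(tf.data.Dataset.range(5)).numpy()))  # expect 10
```

If a step counter is needed inside @tf.function, tf.data offers Dataset.enumerate(), which bakes the counter into the dataset elements themselves.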