Tensorflow 2.0: Optimizer.minimize ('Adam' object has no attribute 'minimize')
Actually there is a difference. If you print both classes, you'll see:
import tensorflow as tf
from tensorflow.python.keras.optimizers import Adam
print(Adam)
print(tf.optimizers.Adam)
<class 'tensorflow.python.keras.optimizers.Adam'>
<class 'tensorflow.python.keras.optimizer_v2.adam.Adam'>
So in the first case, Adam inherits from a different class. It's meant to be used inside a Keras training loop and therefore doesn't have a minimize method. To make sure, let's list all of the class's methods:
import inspect
from tensorflow.python.keras.optimizers import Adam
print(inspect.getmembers(Adam(), predicate=inspect.ismethod))
The output confirms that this class doesn't even have a minimize method.
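By contrast, the v2 class does expose minimize. A quick check, assuming TensorFlow 2.x is installed:

```python
import tensorflow as tf

# The optimizer_v2 implementation ships with a minimize() method.
opt = tf.optimizers.Adam()
print(hasattr(opt, "minimize"))  # True
```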
Author: ikamen
Updated on June 25, 2022

Comments
ikamen, almost 2 years ago:
For my reinforcement learning application, I need to apply custom gradients / minimize a loss function that changes over time. According to the documentation, this should be possible with the Optimizer.minimize() function. However, my pip-installed version appears not to have this feature at all.
My code:
import tensorflow as tf
from tensorflow.python.keras.optimizers import Adam, SGD

print(tf.version.VERSION)
optim = Adam()
optim.minimize(loss, var_list=network.weights)
output:
2.0.0-alpha0
Traceback (most recent call last):
  File "/Users/ikkamens/Library/Preferences/PyCharmCE2018.3/scratches/testo.py", line 18, in <module>
    optim.minimize(loss, var_list=network.weights)
AttributeError: 'Adam' object has no attribute 'minimize'
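One way around the error is to use the v2 optimizer (tf.optimizers.Adam), whose minimize() in TF 2.x eager mode expects the loss as a callable rather than a precomputed tensor. A minimal sketch with a single hypothetical variable w standing in for network.weights:

```python
import tensorflow as tf

# Hypothetical scalar variable; in the question this would be
# the list of variables in network.weights.
w = tf.Variable(5.0)
opt = tf.optimizers.Adam(learning_rate=0.1)

# In eager mode, minimize() takes the loss as a zero-argument callable
# so it can recompute the loss (and its gradient) on each call.
loss = lambda: (w - 3.0) ** 2

for _ in range(200):
    opt.minimize(loss, var_list=[w])

# w should now be close to the minimizer at 3.0
print(w.numpy())
```

Passing the loss as a callable is what lets the optimizer re-evaluate a changing loss on every step, which matches the reinforcement learning use case in the question.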