"Could not interpret optimizer identifier" error in Keras

Solution 1

The reason is that you are using the tensorflow.python.keras API for the model and layers, but keras.optimizers for SGD. These are two different Keras implementations: the one bundled with TensorFlow and the standalone Keras package. They cannot work together; you have to change everything to one version, and then it should work.
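A minimal sketch of the consistent-imports fix, assuming TensorFlow 2.x (so everything comes from `tf.keras`; the layer sizes here are just placeholders mirroring the question's code):

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Activation
from tensorflow.keras.optimizers import SGD  # same implementation as the model

model = Sequential()
model.add(Dense(64, kernel_initializer='uniform', input_shape=(10,)))
model.add(Activation('softmax'))

# SGD now comes from the same tf.keras namespace as the model and layers,
# so optimizers.get() can interpret it at compile time.
model.compile(loss='mean_squared_error',
              optimizer=SGD(learning_rate=0.01),
              metrics=['accuracy'])
```

The same pattern works with standalone Keras, as long as the model, layers, and optimizer all come from `keras.*` rather than a mix of the two.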

Solution 2

I am a bit late here, but your issue is that you have mixed the TensorFlow Keras API and the standalone Keras API in your code. The optimizer and the model must come from the same Keras implementation. Use the Keras API for everything, as below:

from keras.models import Sequential
from keras.layers import Dense, Dropout, LSTM, BatchNormalization
from keras.callbacks import TensorBoard
from keras.callbacks import ModelCheckpoint
from keras.optimizers import Adam

# Set Model
model = Sequential()
model.add(LSTM(128, input_shape=(train_x.shape[1:]), return_sequences=True))
model.add(Dropout(0.2))
model.add(BatchNormalization())

# Set Optimizer
opt = Adam(lr=0.001, decay=1e-6)

# Compile model
model.compile(
    loss='sparse_categorical_crossentropy',
    optimizer=opt,
    metrics=['accuracy']
)

I have used Adam in this example; substitute whichever optimizer is relevant for you, following the same pattern.

Hope this helps.

Solution 3

This problem is mainly caused by mixing versions: the tensorflow.keras version may not be the same as the standalone keras version, which causes the error mentioned by @Priyanka.

For me, whenever this error arises, I pass in the name of the optimizer as a string, and the backend figures it out. For example, instead of

tf.keras.optimizers.Adam

or

keras.optimizers.Adam

I do

model.compile(optimizer='adam', loss=keras.losses.binary_crossentropy, metrics=['accuracy'])
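A short sketch contrasting the two forms, assuming TensorFlow 2.x: the string identifier builds the optimizer with default settings, while an optimizer instance is what you need when you want a non-default learning rate (as a commenter asks below):

```python
import tensorflow as tf

# A throwaway model just to demonstrate compile().
model = tf.keras.Sequential([tf.keras.layers.Dense(1, activation='sigmoid')])

# 1) String identifier: the backend resolves 'adam' to Adam with defaults.
model.compile(optimizer='adam', loss='binary_crossentropy')

# 2) Instance: required when configuring the optimizer, e.g. the learning rate.
#    Note you cannot write 'adam'(lr=...) -- strings are not callable.
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=1e-4),
              loss='binary_crossentropy')
```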

Solution 4

Since TensorFlow 2.0, the optimizer is available directly via the tensorflow namespace:

from tensorflow.keras.optimizers import SGD

This works well. The solution was verified with tensorflow==2.2.0rc2 and Keras==2.2.4 (on Win10).

Please also note that this version uses learning_rate as the parameter name, no longer lr.
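A quick illustration of the renamed argument, assuming TensorFlow 2.x (the momentum value is just an example setting):

```python
from tensorflow.keras.optimizers import SGD

# TF 2.x: the parameter is `learning_rate`; the old `lr` name is deprecated.
opt = SGD(learning_rate=0.01, momentum=0.9)
```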

Solution 5

For some libraries (e.g. keras_radam) you'll need to set up an environment variable before the import:

import os
os.environ['TF_KERAS'] = '1'

import tensorflow
import your_library

Updated on July 09, 2022

Comments

  • Admin
    Admin almost 2 years

    I got this error when I tried to modify the learning rate parameter of the SGD optimizer in Keras. Did I miss something in my code, or was my Keras not installed properly?

    Here is my code:

    from tensorflow.python.keras.models import Sequential
    from tensorflow.python.keras.layers import Dense, Flatten, GlobalAveragePooling2D, Activation
    import keras
    from keras.optimizers import SGD
    
    model = Sequential()
    model.add(Dense(64, kernel_initializer='uniform', input_shape=(10,)))
    model.add(Activation('softmax'))
    model.compile(loss='mean_squared_error', optimizer=SGD(lr=0.01), metrics=['accuracy'])
    

    and here is the error message:

    Traceback (most recent call last):
      File "C:\TensorFlow\Keras\ResNet-50\test_sgd.py", line 10, in
        model.compile(loss='mean_squared_error', optimizer=SGD(lr=0.01), metrics=['accuracy'])
      File "C:\Users\nsugiant\AppData\Local\Programs\Python\Python35\lib\site-packages\tensorflow\python\keras_impl\keras\models.py", line 787, in compile
        **kwargs)
      File "C:\Users\nsugiant\AppData\Local\Programs\Python\Python35\lib\site-packages\tensorflow\python\keras_impl\keras\engine\training.py", line 632, in compile
        self.optimizer = optimizers.get(optimizer)
      File "C:\Users\nsugiant\AppData\Local\Programs\Python\Python35\lib\site-packages\tensorflow\python\keras_impl\keras\optimizers.py", line 788, in get
        raise ValueError('Could not interpret optimizer identifier:', identifier)
    ValueError: ('Could not interpret optimizer identifier:', )

  • anothernode
    anothernode almost 6 years
    Welcome to Stack Overflow! Could you add a little bit of an explanation about why you think this would solve the problem stated in the question?
  • double-beep
    double-beep over 4 years
    Welcome to Stack Overflow! While this code may solve the question, including an explanation of how and why this solves the problem would really help to improve the quality of your post, and probably result in more up-votes. Remember that you are answering the question for readers in the future, not just the person asking now. Please edit your answer to add explanations and give an indication of what limitations and assumptions apply.
  • High Performance Rangsiman
    High Performance Rangsiman over 3 years
    Yes, you can pass a string name of the optimizer as the value of optimizer argument but using tf.keras.optimizers.Adam function is more flexible when you want to adjust optimizer setting for example learning rate.
  • Ayan Mitra
    Ayan Mitra over 3 years
    this doesn't work; you should give a working solution
  • EMT
    EMT about 3 years
    Just to add, in current TF version (2.4.1), optimizers have to be called as a function, not a parameter. So the exact code will be "tf.keras.optimizers.Adam()"
  • actnmk
    actnmk almost 3 years
    then how can I add lr with this syntax? I tried the following, but it did not work: model.compile(optimizer= 'adam'(lr=0.0001); loss= keras.losses.binary_crossentropy, metrics=['accuracy'])
  • PeJota
    PeJota over 2 years
    Alternatively, if you'd like to use tensorflow.keras instead of keras, try the example at the following link