"Could not interpret optimizer identifier" error in Keras "Could not interpret optimizer identifier" error in Keras python-3.x python-3.x

"Could not interpret optimizer identifier" error in Keras


The reason is that you are using the tensorflow.python.keras API for the model and layers, but keras.optimizers for SGD. These are two different Keras implementations: the one bundled with TensorFlow and the standalone Keras package, and they cannot be mixed. Change everything to one version and it should work.
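For example, a minimal sketch that keeps every import inside tensorflow.keras (the layer sizes and input shape here are assumptions for illustration, not from the original question):

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense
from tensorflow.keras.optimizers import SGD

# Model and optimizer now come from the same Keras implementation
model = Sequential([
    Dense(10, activation='relu', input_shape=(4,)),
    Dense(1),
])
opt = SGD(learning_rate=0.01)
model.compile(optimizer=opt, loss='mse')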


I am a bit late here, but your issue is that you have mixed the tensorflow.keras and keras APIs in your code. The optimizer and the model must come from the same Keras implementation. Use the Keras API for everything, as below:

from keras.models import Sequential
from keras.layers import Dense, Dropout, LSTM, BatchNormalization
from keras.callbacks import TensorBoard
from keras.callbacks import ModelCheckpoint
from keras.optimizers import Adam

# Set model
model = Sequential()
model.add(LSTM(128, input_shape=(train_x.shape[1:]), return_sequences=True))
model.add(Dropout(0.2))
model.add(BatchNormalization())

# Set optimizer
opt = Adam(lr=0.001, decay=1e-6)

# Compile model
model.compile(
    loss='sparse_categorical_crossentropy',
    optimizer=opt,
    metrics=['accuracy'])

I have used Adam in this example; please substitute your preferred optimizer following the same pattern.

Hope this helps.


This problem is mainly caused by mixing versions: the tensorflow.keras package is not the same as the standalone keras package, and combining them causes the error, as @Priyanka mentioned.
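To illustrate, a mixed-import sketch like the following (hypothetical code, for illustration only) is the kind of pattern that triggers the error:

from tensorflow.python.keras.models import Sequential  # TensorFlow's bundled Keras
from tensorflow.python.keras.layers import Dense
from keras.optimizers import SGD                       # standalone Keras

model = Sequential([Dense(1, input_shape=(4,))])
# Raises "Could not interpret optimizer identifier": the standalone-Keras
# optimizer object is not recognized by the tf.keras compile machinery
model.compile(optimizer=SGD(), loss='mse')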

For me, whenever this error arises, I pass the name of the optimizer as a string and let the backend figure it out. For example, instead of

tf.keras.optimizers.Adam

or

keras.optimizers.Adam

I do

model.compile(optimizer='adam', loss=keras.losses.binary_crossentropy, metrics=['accuracy'])
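Note that a string identifier builds the optimizer with its default hyperparameters. If you need a custom learning rate, you still have to pass an optimizer instance, taken from the same package as your model (a minimal sketch, assuming a tf.keras model):

import tensorflow as tf

model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=0.001),  # same package as the model
    loss='binary_crossentropy',
    metrics=['accuracy'])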