
Keras accuracy does not change


The most likely reason is that the optimizer is not suited to your dataset. Here is a list of Keras optimizers from the documentation.

I recommend you first try SGD with default parameter values. If it still doesn't work, divide the learning rate by 10. Do that a few times if necessary. If your learning rate reaches 1e-6 and it still doesn't work, then you have another problem.

In summary, replace this line:

model.compile(loss = "categorical_crossentropy", optimizer = "adam")

with this:

from keras.optimizers import SGD

opt = SGD(lr=0.01)
model.compile(loss = "categorical_crossentropy", optimizer = opt)

and change the learning rate a few times if it doesn't work.

If it was the problem, you should see the loss getting lower after just a few epochs.
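Why dividing the learning rate helps can be seen on a toy problem. This is an illustrative numpy sketch (not Keras, and not from the answer above): plain gradient descent on f(x) = x², where a too-large learning rate diverges and smaller ones converge.

```python
import numpy as np

def gradient_descent(lr, steps=50, x0=5.0):
    """Minimize f(x) = x**2 with plain gradient descent (toy stand-in for SGD)."""
    x = x0
    for _ in range(steps):
        x -= lr * 2 * x  # gradient of x**2 is 2x
    return x

# A too-large learning rate oscillates or diverges; a smaller one converges.
for lr in (1.1, 0.5, 0.1, 0.01):
    print(f"lr={lr}: final x = {gradient_descent(lr):.4f}")
```

The same logic motivates the advice above: if training stalls or the loss blows up, try dividing the learning rate by 10 and re-running.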


Another solution that I do not see mentioned here, but which caused a similar problem for me, was the activation function of the last neuron, especially if it is relu rather than something bounded like sigmoid. (relu is technically also non-linear, but it is unbounded and outputs exactly zero for negative inputs, which makes a single relu unit a poor choice for a classifier output.)

In other words, it might help to use a bounded activation function such as sigmoid in the last layer.

Last layer:

model.add(keras.layers.Dense(1, activation='relu'))

Output:

7996/7996 [==============================] - 1s 76us/sample - loss: 6.3474 - accuracy: 0.5860
Epoch 2/30
7996/7996 [==============================] - 0s 58us/sample - loss: 6.3473 - accuracy: 0.5860
Epoch 3/30
7996/7996 [==============================] - 0s 58us/sample - loss: 6.3473 - accuracy: 0.5860
Epoch 4/30
7996/7996 [==============================] - 0s 57us/sample - loss: 6.3473 - accuracy: 0.5860
Epoch 5/30
7996/7996 [==============================] - 0s 58us/sample - loss: 6.3473 - accuracy: 0.5860
Epoch 6/30
7996/7996 [==============================] - 0s 60us/sample - loss: 6.3473 - accuracy: 0.5860
Epoch 7/30
7996/7996 [==============================] - 0s 57us/sample - loss: 6.3473 - accuracy: 0.5860
Epoch 8/30
7996/7996 [==============================] - 0s 57us/sample - loss: 6.3473 - accuracy: 0.5860

Now I used a sigmoid activation function instead:

model.add(keras.layers.Dense(1, activation='sigmoid'))

Output:

7996/7996 [==============================] - 1s 74us/sample - loss: 0.7663 - accuracy: 0.5899
Epoch 2/30
7996/7996 [==============================] - 0s 59us/sample - loss: 0.6243 - accuracy: 0.5860
Epoch 3/30
7996/7996 [==============================] - 0s 56us/sample - loss: 0.5399 - accuracy: 0.7580
Epoch 4/30
7996/7996 [==============================] - 0s 56us/sample - loss: 0.4694 - accuracy: 0.7905
Epoch 5/30
7996/7996 [==============================] - 0s 57us/sample - loss: 0.4363 - accuracy: 0.8040
Epoch 6/30
7996/7996 [==============================] - 0s 60us/sample - loss: 0.4139 - accuracy: 0.8099
Epoch 7/30
7996/7996 [==============================] - 0s 58us/sample - loss: 0.3967 - accuracy: 0.8228
Epoch 8/30
7996/7996 [==============================] - 0s 61us/sample - loss: 0.3826 - accuracy: 0.8260

This is not directly a solution to the original question, but as this answer is #1 on Google when searching for this problem, it might benefit someone.
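The difference between the two output activations can be checked numerically. This is a small numpy sketch (an illustration I am adding, not part of the answer above) showing that relu clamps negative pre-activations to zero and is unbounded above, while sigmoid squashes every input into (0, 1), which is what a single-unit binary classifier output should produce:

```python
import numpy as np

def relu(z):
    return np.maximum(0.0, z)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

z = np.array([-3.0, -0.5, 0.0, 2.0, 10.0])  # example pre-activations

# relu outputs exactly 0 for all negative inputs (zero gradient there,
# so the unit can get stuck) and is unbounded above.
print(relu(z))

# sigmoid maps every input into (0, 1), matching what
# binary cross-entropy expects from the last layer.
print(sigmoid(z))
```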


If the accuracy is not changing, the optimizer has likely found a local minimum of the loss. This may be an undesirable minimum: one common case is that the model always predicts the class with the most data points. Use class weighting to avoid this minimum.

from sklearn.utils.class_weight import compute_class_weight

# 'balanced' weights each class inversely proportional to its frequency
classWeight = compute_class_weight(class_weight='balanced', classes=outputLabels, y=outputs)
classWeight = dict(enumerate(classWeight))

# note: nb_epoch and show_accuracy are from old Keras versions; current Keras
# uses epochs, and accuracy is reported via the metrics passed to model.compile
model.fit(X_train, y_train, batch_size=batch_size, epochs=nb_epochs, verbose=2,
          validation_data=(X_test, y_test), class_weight=classWeight)
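To see what 'balanced' actually computes, here is a self-contained sketch on toy labels (the 8/2 split is made up for illustration): each class gets weight n_samples / (n_classes * count), so the minority class is upweighted.

```python
import numpy as np
from sklearn.utils.class_weight import compute_class_weight

# Toy imbalanced labels: 8 samples of class 0, 2 samples of class 1.
y = np.array([0] * 8 + [1] * 2)

weights = compute_class_weight(class_weight='balanced', classes=np.unique(y), y=y)
class_weight = dict(enumerate(weights))

# 'balanced' gives each class n_samples / (n_classes * count):
# class 0 -> 10 / (2 * 8) = 0.625, class 1 -> 10 / (2 * 2) = 2.5
print(class_weight)
```

With these weights, mistakes on the minority class cost four times as much, so "always predict the majority class" is no longer a cheap minimum for the loss.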