
Custom loss function in Keras


All you have to do is define a function for it, using Keras backend functions for the calculations. The function must take the true values and the model's predicted values as arguments.

Now, since I'm not sure what g, q, x and y are in your function, I'll just create a basic example here without worrying about what it means or whether it's actually a useful function:

import keras.backend as K

def customLoss(yTrue, yPred):
    return K.sum(K.log(yTrue) - K.log(yPred))

All backend functions can be seen here.

After that, compile your model using that function instead of a regular one:

model.compile(loss=customLoss, optimizer=.....)
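As a quick sanity check (not part of the original answer), you can call the loss directly on sample tensors before compiling. With TF2's bundled backend (`tensorflow.keras.backend`, which works the same as standalone `keras.backend`), identical inputs should give a loss of zero, since the log terms cancel:

```python
import tensorflow.keras.backend as K  # standalone keras.backend works the same

def customLoss(yTrue, yPred):
    return K.sum(K.log(yTrue) - K.log(yPred))

# Identical true/predicted values: log(yTrue) - log(yPred) is zero everywhere.
yTrue = K.constant([[0.5], [0.25], [0.125]])
loss = customLoss(yTrue, yTrue)
print(float(loss))  # 0.0
```

Note the inputs must be strictly positive, or `K.log` returns `-inf`/`nan`.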


Since Keras is no longer multi-backend (source), operations for custom losses should be written directly in TensorFlow rather than through the backend module.

You can make a custom loss in TensorFlow by writing a function that takes y_true and y_pred as arguments, as suggested in the documentation:

import tensorflow as tf

x = tf.random.uniform(minval=0, maxval=1, shape=(10, 1), dtype=tf.float32)
y = tf.random.uniform(minval=0, maxval=1, shape=(10, 1), dtype=tf.float32)

def custom_mse(y_true, y_pred):
    squared_difference = tf.square(y_true - y_pred)
    return tf.reduce_mean(squared_difference, axis=-1)

custom_mse(x, y)
<tf.Tensor: shape=(10,), dtype=float32, numpy=
array([0.30084264, 0.03535452, 0.10345092, 0.28552982, 0.02426687,
       0.04410492, 0.01701574, 0.55496216, 0.74927425, 0.05747304],
      dtype=float32)>
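As a sanity check (not from the original answer), this per-sample reduction matches TensorFlow's built-in `tf.keras.losses.mean_squared_error`, which also averages over the last axis:

```python
import tensorflow as tf

def custom_mse(y_true, y_pred):
    squared_difference = tf.square(y_true - y_pred)
    return tf.reduce_mean(squared_difference, axis=-1)

x = tf.random.uniform(minval=0, maxval=1, shape=(10, 1), dtype=tf.float32)
y = tf.random.uniform(minval=0, maxval=1, shape=(10, 1), dtype=tf.float32)

ours = custom_mse(x, y)
builtin = tf.keras.losses.mean_squared_error(x, y)  # same axis=-1 reduction
print(bool(tf.reduce_all(tf.abs(ours - builtin) < 1e-6)))  # True
```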

Then you can set your custom loss in model.compile(). Here's a complete example:

x = tf.random.uniform(minval=0, maxval=1, shape=(1000, 4), dtype=tf.float32)
y = tf.multiply(tf.reduce_sum(x, axis=-1), 5)  # y is a function of x

model = tf.keras.Sequential([
    tf.keras.layers.Dense(16, input_shape=[4], activation='relu'),
    tf.keras.layers.Dense(32, activation='relu'),
    tf.keras.layers.Dense(1)])

model.compile(loss=custom_mse, optimizer='adam')

history = model.fit(x, y, epochs=5)
Train on 1000 samples
Epoch 1/5
1000/1000 [==============================] - 0s 371us/sample - loss: 105.6800
Epoch 2/5
1000/1000 [==============================] - 0s 35us/sample - loss: 98.8208
Epoch 3/5
1000/1000 [==============================] - 0s 34us/sample - loss: 82.7988
Epoch 4/5
1000/1000 [==============================] - 0s 33us/sample - loss: 52.4585
Epoch 5/5
1000/1000 [==============================] - 0s 34us/sample - loss: 17.8190
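If your real loss depends on extra constants (like the g and q in the question), a common pattern — sketched here, not part of the original answers — is to wrap the loss in an outer function that captures those parameters, since Keras only ever calls the loss with `(y_true, y_pred)`:

```python
import tensorflow as tf

def make_weighted_mse(weight):
    # `weight` is a hypothetical extra parameter captured by the closure.
    def loss(y_true, y_pred):
        return weight * tf.reduce_mean(tf.square(y_true - y_pred), axis=-1)
    return loss

model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=[4])])
model.compile(loss=make_weighted_mse(2.0), optimizer='adam')  # pass the closure
```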