How do you create a custom activation function with Keras?

How do you create a custom activation function with Keras?


Credit goes to this GitHub issue comment by Ritchie Ng.

# Creating a model
from keras.models import Sequential
from keras.layers import Dense

# Custom activation function
from keras.layers import Activation
from keras import backend as K
from keras.utils.generic_utils import get_custom_objects

def custom_activation(x):
    return (K.sigmoid(x) * 5) - 1

get_custom_objects().update({'custom_activation': Activation(custom_activation)})

# Usage
model = Sequential()
model.add(Dense(32, input_dim=784))
model.add(Activation(custom_activation, name='SpecialActivation'))
print(model.summary())

Please keep in mind that you have to import this function when you save and restore the model. See the note in keras-contrib.
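As an illustration of that note, here is a minimal sketch of restoring such a model; the file name my_model.h5 is only a placeholder, and the custom_objects mapping mirrors the registration used above:

from keras.models import load_model
from keras.layers import Activation
from keras import backend as K

def custom_activation(x):
    return (K.sigmoid(x) * 5) - 1

# 'my_model.h5' is a hypothetical saved model; passing custom_objects lets
# Keras resolve the custom activation while deserializing the architecture.
model = load_model('my_model.h5',
                   custom_objects={'custom_activation': Activation(custom_activation)})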


Slightly simpler than Martin Thoma's answer: you can just create a custom element-wise back-end function and use it as a parameter. You still need to import this function before loading your model.

from keras import backend as K
from keras.layers import Dense

def custom_activation(x):
    return (K.sigmoid(x) * 5) - 1

# Pass the function directly as the layer's activation.
model.add(Dense(32, activation=custom_activation))


Let's say you would like to add swish or gelu to Keras. The previous methods are nice inline insertions, but you could also insert them into the set of Keras activation functions, so that you can call your custom function the same way you would call ReLU. I tested this with Keras 2.2.2 (any v2 should do). Append the definition of your custom function to the file $HOME/anaconda2/lib/python2.7/site-packages/keras/activations.py (the path can differ depending on your Python and Anaconda versions).

In the Keras internals:

$HOME/anaconda2/lib/python2.7/site-packages/keras/activations.py

def swish(x, alpha=1.0, beta=1.0):
    # alpha and beta were undefined in the original snippet; give them
    # defaults (or hard-code the values you want) so the function runs.
    return K.sigmoid(beta * x) * alpha * x

Then in your Python file:

$HOME/Documents/neural_nets.py

from keras.models import Sequential
from keras.layers import Activation

model = Sequential()
model.add(Activation('swish'))
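Once the name is registered in activations.py, it should work anywhere Keras accepts an activation string, just like 'relu'; the layer sizes below are only placeholders:

from keras.models import Sequential
from keras.layers import Dense

model = Sequential()
# 'swish' is now resolved through keras.activations, the same way 'relu' is.
model.add(Dense(32, input_dim=784, activation='swish'))
model.add(Dense(10, activation='softmax'))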