
How to use advanced activation layers in Keras?


The correct way to use advanced activations like PReLU is to add them to the model as standalone layers with the add() method, rather than wrapping them in an Activation class. Example:

from keras.models import Sequential
from keras.layers import Dense, PReLU

model = Sequential()
# Keras 2 spelling; in Keras 1 this was PReLU(init='zero', weights=None)
act = PReLU(alpha_initializer='zeros')
model.add(Dense(64, input_dim=14, kernel_initializer='uniform'))
model.add(act)
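As a quick sanity check (the data here is random and purely illustrative), you can compile the model and run one training step to confirm the PReLU layer is wired in as its own layer with trainable parameters:

import numpy as np

x = np.random.rand(32, 14)
y = np.random.rand(32, 64)

model.compile(optimizer='sgd', loss='mse')
model.fit(x, y, epochs=1, verbose=0)
model.summary()  # PReLU appears as its own layer with trainable alphas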


If you are using the Model API in Keras, you can pass the advanced activation directly as a layer's activation argument. Here's an example:

from keras.models import Model
from keras.layers import Dense, Input
# using PReLU
from keras.layers.advanced_activations import PReLU

# model definition
# encoder
inp = Input(shape=(16,))
lay = Dense(64, kernel_initializer='uniform', activation=PReLU(),
            name='encoder')(inp)
# decoder
out = Dense(2, kernel_initializer='uniform', activation=PReLU(),
            name='decoder')(lay)

# build the model
model = Model(inputs=inp, outputs=out, name='cae')
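One caveat worth checking with this pattern: whether a layer instance passed as the activation argument gets its weights registered as trainable has varied across Keras versions, so it is worth confirming that the PReLU alphas actually show up in the model's parameters:

model.summary()  # parameter counts should include the PReLU alphas
for w in model.trainable_weights:
    print(w.name, w.shape)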


For the Keras functional API, I think the correct way to combine Dense and PReLU (or any other advanced activation) is to use it like this:

# "k" is keras; focus_lr, enc_bidi_tns and hidden_size come from
# earlier parts of this model and are shown only for context
focus_tns = focus_lr(enc_bidi_tns)

enc_dense_lr = k.layers.Dense(units=int(hidden_size))
enc_dense_tns = k.layers.PReLU()(enc_dense_lr(focus_tns))

dropout_lr = k.layers.Dropout(0.2)
dropout_tns = dropout_lr(enc_dense_tns)

enc_dense_lr2 = k.layers.Dense(units=int(hidden_size / 4))
enc_dense_tns2 = k.layers.PReLU()(enc_dense_lr2(dropout_tns))

Of course, one should parametrize the layers according to the problem at hand.
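As one example of such parametrization (the shapes and layer sizes below are illustrative assumptions, not taken from the answers above), PReLU accepts a shared_axes argument that lets you share the learned slope across spatial dimensions in convolutional models:

from keras.models import Model
from keras.layers import Input, Conv2D, Flatten, Dense, PReLU

# hypothetical image classifier; all sizes are made up for illustration
inp = Input(shape=(32, 32, 3))
x = Conv2D(16, (3, 3), padding='same')(inp)
# share alpha across the spatial axes: one learned slope per channel
# instead of one per pixel
x = PReLU(shared_axes=[1, 2])(x)
x = Flatten()(x)
out = Dense(10, activation='softmax')(x)

model = Model(inputs=inp, outputs=out)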