How to add and remove new layers in keras after loading weights?



You can take the output of the last model and create a new model. The lower layers remain the same.

model.summary()
model.layers.pop()
model.layers.pop()
model.summary()

x = MaxPooling2D()(model.layers[-1].output)
o = Activation('sigmoid', name='loss')(x)
model2 = Model(inputs=in_img, outputs=[o])
model2.summary()
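A runnable sketch of this approach (using tf.keras; the base architecture, layer sizes, and `in_img` definition here are made up for illustration):

```python
import tensorflow as tf
from tensorflow.keras.layers import Input, Conv2D, MaxPooling2D, Activation
from tensorflow.keras.models import Model

# A small stand-in for the loaded model (sizes are arbitrary).
in_img = Input(shape=(8, 8, 1))
x = Conv2D(4, (3, 3), padding='same')(in_img)
x = Conv2D(4, (3, 3), padding='same')(x)
model = Model(inputs=in_img, outputs=x)

# Instead of mutating model.layers, branch from the layer you want to keep
# and build a new head on top of it.
x = MaxPooling2D()(model.layers[-1].output)
o = Activation('sigmoid', name='loss')(x)
model2 = Model(inputs=model.input, outputs=o)

print(model2.output_shape)
```

The original lower layers are shared, not copied, so any loaded weights carry over to `model2`.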

Check How to use models from keras.applications for transfer learning?

Update on Edit:

The new error occurs because you are trying to create the new model from the global in_img, which was never actually used in the previous model creation: there you defined a local in_img. So the global in_img is not connected to the upper layers in the symbolic graph, and this has nothing to do with loading weights.

To resolve this more robustly, you should instead use model.input to reference the input.

model3 = Model(inputs=model2.input, outputs=[o])
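A minimal sketch of why model.input avoids the scoping problem (tf.keras; the layer sizes are made up):

```python
import tensorflow as tf
from tensorflow.keras.layers import Input, Dense, Activation
from tensorflow.keras.models import Model

inp = Input(shape=(10,))
h = Dense(5)(inp)
model2 = Model(inputs=inp, outputs=h)

# model2.input is the symbolic tensor the model was actually built on,
# so a new head wired from model2.output is guaranteed to be connected to it.
o = Activation('sigmoid')(model2.output)
model3 = Model(inputs=model2.input, outputs=o)

print(model3.output_shape)
```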


Another way to do it

from keras.models import Model

layer_name = 'relu_conv2'
model2 = Model(inputs=model1.input, outputs=model1.get_layer(layer_name).output)
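A runnable sketch of this get_layer trick (tf.keras; the model and the layer name 'relu_conv2' are made up to match the snippet above):

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras.layers import Input, Conv2D
from tensorflow.keras.models import Model

inp = Input(shape=(8, 8, 1))
x = Conv2D(4, (3, 3), padding='same', activation='relu', name='relu_conv1')(inp)
x = Conv2D(8, (3, 3), padding='same', activation='relu', name='relu_conv2')(x)
x = Conv2D(2, (3, 3), padding='same', name='head')(x)
model1 = Model(inputs=inp, outputs=x)

# Truncate at a named layer: everything after 'relu_conv2' is dropped.
layer_name = 'relu_conv2'
model2 = Model(inputs=model1.input, outputs=model1.get_layer(layer_name).output)

features = model2.predict(np.zeros((1, 8, 8, 1)), verbose=0)
print(features.shape)
```

This is handy as a feature extractor: model2 shares weights with model1 but stops at the named layer.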


As of Keras 2.3.1 and TensorFlow 2.0, model.layers.pop() does not work as intended (see the related GitHub issue). Two options were suggested instead.

One option is to recreate the model and copy the layers. For instance, if you want to remove the last layer and add another one, you can do:

model = Sequential()
for layer in source_model.layers[:-1]:  # go through until last layer
    model.add(layer)
model.add(Dense(3, activation='softmax'))
model.summary()
model.compile(optimizer='adam', loss='categorical_crossentropy')
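End to end, the copy-the-layers option can be sketched like this (tf.keras; the source_model architecture and the 3-class head are assumptions):

```python
import tensorflow as tf
from tensorflow.keras.layers import Dense
from tensorflow.keras.models import Sequential

# Stand-in for a trained model whose head we want to replace.
source_model = Sequential([
    Dense(8, activation='relu', input_shape=(4,)),
    Dense(6, activation='relu'),
    Dense(2, activation='softmax'),  # old 2-class head
])

# Copy every layer except the old head, then add a new one.
model = Sequential()
for layer in source_model.layers[:-1]:
    model.add(layer)
model.add(Dense(3, activation='softmax'))  # new 3-class head
model.compile(optimizer='adam', loss='categorical_crossentropy')

print(model.output_shape)
```

Note the copied layers are the same objects as in source_model, so their weights (and any further training updates) are shared between the two models.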

Another option is to use the functional model:

predictions = Dense(3, activation='softmax')(source_model.layers[-2].output)
model = Model(inputs=source_model.input, outputs=predictions)
model.compile(optimizer='adam', loss='categorical_crossentropy')
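Filling in an assumed source_model, the functional option can be sketched end to end (tf.keras; layer sizes are made up):

```python
import tensorflow as tf
from tensorflow.keras.layers import Input, Dense
from tensorflow.keras.models import Model

inp = Input(shape=(4,))
h = Dense(6, activation='relu')(inp)
old_head = Dense(2, activation='softmax')(h)
source_model = Model(inputs=inp, outputs=old_head)

# Branching from layers[-2].output leaves the old head out of the new graph.
predictions = Dense(3, activation='softmax')(source_model.layers[-2].output)
model = Model(inputs=source_model.input, outputs=predictions)
model.compile(optimizer='adam', loss='categorical_crossentropy')

print(model.output_shape)
```

Because the old softmax layer is not on the path from input to predictions, it simply does not appear in the new model.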

model.layers[-1].output is the last layer's output, i.e. the model's final output. So in your code you didn't actually remove any layers; you added another head/path on top of the existing one. To drop the last layer, branch from model.layers[-2].output instead.