How to add regularizations in TensorFlow?
As you say in the second point, using the regularizer argument is the recommended way. You can use it in get_variable, or set it once in your variable_scope to have all your variables regularized.
The losses are collected in the graph, and you need to manually add them to your cost function like this:
reg_losses = tf.get_collection(tf.GraphKeys.REGULARIZATION_LOSSES)
reg_constant = 0.01  # Choose an appropriate one.
loss = my_normal_loss + reg_constant * sum(reg_losses)
Hope that helps!
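To make the arithmetic of that last line concrete, here is a minimal plain-Python sketch. The numbers are hypothetical stand-ins: reg_losses plays the role of the values collected under tf.GraphKeys.REGULARIZATION_LOSSES, and my_normal_loss is whatever data loss you already have.

```python
# Hypothetical numbers, just to show how the pieces combine.
reg_losses = [0.5, 1.25]   # stand-ins for the collected per-variable penalties
reg_constant = 0.01        # same role as in the snippet above
my_normal_loss = 2.0       # stand-in for the ordinary data loss

loss = my_normal_loss + reg_constant * sum(reg_losses)
# 2.0 + 0.01 * (0.5 + 1.25) = 2.0175
```

The key point is that the collection only gathers the penalty tensors; scaling them and adding them to the objective is your responsibility.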
A few aspects of the existing answer were not immediately clear to me, so here is a step-by-step guide:
Define a regularizer. This is where the regularization constant can be set, e.g.:
regularizer = tf.contrib.layers.l2_regularizer(scale=0.1)
Create variables via:
weights = tf.get_variable(
    name="weights",
    regularizer=regularizer,
    ...
)
Equivalently, variables can be created via the regular weights = tf.Variable(...) constructor, followed by tf.add_to_collection(tf.GraphKeys.REGULARIZATION_LOSSES, weights).
Define some loss term and add the regularization term:
reg_variables = tf.get_collection(tf.GraphKeys.REGULARIZATION_LOSSES)
reg_term = tf.contrib.layers.apply_regularization(regularizer, reg_variables)
loss += reg_term
Note: It looks like tf.contrib.layers.apply_regularization is implemented as an AddN, so it is more or less equivalent to sum(reg_variables).
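The steps above can be sketched in plain Python (the TF1.x contrib APIs are hard to run today, so this mimics the behaviour rather than calling TensorFlow). l2_regularizer mirrors tf.contrib.layers.l2_regularizer, whose penalty is scale * sum(w**2) / 2 (via tf.nn.l2_loss), and apply_regularization mirrors the AddN-style sum from the note; the weight values are made up for illustration.

```python
def l2_regularizer(scale):
    # Mirrors tf.contrib.layers.l2_regularizer: penalty = scale * sum(w**2) / 2
    def penalty(weights):
        return scale * sum(w * w for w in weights) / 2.0
    return penalty

def apply_regularization(regularizer, variables):
    # Mirrors the AddN behaviour: the sum of the per-variable penalties.
    return sum(regularizer(v) for v in variables)

regularizer = l2_regularizer(scale=0.1)
# Two made-up "variables" standing in for the collected reg_variables.
reg_term = apply_regularization(regularizer, [[1.0, 2.0], [3.0]])
# 0.1 * (1 + 4) / 2  +  0.1 * 9 / 2  =  0.25 + 0.45  =  0.7
```

This is only a sketch of the computation; in real TF1.x code the regularizer runs on tensors and the result is itself a tensor added to your loss.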
I'll provide a simple correct answer since I didn't find one. You need two simple steps; the rest is done by TensorFlow magic:
Add regularizers when creating variables or layers:
tf.layers.dense(x, kernel_regularizer=tf.contrib.layers.l2_regularizer(0.001))
# or
tf.get_variable('a', regularizer=tf.contrib.layers.l2_regularizer(0.001))
Add the regularization term when defining loss:
loss = ordinary_loss + tf.losses.get_regularization_loss()
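The "magic" here is just the graph collection again: a layer built with a kernel_regularizer registers its penalty, and tf.losses.get_regularization_loss sums everything registered. A minimal plain-Python sketch of that mechanism (all names and weight values below are hypothetical stand-ins, not real TF APIs):

```python
REGULARIZATION_LOSSES = []  # stand-in for the graph's REGULARIZATION_LOSSES collection

def l2(scale):
    # Same penalty shape as tf.contrib.layers.l2_regularizer: scale * sum(w**2) / 2
    return lambda ws: scale * sum(w * w for w in ws) / 2.0

def dense(weights, kernel_regularizer=None):
    # When a layer is built with a regularizer, TF appends the penalty
    # to the collection behind the scenes; this mimics that side effect.
    if kernel_regularizer is not None:
        REGULARIZATION_LOSSES.append(kernel_regularizer(weights))

def get_regularization_loss():
    # Mirrors tf.losses.get_regularization_loss: sum of all collected penalties.
    return sum(REGULARIZATION_LOSSES)

# Two hypothetical layers' worth of weights.
dense([1.0, 2.0], kernel_regularizer=l2(0.001))
dense([3.0], kernel_regularizer=l2(0.001))

ordinary_loss = 2.0  # made-up data loss
loss = ordinary_loss + get_regularization_loss()
# 2.0 + 0.001*(5/2) + 0.001*(9/2) = 2.007
```

The design point: because registration happens at layer/variable creation, the final loss line stays a one-liner no matter how many regularized layers the model has.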