How to add regularizations in TensorFlow?


As you say in the second point, using the regularizer argument is the recommended way. You can pass it to get_variable, or set it once in your variable_scope to have all variables in that scope regularized.

The losses are collected in the graph, and you need to manually add them to your cost function like this:

  reg_losses = tf.get_collection(tf.GraphKeys.REGULARIZATION_LOSSES)
  reg_constant = 0.01  # Choose an appropriate one.
  loss = my_normal_loss + reg_constant * sum(reg_losses)

Hope that helps!
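The collection mechanism is easy to mimic outside TensorFlow. Here is a framework-free sketch in plain Python (all names are illustrative, not TensorFlow API) of what "collect the losses and add them to the cost" amounts to:

```python
# Framework-free sketch of the REGULARIZATION_LOSSES pattern.
# All names here are illustrative, not TensorFlow API.

REGULARIZATION_LOSSES = []  # plays the role of the graph collection

def l2_penalty(weights):
    """L2 penalty in the tf.nn.l2_loss convention: sum of squares over two."""
    return sum(w * w for w in weights) / 2.0

def create_variable(values, regularizer=None):
    """Mimics get_variable(..., regularizer=...): registering a variable
    with a regularizer appends its penalty to the collection."""
    if regularizer is not None:
        REGULARIZATION_LOSSES.append(regularizer(values))
    return values

weights = create_variable([3.0, 4.0], regularizer=l2_penalty)

my_normal_loss = 1.0
reg_constant = 0.01
loss = my_normal_loss + reg_constant * sum(REGULARIZATION_LOSSES)
print(loss)  # 1.0 + 0.01 * (9 + 16) / 2 = 1.125
```

The point is only the shape of the pattern: variable creation registers a penalty as a side effect, and the final loss sums the collection once.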


A few aspects of the existing answer were not immediately clear to me, so here is a step-by-step guide:

  1. Define a regularizer. This is where the regularization constant can be set, e.g.:

    regularizer = tf.contrib.layers.l2_regularizer(scale=0.1)
  2. Create variables via:

        weights = tf.get_variable(
            name="weights",
            regularizer=regularizer,
            ...
        )

    Equivalently, variables can be created via the regular weights = tf.Variable(...) constructor, followed by tf.add_to_collection(tf.GraphKeys.REGULARIZATION_LOSSES, weights).

  3. Define some loss term and add the regularization term:

    reg_variables = tf.get_collection(tf.GraphKeys.REGULARIZATION_LOSSES)
    reg_term = tf.contrib.layers.apply_regularization(regularizer, reg_variables)
    loss += reg_term

    Note: It looks like tf.contrib.layers.apply_regularization is implemented as an AddN, so more or less equivalent to sum(reg_variables).
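For intuition, the arithmetic behind these steps can be sketched without TensorFlow. Assuming the L2 regularizer computes scale * sum(w²) / 2 (the tf.nn.l2_loss convention), apply_regularization reduces to a plain sum of per-variable penalties, consistent with the AddN note above (helper names here are hypothetical):

```python
# Plain-Python sketch of l2_regularizer + apply_regularization.
# Hypothetical helpers; only the arithmetic mirrors TensorFlow.

def l2_regularizer(scale):
    """Returns a function computing scale * sum(w^2) / 2,
    matching the tf.nn.l2_loss convention."""
    def penalty(weights):
        return scale * sum(w * w for w in weights) / 2.0
    return penalty

def apply_regularization(regularizer, variables):
    """AddN over the per-variable penalties, i.e. a plain sum."""
    return sum(regularizer(v) for v in variables)

regularizer = l2_regularizer(scale=0.1)
reg_variables = [[1.0, 2.0], [3.0]]  # two toy "weight" variables
reg_term = apply_regularization(regularizer, reg_variables)
print(reg_term)  # 0.1 * ((1 + 4) / 2 + 9 / 2) = 0.7
```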


I'll provide a simple correct answer since I didn't find one. You need two simple steps; the rest is done by TensorFlow magic:

  1. Add regularizers when creating variables or layers:

    tf.layers.dense(x, kernel_regularizer=tf.contrib.layers.l2_regularizer(0.001))
    # or
    tf.get_variable('a', regularizer=tf.contrib.layers.l2_regularizer(0.001))
  2. Add the regularization term when defining loss:

    loss = ordinary_loss + tf.losses.get_regularization_loss()
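To see the two steps end to end with concrete numbers, here is a toy plain-Python version (names are illustrative, not TensorFlow API; only the structure mirrors a dense layer with a kernel_regularizer):

```python
# Toy end-to-end sketch: a "dense layer" whose kernel penalty is
# collected automatically, then folded into the loss in one call.
# Names are illustrative, not TensorFlow API.

_collected = []  # stands in for the graph-level regularization collection

def l2_regularizer(scale):
    # scale * sum(w^2) / 2, the tf.nn.l2_loss convention
    return lambda w: scale * sum(wi * wi for wi in w) / 2.0

def dense(x, kernel, kernel_regularizer=None):
    """One-unit dense layer: dot(x, kernel); registers the kernel penalty."""
    if kernel_regularizer is not None:
        _collected.append(kernel_regularizer(kernel))
    return sum(xi * ki for xi, ki in zip(x, kernel))

def get_regularization_loss():
    """Mirrors the role of tf.losses.get_regularization_loss: sum it all."""
    return sum(_collected)

x = [1.0, 2.0]
y = dense(x, kernel=[0.5, -0.5], kernel_regularizer=l2_regularizer(0.001))
ordinary_loss = (y - 1.0) ** 2          # some data loss, e.g. squared error
loss = ordinary_loss + get_regularization_loss()
print(loss)  # 2.25 + 0.001 * 0.5 / 2 = 2.25025
```

The layer registers its penalty as a side effect of creation, so the loss line at the end never needs to know which layers were regularized.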