
Reproducible results in Tensorflow with tf.set_random_seed


In TensorFlow, a random operation relies on two different seeds: a global seed, set by tf.set_random_seed, and an operation seed, provided as an argument to the operation. You will find more details on how they relate in the docs.
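For instance, the two mechanisms can be used together; a minimal sketch (the seed values here are arbitrary):

import tensorflow as tf

tf.set_random_seed(1234)            # global (graph-level) seed
a = tf.random_uniform(())           # op seed is picked by the system
b = tf.random_uniform((), seed=42)  # explicit operation seed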

Each random op has its own seed because each random op maintains its own internal state for pseudo-random number generation. The reason each random generator maintains its own state is robustness to change: if they shared the same state, then adding a new random generator somewhere in your graph would change the values produced by all the other generators, defeating the purpose of using a seed.
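A minimal sketch of this independence, assuming TF 1.x semantics: two ops given the same operation seed each draw from their own state, so they produce the same value without interfering with each other.

import tensorflow as tf

gen1 = tf.random_uniform((), seed=1)
gen2 = tf.random_uniform((), seed=1)

with tf.Session() as sess:
  # Each op advances its own state, so identically seeded
  # ops yield identical values independently of each other.
  print(sess.run([gen1, gen2]))  # two equal values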

Now, why do we have this dual system of global and per-op seeds? Well, actually the global seed is not necessary. It is there for convenience: it lets you set all random op seeds to different, deterministic (if unknown) values at once, without having to go through them all exhaustively.

Now when a global seed is set but not the op seed, according to the docs,

The system deterministically picks an operation seed in conjunction with the graph-level seed so that it gets a unique random sequence.

To be more precise, the seed that is provided is the id of the last operation created in the current graph. Consequently, globally seeded random operations are extremely sensitive to changes in the graph, in particular to ops created before them.

For example,

import tensorflow as tf

tf.set_random_seed(1234)
generate = tf.random_uniform(())

with tf.Session() as sess:
  print(generate.eval())  # 0.96046877

Now if we create a node before, the result changes:

import tensorflow as tf

tf.set_random_seed(1234)
tf.zeros(())  # new op added before
generate = tf.random_uniform(())

with tf.Session() as sess:
  print(generate.eval())  # 0.29252338

If a node is created after it, however, the op seed is not affected:

import tensorflow as tf

tf.set_random_seed(1234)
generate = tf.random_uniform(())
tf.zeros(())  # new op added after

with tf.Session() as sess:
  print(generate.eval())  # 0.96046877

Obviously, as in your case, if you generate several operations, they will have different seeds:

import tensorflow as tf

tf.set_random_seed(1234)
gen1 = tf.random_uniform(())
gen2 = tf.random_uniform(())

with tf.Session() as sess:
  print(gen1.eval())  # 0.96046877
  print(gen2.eval())  # 0.85591054

As a curiosity, and to validate the fact that the seed is simply the last used id in the graph, you could align the seed of gen2 with that of gen1:

import tensorflow as tf

tf.set_random_seed(1234)
gen1 = tf.random_uniform(())
# 4 operations seem to be created after the seed has been picked
seed = tf.get_default_graph()._last_id - 4
gen2 = tf.random_uniform((), seed=seed)

with tf.Session() as sess:
  print(gen1.eval())  # 0.96046877
  print(gen2.eval())  # 0.96046877

Obviously though, this should not pass code review.
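If you need several reproducible random ops, a more robust pattern is to pass an explicit operation seed to each of them yourself: explicitly seeded ops are combined with the graph-level seed and are not affected by graph construction order. A minimal sketch (the op seed values 1 and 2 are arbitrary):

import tensorflow as tf

tf.set_random_seed(1234)
# Explicit op seeds make these values independent of how many
# ops were created earlier in the graph.
gen1 = tf.random_uniform((), seed=1)
gen2 = tf.random_uniform((), seed=2)

with tf.Session() as sess:
  print(gen1.eval())
  print(gen2.eval())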


Late to the party, but the random number generator has been overhauled (see https://github.com/tensorflow/community/pull/38 for a summary of the process), and the tf.random.experimental.Generator class now provides the desired functionality.

From TF 1.14 onwards (incl. TF 2.0), you can seed the generator and obtain the exact same random number regardless of session, platform, or even architecture.

import tensorflow as tf

rng = tf.random.experimental.Generator.from_seed(1234)
rng.uniform((), 5, 10, tf.int64)  # draw a random scalar (0-D tensor) between 5 and 10

See the documentation for details.

To address your particular question (I'm using TF 2.0):

for i in range(3):
  b = tf.random.uniform((10,), 0, 10, seed=1234)
  print(b)

gives

tf.Tensor([2.7339518  9.339194   5.2865124  8.912003   8.402512   0.53086996 4.385383   4.8005686  2.2077608  2.1795273 ], shape=(10,), dtype=float32)
tf.Tensor([9.668942   3.4503186  7.4577675  2.9200733  1.8064988  6.1576104 3.9958012  1.889689   3.8289428  0.36031008], shape=(10,), dtype=float32)
tf.Tensor([8.019657  4.895439  5.90925   2.418766  4.524292  7.901089  9.702316 5.1606855 9.744821  2.4418736], shape=(10,), dtype=float32)

while this

for i in range(3):
  rng = tf.random.experimental.Generator.from_seed(1234)
  b = rng.uniform((10,), 0, 10)
  print(b)

gives what you want:

tf.Tensor([3.581475  1.132276  5.6670904 6.712369  3.2565057 1.7095459 8.468903 6.2697005 1.0973608 2.7732193], shape=(10,), dtype=float32)
tf.Tensor([3.581475  1.132276  5.6670904 6.712369  3.2565057 1.7095459 8.468903 6.2697005 1.0973608 2.7732193], shape=(10,), dtype=float32)
tf.Tensor([3.581475  1.132276  5.6670904 6.712369  3.2565057 1.7095459 8.468903 6.2697005 1.0973608 2.7732193], shape=(10,), dtype=float32)