<p>The current version of your code will randomly generate a new value for <code>rand_var_1</code> and <code>rand_var_2</code> on each call to <code>sess.run()</code> (although, since you set the seed to 0, they will have the same value within a single call to <code>sess.run()</code>).</p>
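<p>A minimal sketch of that behavior, using the same TensorFlow 1.x-era API as the code below: evaluating a bare random op twice in the same session draws a fresh value each time, because nothing caches the result between calls to <code>sess.run()</code>.</p>
<pre><code>import tensorflow as tf

# A plain random op, NOT wrapped in tf.Variable: every sess.run()
# re-executes the op and draws new values from the random stream.
rand_t = tf.random_uniform([5], 0, 10, dtype=tf.int32, seed=0)

with tf.Session() as sess:
    v1 = sess.run(rand_t)  # first draw
    v2 = sess.run(rand_t)  # second draw; in general a different vector
    print(v1, v2)
</code></pre>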
<p>If you want to retain the value of a randomly generated tensor for later use, you should assign it to a <a href="https://www.tensorflow.org/versions/master/api_docs/python/state_ops.html#Variable" rel="noreferrer"><code>tf.Variable</code></a>:</p>
<pre><code>rand_var_1 = tf.Variable(tf.random_uniform([5], 0, 10, dtype=tf.int32, seed=0))
rand_var_2 = tf.Variable(tf.random_uniform([5], 0, 10, dtype=tf.int32, seed=0))
# Or, alternatively:
rand_var_1 = tf.Variable(tf.random_uniform([5], 0, 10, dtype=tf.int32, seed=0))
rand_var_2 = tf.Variable(rand_var_1.initialized_value())
# Or, alternatively:
rand_t = tf.random_uniform([5], 0, 10, dtype=tf.int32, seed=0)
rand_var_1 = tf.Variable(rand_t)
rand_var_2 = tf.Variable(rand_t)
</code></pre>
<p>…and then <a href="https://www.tensorflow.org/versions/master/api_docs/python/state_ops.html#initialize_all_variables" rel="noreferrer"><code>tf.initialize_all_variables()</code></a> will have the desired effect:</p>
<pre><code># Op 1
z1 = tf.add(rand_var_1, rand_var_2)
# Op 2
z2 = tf.add(rand_var_1, rand_var_2)
init = tf.initialize_all_variables()
with tf.Session() as sess:
sess.run(init) # Random numbers generated here and cached.
z1_op = sess.run(z1) # Reuses cached values for rand_var_1, rand_var_2.
z2_op = sess.run(z2) # Reuses cached values for rand_var_1, rand_var_2.
print(z1_op, z2_op) # Will print two identical vectors.
</code></pre>