How do I pass an intermediate tensor to a neural-network layer inside the body of a while loop?

Published 2024-05-13 23:04:22


I want to pass an intermediate tensor result into a fully connected neural network inside the body of a `tf.while_loop`.

My problem is the instantiation of the neural network. My first attempt was to create a placeholder for the NN's input and then feed it through a session instance. That does not work, because `feed_dict` does not accept tensors, which is fair enough given the dataflow nature of the graph.
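To make the first failure concrete, here is a minimal sketch (hypothetical names, written against the v1 API via `tf.compat.v1` so it runs on modern installs): `Session.run` only accepts concrete values such as numpy arrays or Python scalars in `feed_dict`, never a symbolic `tf.Tensor`.

```python
# Minimal repro of the first attempt's failure: feeding a symbolic
# tf.Tensor through feed_dict is rejected with a TypeError.
import tensorflow.compat.v1 as tf  # assumes TF2 with the v1 compat API
tf.disable_v2_behavior()

x = tf.placeholder(tf.float32, [2])
y = x * 2.0
t = tf.constant([1.0, 2.0])  # a graph tensor, not a concrete value

with tf.Session() as sess:
    try:
        sess.run(y, feed_dict={x: t})  # the feed value must not be a Tensor
    except TypeError as err:
        print("feed rejected:", type(err).__name__)
```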

On my second attempt, I instantiated the NN inside the loop body and passed the intermediate tensor to it directly. However, when I do that, the following stack trace is shown:

Caused by op u'net/fc1_W/read', defined at:
  File "./main.py", line 176, in <module>
    offline_indexing(sys.argv[1])
  File "./main.py", line 128, in offline_indexing
    test.run(a, f)
  File "/home/lsv/Desktop/gitlab/Gencoding/test.py", line 86, in run
    print sess.run(graph_embed(), feed_dict={adj: x, features: y})
  File "/home/lsv/Desktop/gitlab/Gencoding/test.py", line 73, in graph_embed
    final_mus = tf.while_loop(cond, body, [mus, features, adj, 0])[0]
  File "/home/lsv/.local/lib/python2.7/site-packages/tensorflow/python/ops/control_flow_ops.py", line 3291, in while_loop
    return_same_structure)
  File "/home/lsv/.local/lib/python2.7/site-packages/tensorflow/python/ops/control_flow_ops.py", line 3004, in BuildLoop
    pred, body, original_loop_vars, loop_vars, shape_invariants)
  File "/home/lsv/.local/lib/python2.7/site-packages/tensorflow/python/ops/control_flow_ops.py", line 2939, in _BuildLoop
    body_result = body(*packed_vars_for_body)
  File "/home/lsv/Desktop/gitlab/Gencoding/test.py", line 63, in body
    h2 = fc_NN(mu_neigh_sum)
  File "/home/lsv/Desktop/gitlab/Gencoding/test.py", line 53, in fc_NN
    fc1 = fc_layer(ac2, 64, "fc1")
  File "/home/lsv/Desktop/gitlab/Gencoding/test.py", line 43, in fc_layer
    W = tf.get_variable(name+'_W', dtype=tf.float32, shape=[embedding_size, embedding_size], initializer=initer)
  File "/home/lsv/.local/lib/python2.7/site-packages/tensorflow/python/ops/variable_scope.py", line 1487, in get_variable
    aggregation=aggregation)
  File "/home/lsv/.local/lib/python2.7/site-packages/tensorflow/python/ops/variable_scope.py", line 1237, in get_variable
    aggregation=aggregation)
  File "/home/lsv/.local/lib/python2.7/site-packages/tensorflow/python/ops/variable_scope.py", line 540, in get_variable
    aggregation=aggregation)
  File "/home/lsv/.local/lib/python2.7/site-packages/tensorflow/python/ops/variable_scope.py", line 492, in _true_getter
    aggregation=aggregation)
  File "/home/lsv/.local/lib/python2.7/site-packages/tensorflow/python/ops/variable_scope.py", line 922, in _get_single_variable
    aggregation=aggregation)
  File "/home/lsv/.local/lib/python2.7/site-packages/tensorflow/python/ops/variables.py", line 183, in __call__
    return cls._variable_v1_call(*args, **kwargs)
  File "/home/lsv/.local/lib/python2.7/site-packages/tensorflow/python/ops/variables.py", line 146, in _variable_v1_call
    aggregation=aggregation)

FailedPreconditionError (see above for traceback): Attempting to use uninitialized value net/fc1_W
     [[node net/fc1_W/read (defined at /home/lsv/Desktop/gitlab/Gencoding/test.py:43)  = Identity[T=DT_FLOAT, _device="/job:localhost/replica:0/task:0/device:CPU:0"](net/fc1_W)]]

Note that using a dense layer instead leads to the same error. Here is my code:

def fc_layer(bottom, n_weight, name):
    assert len(bottom.get_shape()) == 2
    n_prev_weight = int(bottom.get_shape()[1])
    initer = tf.truncated_normal_initializer(stddev=0.01)
    # shapes derived from the input/output widths (the original used
    # embedding_size for both, leaving n_prev_weight and n_weight unused)
    W = tf.get_variable(name + '_W', dtype=tf.float32, shape=[n_prev_weight, n_weight], initializer=initer)
    b = tf.get_variable(name + '_b', dtype=tf.float32, shape=[n_weight], initializer=tf.zeros_initializer)
    fc = tf.nn.bias_add(tf.matmul(bottom, W), b)
    return fc

def fc_NN(x):
    fc2 = fc_layer(x, 64, "fc2")
    ac2 = tf.nn.relu(fc2)
    fc1 = fc_layer(ac2, 64, "fc1")
    return fc1

# x is the placeholder fed in the first attempt inside body() below
x = tf.placeholder(tf.float32, shape=[None, 64])
fcnn = fc_NN(x)

def cond(m, f, a, i):
    return tf.less(i, T)

def body(m, f, a, i):
    mu_neigh_sum = tf.tensordot(a, m, 1)
    h1 = tf.matmul(f, W1)
    # First failing attempt: feed the intermediate tensor to the placeholder x
    h2 = tf.Session().run(fcnn, {x: mu_neigh_sum})
    # Second failing attempt: call fc_NN directly on the intermediate tensor
    h2bis = fc_NN(mu_neigh_sum)

    return tf.tanh(h1 + h2), f, a, i+1

final_mus = tf.while_loop(cond, body, [mus, features, adj, 0])[0]
