I applied dropout to the first layer as follows:
out = layers.fully_connected(X, num_outputs=hidden[0], activation_fn=None)
out1 = tf.nn.relu(out)
drop_out = tf.nn.dropout(out1, keep_prob)  # keep_prob = 0.5
out2 = layers.fully_connected(drop_out, num_outputs=hidden[1], activation_fn=None)
out2 = tf.nn.relu(out2)
out3 = layers.fully_connected(out2, num_outputs=num_actions, activation_fn=None)
However, I can't see the dropout node in the TensorBoard computation graph. Do I need to explicitly add a summary command to include the dropout layer?
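One thing that can help, as a hedged sketch rather than a definitive answer: the dropout op does exist in the graph, but without an explicit name it may be hard to spot among auto-named nodes. Wrapping it in a `tf.name_scope` groups it into a clearly labeled node in TensorBoard's graph view. The sketch below assumes the `tf.compat.v1` API and uses `tf.layers.dense` in place of `layers.fully_connected` (which lived in `tf.contrib` and is not available in TF 2.x); the input size `10`, `hidden` sizes, and `num_actions` are hypothetical placeholders, not values from the question.

```python
import tensorflow.compat.v1 as tf

tf.disable_v2_behavior()

# Hypothetical sizes for illustration only
num_actions = 4
hidden = [64, 32]

X = tf.placeholder(tf.float32, [None, 10], name="X")
keep_prob = tf.placeholder_with_default(1.0, shape=(), name="keep_prob")

# dense(..., activation=tf.nn.relu) fuses the fully_connected + relu pair
out1 = tf.layers.dense(X, hidden[0], activation=tf.nn.relu, name="fc1")

# A name scope makes the dropout op show up as a single labeled
# node ("dropout1") in the TensorBoard graph view.
with tf.name_scope("dropout1"):
    drop_out = tf.nn.dropout(out1, keep_prob=keep_prob)

out2 = tf.layers.dense(drop_out, hidden[1], activation=tf.nn.relu, name="fc2")
out3 = tf.layers.dense(out2, num_actions, name="q_values")

# Verify the dropout node is present in the graph definition,
# i.e. it would be written out by tf.summary.FileWriter(logdir, graph).
dropout_ops = [n.name for n in
               tf.get_default_graph().as_graph_def().node
               if n.name.startswith("dropout1")]
```

No extra summary op is needed for the node itself: writing the graph once with `tf.summary.FileWriter(logdir, tf.get_default_graph())` is enough for it to appear in TensorBoard's Graphs tab.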