Dropout not visible in the TensorBoard graph

Published 2024-05-23 17:10:58


I apply dropout to the first layer as follows:

import tensorflow as tf
from tensorflow.contrib import layers  # layers.fully_connected lives in tf.contrib.layers (TF 1.x)

out = layers.fully_connected(X, num_outputs=hidden[0], activation_fn=None)
out1 = tf.nn.relu(out)
drop_out = tf.nn.dropout(out1, keep_prob)  # keep_prob = 0.5

out2 = layers.fully_connected(drop_out, num_outputs=hidden[1], activation_fn=None)
out2 = tf.nn.relu(out2)

out3 = layers.fully_connected(out2, num_outputs=num_actions, activation_fn=None)  # was out4, which is undefined

However, I cannot see the dropout operation in the TensorBoard computation graph. Do I need to explicitly write a summary command to include the dropout layer?
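No extra summary op is needed for a node to appear in the graph tab; the graph written by `FileWriter` already contains the dropout ops, but they are easy to miss because they are unnamed and get folded into surrounding scopes. Wrapping the dropout in a `tf.name_scope` makes it show up as its own labeled box. Below is a minimal, self-contained sketch of this idea using `tf.compat.v1` (the layer sizes, placeholder shape, and the log directory are illustrative assumptions, and `tf.layers.dense` stands in for `tf.contrib.layers.fully_connected`, which is unavailable in modern TensorFlow builds):

```python
import tempfile
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

# Illustrative shapes -- the original post's hidden[] sizes are not given.
X = tf.placeholder(tf.float32, [None, 4], name="X")
keep_prob = tf.placeholder_with_default(0.5, shape=(), name="keep_prob")

out1 = tf.nn.relu(tf.layers.dense(X, 8))

# Giving dropout its own name scope makes it a distinct,
# clearly labeled node group in the TensorBoard graph tab.
with tf.name_scope("dropout"):
    drop_out = tf.nn.dropout(out1, keep_prob=keep_prob)

out2 = tf.nn.relu(tf.layers.dense(drop_out, 8))

# Writing the graph (no summary ops required) is enough for
# TensorBoard's graph tab to display it.
logdir = tempfile.mkdtemp()
writer = tf.summary.FileWriter(logdir, tf.get_default_graph())
writer.close()
```

After running this, launching `tensorboard --logdir <logdir>` should show a node group named `dropout` between the two dense layers.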

Tensorboard graph for the above model


Tags: none, layers, tf, nn, out, outputs, activation, num