I got a warning today that says:
softmax_cross_entropy_with_logits (from tensorflow.python.ops.nn_ops) is deprecated and will be removed in a future version. Instructions for updating:
Future major versions of TensorFlow will allow gradients to flow into the labels input on backprop by default.
So, as it suggested, I checked tf.nn.softmax_cross_entropy_with_logits_v2 and found the following:
Backpropagation will happen into both logits and labels. To disallow backpropagation into labels, pass label tensors through tf.stop_gradient before feeding it to this function.
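To see why the labels even *have* a gradient, here is a minimal NumPy sketch (not TensorFlow itself, just the underlying math): the cross-entropy loss L(y, z) = -Σ y_i · log softmax(z)_i depends linearly on the labels y, so ∂L/∂y_i = -log softmax(z)_i, which is generally nonzero. If y were a trainable tensor (soft labels, e.g. in knowledge distillation), backprop would therefore update it; tf.stop_gradient is what blocks that path.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax
    e = np.exp(z - z.max())
    return e / e.sum()

z = np.array([2.0, 1.0, 0.1])   # logits
y = np.array([1.0, 0.0, 0.0])   # one-hot labels

p = softmax(z)
loss = -np.sum(y * np.log(p))

# Gradient of the loss w.r.t. the *labels*:
# dL/dy_i = -log p_i, nonzero for every class.
grad_wrt_labels = -np.log(p)
print(grad_wrt_labels)
```

Since every p_i < 1, each component of this gradient is strictly positive, so a trainable labels tensor really would receive updates unless you wrap it in tf.stop_gradient first.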
I don't quite understand what it means for backpropagation to happen into the labels. Aren't the labels constants? At the link below, someone asked a similar question and got a good answer:
https://stats.stackexchange.com/questions/327348/how-is-softmax-cross-entropy-with-logits-different-from-softmax-cross-entropy-wi