If the loss comes from sparse_softmax_cross_entropy_with_logits, how do I scale the gradients? For example, I tried dividing by 128 as below, but got an error:

new_gradients = [(grad/128, var) for (grad, var) in gradients]
TypeError: unsupported operand type(s) for /: 'IndexedSlices' and 'int'

The code I am using is:
loss = tf.nn.sparse_softmax_cross_entropy_with_logits(logits=logits, labels=labels)
gradients = opt.compute_gradients(loss)
new_gradients = [(grad/128, var) for (grad, var) in gradients]
train_step = opt.apply_gradients(new_gradients)

Posted on 2018-02-03 00:43:51
I found a solution to this problem as follows:

new_gradients = [(grad/128, var) for (grad, var) in gradients]

should be

new_gradients = [(tf.div(grad, 128), var) for (grad, var) in gradients]

https://stackoverflow.com/questions/48350151
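For context, the error arises because some gradients come back as tf.IndexedSlices (a sparse gradient format, typically produced for embedding lookups) rather than plain tensors, and IndexedSlices does not support Python's / operator with an int. Below is a minimal TensorFlow-free sketch of the same pattern, using a hypothetical IndexedSlices-like stand-in class to illustrate how a scaling helper can handle both dense and sparse gradients by dividing only the .values field of the sparse case:

```python
class IndexedSlices:
    """Hypothetical stand-in for tf.IndexedSlices: a sparse gradient
    holding values for a subset of rows (indices) of a variable."""
    def __init__(self, values, indices):
        self.values = values      # row gradients, as a list of floats here
        self.indices = indices    # which variable rows they apply to


def scale_gradient(grad, scale):
    """Scale a gradient that may be dense (a plain list of floats here)
    or sparse (IndexedSlices). For the sparse case, only the values are
    divided; the indices are left unchanged."""
    if isinstance(grad, IndexedSlices):
        return IndexedSlices([v / scale for v in grad.values], grad.indices)
    return [g / scale for g in grad]


# Dense and sparse gradients paired with (dummy) variable names,
# mimicking the (grad, var) tuples returned by compute_gradients.
gradients = [
    ([256.0, 128.0], "w_dense"),
    (IndexedSlices([128.0, 64.0], [0, 3]), "w_embedding"),
]

new_gradients = [(scale_gradient(g, 128), var) for (g, var) in gradients]
print(new_gradients[0][0])         # [2.0, 1.0]
print(new_gradients[1][0].values)  # [1.0, 0.5]
```

The accepted answer's tf.div works because TensorFlow converts the IndexedSlices argument to a dense tensor before dividing; a branch like the one above keeps the gradient sparse instead, which can matter for large embedding tables.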