Some questions about TensorFlow.
import numpy as np
import tensorflow as tf
from tensorflow import keras
x_train = [1,2,3]
y_train = [1,2,3]
W = tf.Variable(tf.random.normal([1]), name = 'weight')
b = tf.Variable(tf.random.normal([1]), name = 'bias')
hypothesis = W*x_train+b
optimizer = tf.optimizers.SGD (learning_rate=0.01)
train = tf.keras.optimizers.Adam().minimize(cost, var_list=[W, b])

When I run the last line of this code, the following error appears:
---------------------------------------------------------------------------
ValueError Traceback (most recent call last)
<ipython-input-52-cd6e22f66d09> in <module>()
----> 1 train = tf.keras.optimizers.Adam().minimize(cost, var_list=[W, b])
1 frames
/usr/local/lib/python3.6/dist-packages/tensorflow/python/keras/optimizer_v2/optimizer_v2.py in _compute_gradients(self, loss, var_list, grad_loss, tape)
530 # TODO(josh11b): Test that we handle weight decay in a reasonable way.
531 if not callable(loss) and tape is None:
--> 532 raise ValueError("`tape` is required when a `Tensor` loss is passed.")
533 tape = tape if tape is not None else backprop.GradientTape()
534
ValueError: `tape` is required when a `Tensor` loss is passed.

I know this is related to TensorFlow version 2, but I don't want to rewrite the code for version 1. I need a solution that works with TensorFlow 2. Thanks.
Posted on 2021-01-27 13:22:42
Since you did not provide a cost function, I have added one. Here is the code:
import numpy as np
import tensorflow as tf
from tensorflow import keras
x_train = [1,2,3]
y_train = [1,2,3]
W = tf.Variable(tf.random.normal([1]), name = 'weight')
b = tf.Variable(tf.random.normal([1]), name = 'bias')
hypothesis = W*x_train+b
@tf.function
def cost():
    y_model = W*x_train + b
    error = tf.reduce_mean(tf.square(y_train - y_model))
    return error
optimizer = tf.optimizers.SGD(learning_rate=0.01)
# Passing a callable (not a Tensor) avoids the "`tape` is required" error.
train = tf.keras.optimizers.Adam().minimize(cost, var_list=[W, b])
tf.print(W)
tf.print(b)

https://stackoverflow.com/questions/65913108
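The error message also points at a second fix: instead of wrapping the cost in a callable, you can compute the loss Tensor inside a `tf.GradientTape` and apply the gradients yourself. A minimal sketch of that training loop, reusing the variable names from the question (the loop length and learning rate are arbitrary choices, not from the original post):

```python
import tensorflow as tf

# Toy data from the question: a perfect y = x line.
x_train = tf.constant([1.0, 2.0, 3.0])
y_train = tf.constant([1.0, 2.0, 3.0])
W = tf.Variable(tf.random.normal([1]), name='weight')
b = tf.Variable(tf.random.normal([1]), name='bias')
optimizer = tf.keras.optimizers.SGD(learning_rate=0.01)

losses = []
for step in range(200):
    with tf.GradientTape() as tape:
        y_model = W * x_train + b                       # forward pass
        cost = tf.reduce_mean(tf.square(y_train - y_model))
    grads = tape.gradient(cost, [W, b])                 # dcost/dW, dcost/db
    optimizer.apply_gradients(zip(grads, [W, b]))       # one SGD step
    losses.append(float(cost))

tf.print(W)  # should move toward 1
tf.print(b)  # should move toward 0
```

This is what `minimize` does under the hood when it is given a callable; writing the tape out explicitly is useful when you want to inspect or clip the gradients before applying them.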