I have a question about tf.keras and tf functions in TF 2.0. If I have a model like this:
inputdata = keras.Input(shape=(2048, 1))
x = layers.Conv1D(16, 3, activation='relu')(inputdata)
x = layers.Conv1D(32, 3, activation='relu')(x)
x = layers.Conv1D(64, 3, activation='relu')(x)
and I want to add a custom function like this, a 1D SubPixel layer:
def SubPixel1D(I, r):
    with tf.name_scope('subpixel'):
        X = tf.transpose(I, [2,1,0])                 # (r, w, b)
        X = tf.batch_to_space_nd(X, [r], [[0,0]])    # (1, r*w, b)
        X = tf.transpose(X, [2,1,0])
        return X
Can I include this layer in Keras without any problems? Since TensorFlow 2.0 is much simpler than earlier TensorFlow versions, I'm not sure whether this mixes up the backend and sessions.
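For intuition, the transpose/batch_to_space trick is just an interleaving of the r channels along the width axis, which in row-major layout is a plain reshape. Here is a small NumPy sketch (the helper name `subpixel_1d_np` is mine, not from the original code):

```python
import numpy as np

def subpixel_1d_np(x, r):
    """NumPy reference for the 1D sub-pixel shuffle:
    (batch, width, r) -> (batch, width*r, 1), interleaving the
    r channels along the width axis."""
    b, w, c = x.shape
    assert c == r, "channel count must equal the upscaling factor r"
    # row-major reshape places the r channel values of each width
    # position at consecutive output positions
    return x.reshape(b, w * r, 1)

x = np.array([[[1, 10], [2, 20], [3, 30]]])   # shape (1, 3, 2)
print(subpixel_1d_np(x, 2)[0, :, 0])          # -> [ 1 10  2 20  3 30]
```

This is only a shape-level reference, not a drop-in Keras layer; it shows the interleaving the TensorFlow version performs.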
inputdata = keras.Input(shape=(2048, 1))
x = layers.Conv1D(16, 3, activation='relu')(inputdata)
x = layers.Conv1D(32, 3, activation='relu')(x)
x = SubPixel1D(x,2)
x = layers.Conv1D(64, 3, activation='relu')(x)
After that, will compiling and fitting the model work? TensorFlow and Keras are imported like this:
import tensorflow as tf
from tensorflow import keras
It is similar to a custom loss function in Keras. If I define a custom loss function like this:
def my_loss(y_true, y_pred):
    # compute L2 loss, equivalent to Keras mean squared error
    sqrt_l2_loss = tf.reduce_mean((y_pred - y_true)**2, axis=[1, 2])
    avg_sqrt_l2_loss = tf.reduce_mean(sqrt_l2_loss, axis=0)
    return avg_sqrt_l2_loss
using tf operations and functions, can I pass this function to Keras as usual? Can I use it as a Keras loss?
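As a sanity check on what the loss computes, the same reduction can be written in NumPy (the helper name `my_loss_np` is mine): a per-sample mean of squared errors over the width and channel axes, then a mean over the batch.

```python
import numpy as np

def my_loss_np(y_true, y_pred):
    """NumPy equivalent of the custom loss: mean squared error per
    sample over (width, channel) axes, then averaged over the batch."""
    sqrt_l2 = np.mean((y_pred - y_true) ** 2, axis=(1, 2))
    return np.mean(sqrt_l2, axis=0)

y_true = np.zeros((2, 4, 1))
y_pred = np.full((2, 4, 1), 2.0)
print(my_loss_np(y_true, y_pred))   # -> 4.0
```

Since every value differs by 2.0, each squared error is 4.0 and so is the final mean, matching what `tf.reduce_mean` would produce.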
Posted on 2020-03-03 18:53:32
Just subclass tf.keras.layers.Layer and you're good to go. Here is a good reference: https://www.tensorflow.org/guide/keras/custom_layers_and_models. Your layer should look like this:
class SubPixel1D(tf.keras.layers.Layer):
    def __init__(self, r):
        super(SubPixel1D, self).__init__()
        self.r = r

    def call(self, inputs):
        with tf.name_scope('subpixel'):
            X = tf.transpose(inputs, [2,1,0])              # (r, w, b)
            # tf.batch_to_space replaces the TF1-only tf.batch_to_space_nd
            X = tf.batch_to_space(X, [self.r], [[0,0]])    # (1, r*w, b)
            X = tf.transpose(X, [2,1,0])
            return X
Then call it when defining the model:
inputdata = keras.Input(shape=(2048, 1))
x = layers.Conv1D(16, 3, activation='relu')(inputdata)
x = layers.Conv1D(32, 3, activation='relu')(x)
x = SubPixel1D(2)(x)
x = layers.Conv1D(64, 3, activation='relu')(x)
I'm not sure how tf.name_scope will behave, but I don't see any immediate problems.
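Putting the pieces together, a minimal end-to-end sketch (assuming TF 2.x) that builds and compiles the model with both the subclassed layer and the custom loss might look like this:

```python
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers

class SubPixel1D(tf.keras.layers.Layer):
    """1D sub-pixel shuffle: (b, w, r) -> (b, w*r, 1) per channel group."""
    def __init__(self, r):
        super(SubPixel1D, self).__init__()
        self.r = r

    def call(self, inputs):
        X = tf.transpose(inputs, [2, 1, 0])
        X = tf.batch_to_space(X, [self.r], [[0, 0]])
        return tf.transpose(X, [2, 1, 0])

def my_loss(y_true, y_pred):
    # per-sample MSE over (width, channel), averaged over the batch
    sqrt_l2_loss = tf.reduce_mean((y_pred - y_true) ** 2, axis=[1, 2])
    return tf.reduce_mean(sqrt_l2_loss, axis=0)

inputdata = keras.Input(shape=(2048, 1))
x = layers.Conv1D(16, 3, activation='relu')(inputdata)
x = layers.Conv1D(32, 3, activation='relu')(x)   # (None, 2044, 32)
x = SubPixel1D(2)(x)                             # (None, 4088, 16)
x = layers.Conv1D(64, 3, activation='relu')(x)   # (None, 4086, 64)

model = keras.Model(inputdata, x)
model.compile(optimizer='adam', loss=my_loss)
```

Both a subclassed layer and a plain two-argument function built from tf ops are accepted by Keras, so `model.fit` should work as usual once targets with a matching output shape are supplied.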
https://stackoverflow.com/questions/60505369