I want to train features of shape (10151, 1285) against labels of shape (10151, 257), and I want to use Way2, because it gives me a handle on "feature_input" that I can use in the cost function. But it fails with this error:

ValueError: Input 0 of layer batch_normalization_6 is incompatible with the layer: expected ndim=3, found ndim=2. Full shape received: (None, 257).

I would like to know why.
Way1:
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, BatchNormalization, Activation

model = Sequential()
model.add(Dense(257, input_dim=1285))
model.add(BatchNormalization())
model.add(Activation('sigmoid'))
model.compile(optimizer='adam', loss='mse', metrics=['mse'])
model.fit(feature, label)
model.save("./model.hdf5")
Way2:
from tensorflow.keras.layers import Input, Dense, BatchNormalization, Activation
from tensorflow.keras.models import Model

feature_input = Input(shape=(None, 1285))
dense = Dense(257)(feature_input)
norm = BatchNormalization()(dense)
out = Activation('sigmoid')(norm)
model = Model(feature_input, out)
model.compile(optimizer='adam', loss='mse', metrics=['mse'])
model.fit(feature, label)
model.save("./model.hdf5")

Posted on 2021-02-02 15:35:40
If you define the input shape as (None, 1285), the model treats each sample as three-dimensional data. I suspect the None in your input shape was meant to describe the batch size, but Keras adds the batch dimension automatically, so the compiled model ends up expecting 3-D input. Use an input shape of (1285,) instead.
<Summary of your model>
Model: "model"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
input_1 (InputLayer) [(None, None, 1285)] 0
_________________________________________________________________
dense (Dense) (None, None, 257) 330502
_________________________________________________________________
batch_normalization (BatchNo (None, None, 257) 1028
_________________________________________________________________
activation (Activation) (None, None, 257) 0
=================================================================
Total params: 331,530
Trainable params: 331,016
Non-trainable params: 514
_________________________________________________________________

https://stackoverflow.com/questions/66005072
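The fix above can be sketched end to end. This is a minimal example assuming TensorFlow 2.x Keras; the batch of 32 random samples is a stand-in for the real (10151, 1285) data. With shape=(1285,), Input describes one sample and Keras prepends the batch dimension itself, so BatchNormalization sees the expected 2-D (None, 257) tensor:

```python
import numpy as np
from tensorflow.keras.layers import Input, Dense, BatchNormalization, Activation
from tensorflow.keras.models import Model

feature_input = Input(shape=(1285,))   # per-sample shape, no batch axis
dense = Dense(257)(feature_input)
norm = BatchNormalization()(dense)     # now receives 2-D (None, 257) input
out = Activation('sigmoid')(norm)
model = Model(feature_input, out)
model.compile(optimizer='adam', loss='mse', metrics=['mse'])

# Stand-in data with the per-sample shapes from the question.
feature = np.random.rand(32, 1285).astype('float32')
label = np.random.rand(32, 257).astype('float32')
model.fit(feature, label, epochs=1, verbose=0)
print(model.output_shape)  # (None, 257)
```

Because the functional API keeps `feature_input` as a named tensor, it stays available for building a custom cost term (for example via a custom loss or `model.add_loss`), which was the stated reason for preferring Way2.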