I am trying to replicate a neural network. The architecture is an LSTM model. The first input is words hashed into binary vectors of size 2^18, which are embedded into a trainable 500-dimensional distributed representation using an embedding layer.
The number of words differs between batch elements. After embedding and dropout, I need to concatenate the words with a feature vector that has 24 features per word.
The problem is that after the first input, the two tensors have different numbers of dimensions: the embedded words have shape (None, None, 18, 500), while the features have shape (None, None, 24). The first dimension is the batch size, and the second is the number of words per batch.
How can I concatenate the embedded words with the feature vectors?
Here is my code:
import numpy as np
from tensorflow.keras.layers import (Input, Embedding, Dropout, Dense,
                                     Bidirectional, LSTM, concatenate)

inputs = Input(shape=(None, 18,), dtype=np.int16, name="Inp1")
embbed_input = Embedding(input_dim=1, output_dim=500, input_length=18)
aux = embbed_input(inputs)
aux = Dropout(rate=self.dropout_rate)(aux)
inputs_feat = Input(shape=(None, 24,), dtype=np.float32, name="Inp2")
aux = concatenate([aux, inputs_feat], axis=2) #ValueError here
aux = Dense(units=600, activation="relu")(aux)
aux = Dense(units=600, activation="relu")(aux)
aux = Bidirectional(LSTM(units=400, return_sequences=True))(aux)
aux = Dropout(rate=self.dropout_rate)(aux)
aux = Dense(units=600, activation="relu")(aux)
aux = Dense(units=600, activation="relu")(aux)
aux = Dense(units=29, activation="sigmoid")(aux)
ValueError: A `Concatenate` layer requires inputs with matching shapes except for the concat axis. Got inputs shapes: [(None, None, 18, 500), (None, None, 24)]

Posted on 2020-04-06 12:32:27
Since each step of the input feeds 18 indices into the embedding layer, the embedding layer's output needs to be reshaped so that its last two dimensions are flattened before concatenation.
# import Reshape layer
from tensorflow.keras.layers import Reshape
inputs = Input(shape=(None, 18,), dtype=np.int16, name="Inp1")
embbed_input = Embedding(input_dim=1, output_dim=500, input_length=18)
aux = embbed_input(inputs)
# Note: it is generally not a great idea to add dropout just after the embedding layer
aux = Dropout(rate=self.dropout_rate)(aux)
# before the concatenate layer, reshape to (None, None, 18*500);
# target_shape excludes the batch dimension, and only one -1 is allowed
aux = Reshape(target_shape=(-1, 18*500))(aux)
inputs_feat = Input(shape=(None, 24,), dtype=np.float32, name="Inp2")
aux = concatenate([aux, inputs_feat], axis=2) # axis=-1 (the default) would also work, since this is the last axis
aux = Dense(units=600, activation="relu")(aux)
aux = Dense(units=600, activation="relu")(aux)
aux = Bidirectional(LSTM(units=400, return_sequences=True))(aux)
aux = Dropout(rate=self.dropout_rate)(aux)
aux = Dense(units=600, activation="relu")(aux)
aux = Dense(units=600, activation="relu")(aux)
aux = Dense(units=29, activation="sigmoid")(aux)

Source: https://stackoverflow.com/questions/61058494
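The reshape-then-concatenate idea can be checked with plain NumPy arrays standing in for the Keras tensors (the batch size and word count here are arbitrary example values, since those dimensions are dynamic in the model):

```python
import numpy as np

batch, words = 4, 7  # arbitrary stand-ins for the dynamic None dimensions

# embedding output: 18 positions per word, each embedded into 500 dims
embedded = np.zeros((batch, words, 18, 500), dtype=np.float32)
# per-word feature vector with 24 features
features = np.zeros((batch, words, 24), dtype=np.float32)

# flatten the last two dimensions, mirroring Reshape(target_shape=(-1, 18*500))
flattened = embedded.reshape(batch, words, 18 * 500)

# both tensors are now rank 3, so they concatenate on the last axis
combined = np.concatenate([flattened, features], axis=-1)
print(combined.shape)  # (4, 7, 9024)
```

Once the ranks match, the only axis allowed to differ is the concat axis itself (9000 vs. 24), which is exactly what `Concatenate` requires.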