I am trying to implement a perplexity loss function for my LSTM language model. However, I get the following error:
InvalidArgumentError: logits and labels must have the same first dimension, got logits shape [32,3345] and labels shape [107040]
[[{{node loss_9/dense_10_loss/perplexity/SparseSoftmaxCrossEntropyWithLogits/SparseSoftmaxCrossEntropyWithLogits}}]]
Now, I think the way to solve this is to one-hot encode my logits, but I'm not sure how to do that — that is, I don't know how to access my logits, nor to what depth I should encode them.
My loss function looks like this:
import keras.losses
from keras import backend as K
def perplexity(y_true, y_pred):
    """
    The perplexity metric. Why isn't this part of Keras yet?!
    https://stackoverflow.com/questions/41881308/how-to-calculate-perplexity-of-rnn-in-tensorflow
    https://github.com/keras-team/keras/issues/8267
    """
    cross_entropy = K.sparse_categorical_crossentropy(y_true, y_pred)
    perplexity = K.exp(cross_entropy)
    return perplexity

I define my LSTM model as follows:
from keras.models import Sequential
from keras.layers import Embedding, LSTM, Dense

# define model
model = Sequential()
model.add(Embedding(vocab_size, 500, input_length=max_length-1))
model.add(LSTM(750))
model.add(Dense(vocab_size, activation='softmax'))
print(model.summary())
# compile network
model.compile(loss=perplexity, optimizer='adam', metrics=['accuracy'])
# fit network
model.fit(X, y, epochs=150, verbose=2)

Posted on 2020-06-01 18:17:22
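One hint is in the error itself: 32 × 3345 = 107040, which suggests the labels were passed one-hot encoded (and flattened), while sparse_categorical_crossentropy expects integer class indices. A minimal NumPy sketch of that mismatch, using hypothetical shapes matching the error:

```python
import numpy as np

# The shapes in the error are consistent with one-hot labels being
# flattened: 32 * 3345 = 107040. Hypothetical data for illustration:
batch, vocab = 32, 3345
y_onehot = np.eye(vocab)[np.random.randint(0, vocab, batch)]  # (32, 3345)
print(y_onehot.size)   # 107040 -- matches the labels shape in the error

# sparse_categorical_crossentropy expects integer class indices instead:
y_int = np.argmax(y_onehot, axis=-1)
print(y_int.shape)     # (32,)
```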
If you use sparse_categorical_crossentropy, the targets must be plain integer encodings:
import numpy as np
from keras import backend as K
from keras.models import Sequential
from keras.layers import Dense

def perplexity(y_true, y_pred):
    cross_entropy = K.sparse_categorical_crossentropy(y_true, y_pred)
    perplexity = K.exp(cross_entropy)
    return perplexity
vocab_size = 10
X = np.random.uniform(0,1, (1000,10))
y = np.random.randint(0,vocab_size, 1000)
model = Sequential()
model.add(Dense(64, activation='relu', input_dim=10))
model.add(Dense(vocab_size, activation='softmax'))
# compile network
model.compile(loss=perplexity, optimizer='adam', metrics=['accuracy'])
# fit network
model.fit(X, y, epochs=10, verbose=2)

If you have one-hot encoded targets, note that you should change K.sparse_categorical_crossentropy to K.categorical_crossentropy:
import numpy as np
import pandas as pd
from keras import backend as K
from keras.models import Sequential
from keras.layers import Dense

def perplexity(y_true, y_pred):
    cross_entropy = K.categorical_crossentropy(y_true, y_pred)
    perplexity = K.exp(cross_entropy)
    return perplexity
vocab_size = 10
X = np.random.uniform(0,1, (1000,10))
y = pd.get_dummies(np.random.randint(0,vocab_size, 1000)).values # one-hot
model = Sequential()
model.add(Dense(64, activation='relu', input_dim=10))
model.add(Dense(vocab_size, activation='softmax'))
# compile network
model.compile(loss=perplexity, optimizer='adam', metrics=['accuracy'])
# fit network
model.fit(X, y, epochs=10, verbose=2)

https://stackoverflow.com/questions/62129733
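As a quick sanity check on the metric itself (not part of the original answer): a model that predicts a uniform distribution over V classes has cross-entropy ln(V), so its perplexity should come out to exactly V:

```python
import numpy as np

# For a uniform prediction over V classes, cross-entropy = ln(V),
# so perplexity = exp(ln V) = V.
V = 10
y_pred = np.full(V, 1.0 / V)          # uniform "softmax" output
cross_entropy = -np.log(y_pred[3])    # the true class index is arbitrary
perplexity = np.exp(cross_entropy)
print(round(perplexity, 6))  # 10.0
```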