I am trying to write a custom recurrent layer in Keras using recurrentshop. The input is a sequence of length `timesteps`, and the output should be the sequence of outputs at all timesteps, so I use return_sequences=True.
The recurrent model has two inputs, one of which is the recurrent input carried over from the previous step. I wrote the layer following the recurrentshop examples, but I keep getting the error: "You must feed a value for placeholder tensor 'input_2' with dtype float". What am I doing wrong? The full code is below:
import keras.backend as K
from recurrentshop import RecurrentModel
import numpy as np
from keras.models import Model
from keras.layers import Dense, Reshape, Conv1D, Input, Lambda, concatenate
from keras.optimizers import Adam
# parameters:
timesteps = 35
output_dim = 315
input_dim = 10
batch_size = 100
# recurrent layer definition:
def myRNN(input_dim, output_dim):
    inp = Input((input_dim,))
    h_tm1 = Input((output_dim,))
    modified_h = Lambda(lambda x: x * K.sum(K.square(inp)))(h_tm1)
    modified_inp = Dense(output_dim, use_bias=False, activation='tanh')(inp)
    modified_inp = Reshape((output_dim, 1))(modified_inp)
    modified_inp = Conv1D(128, 7, padding='same', activation='tanh', use_bias=False)(modified_inp)
    modified_inp = Lambda(lambda x: K.sum(x, axis=-1))(modified_inp)
    hid = concatenate([modified_h, modified_inp], axis=-1)
    h_t = Dense(output_dim, use_bias=False, activation='tanh')(hid)
    return RecurrentModel(input=inp, output=h_t, initial_states=h_tm1, final_states=h_t,
                          return_sequences=True, state_initializer=['zeros'])
# building the model:
inp = Input((timesteps, input_dim))
temp = myRNN(input_dim,output_dim)(inp)
out = Reshape((timesteps*output_dim,1))(temp)
model = Model(inputs=inp, outputs=out)
model.compile(loss='mse', optimizer='adam')
# testing the model:
inp = np.random.rand(batch_size, timesteps, input_dim)
prediction = model.predict(inp)

Answered on 2018-05-04 05:52:48
It turns out the problem was in the Lambda layer:

    modified_h = Lambda(lambda x: x * K.sum(K.square(inp)))(h_tm1)

Here the lambda closes over `inp` from the enclosing scope instead of receiving it as an explicit input, so the Lambda's graph keeps a reference to the original placeholder, which is left unfed at run time. The problem is solved by defining a function that takes both tensors as arguments:

    def factored_h(arg):
        norm = K.sum(K.square(arg[0]))
        return arg[1] * norm

and rewriting the Lambda layer as:

    modified_h = Lambda(factored_h)([inp, h_tm1])

https://stackoverflow.com/questions/50133081
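The fix generalizes to any Lambda that needs more than one tensor: pass every tensor the function uses in a list, rather than closing over graph tensors from an outer scope. A minimal sketch of the pattern, standalone and without recurrentshop (it uses tf.keras and a per-sample squared norm, i.e. `K.sum(..., axis=-1, keepdims=True)`, which differs slightly from the global sum in the question; the names here are illustrative):

```python
import numpy as np
from tensorflow.keras import backend as K
from tensorflow.keras.models import Model
from tensorflow.keras.layers import Input, Lambda

def scale_by_sq_norm(args):
    # Both tensors arrive through the Lambda's input list, so Keras
    # tracks them properly instead of leaving a dangling placeholder.
    x, h = args
    return h * K.sum(K.square(x), axis=-1, keepdims=True)

x_in = Input((4,))
h_in = Input((3,))
scaled = Lambda(scale_by_sq_norm)([x_in, h_in])  # explicit two-tensor input
m = Model(inputs=[x_in, h_in], outputs=scaled)

x = np.ones((2, 4), dtype='float32')  # squared norm of each row is 4
h = np.ones((2, 3), dtype='float32')
out = m.predict([x, h])               # each entry of h scaled by 4
```

Had the function referenced `x_in` via a closure and the Lambda been called on `h_in` alone, `x_in` would not be registered as an input of the layer, which is exactly the failure mode in the question.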