This is the part of my code that raises a runtime error. It happens when I call x = Bidirectional and add the embedding. Can anyone help me?
import tensorflow as tf
import tensorflow_hub as hub

tf.compat.v1.disable_eager_execution()  # hub.Module is a TF1-style API
sess = tf.compat.v1.Session()
elmo_model = hub.Module("https://tfhub.dev/google/elmo/2", trainable=True)  # trainable=True for fine-tuning
sess.run(tf.compat.v1.global_variables_initializer())
sess.run(tf.compat.v1.tables_initializer())
# Define a function that applies the ELMo embedding model
def ElmoEmbedding(x):
    return elmo_model(
        inputs={
            "tokens": tf.squeeze(tf.cast(x, tf.string)),
            "sequence_len": tf.constant(batchSize * [maxSentLen]),
        },
        signature="tokens",
        as_dict=True,
    )["elmo"]
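Because `ElmoEmbedding` hard-codes `sequence_len` to `maxSentLen` for every row, each input sentence must be padded or truncated to exactly `maxSentLen` tokens before it reaches the model. A minimal sketch of such preprocessing (the helper name `pad_tokens`, the pad string `__PAD__`, and the length `5` are illustrative assumptions, not from the question):

```python
# Hypothetical helper: pad/truncate tokenized sentences to a fixed length so the
# "sequence_len" constant inside ElmoEmbedding matches the real input shape.
PAD_TOKEN = "__PAD__"   # assumed pad marker
maxSentLen = 5          # assumed value for illustration

def pad_tokens(sentences, max_len=maxSentLen, pad=PAD_TOKEN):
    """Return a list of token lists, each exactly max_len long."""
    padded = []
    for sent in sentences:
        sent = sent[:max_len]                        # truncate long sentences
        sent = sent + [pad] * (max_len - len(sent))  # pad short ones
        padded.append(sent)
    return padded

sents = [["the", "cat", "sat"],
         ["a", "very", "long", "sentence", "indeed", "truncated"]]
print(pad_tokens(sents))
```

Every row of the result has the same length, so `tf.cast(x, tf.string)` receives a dense, rectangular batch.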
from tensorflow.keras.layers import Input, Lambda, Bidirectional, LSTM, TimeDistributed, Dense, add
from tensorflow.keras.models import Model

inputLayer = Input(shape=(maxSentLen,), dtype="string")
# Note: the ELMo "elmo" output has dimension 1024, so embeddingDimension must be 1024
embedding = Lambda(ElmoEmbedding, output_shape=(maxSentLen, embeddingDimension))(inputLayer)
x = Bidirectional(LSTM(units=nUnits, return_sequences=True, recurrent_dropout=recurrentDropOutRate, dropout=dropOutRate))(embedding)
xRNN = Bidirectional(LSTM(units=nUnits, return_sequences=True, recurrent_dropout=recurrentDropOutRate, dropout=dropOutRate))(x)
x = add([x, xRNN])
out = TimeDistributed(Dense(nTags, activation="softmax"))(x)
model = Model(inputLayer, out)
# In the full script, the model-building code above is wrapped in a build_model() function that returns `model`:
model = build_model()
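One common cause of runtime errors with this pattern: `ElmoEmbedding` builds `tf.constant(batchSize * [maxSentLen])`, so the model only accepts batches of exactly `batchSize` sentences, and the final, smaller batch of a training set breaks it. A minimal sketch of one workaround, trimming the data to an exact multiple of the batch size (the helper name `trim_to_batches` and the sizes are illustrative assumptions):

```python
import numpy as np

batchSize = 32  # assumed value

def trim_to_batches(X, batch_size=batchSize):
    """Drop trailing samples so len(X) is an exact multiple of batch_size."""
    n = (len(X) // batch_size) * batch_size
    return X[:n]

X = np.zeros((1000, 50))
print(trim_to_batches(X).shape)  # 1000 // 32 * 32 == 992 rows remain
```

With the data trimmed this way (and `batch_size=batchSize` passed to `model.fit`), every batch matches the hard-coded `sequence_len` constant.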