I want to implement a custom layer. Essentially, I want to operate on a smaller subset of each input vector, so I feed batches of duplicated vectors into my custom layer. I assumed it would be easy to loop over the samples in each batch and reassemble a result tensor of the same shape. Apparently it is not. So I'm wondering how I can achieve the intended behaviour:
import tensorflow as tf
from keras import backend as K
from keras.initializers import Constant
from keras.layers import Layer


class MyLayer(Layer):
    def __init__(self):
        super().__init__()

    def build(self, input_shape):
        self.kernel = self.add_weight(name='kernel',
                                      shape=(self.parameters,),
                                      initializer=Constant(0.5),
                                      trainable=True)
        # Be sure to call this at the end
        super().build(input_shape)

    def call(self, x, **kwargs):
        res = []
        # does not work, NoneType is not iterable
        # for sample in range(K.shape(x)[0]):
        for sample in range(2):
            xs = x[sample][(sample * 5):]
            # simulate some verbose math on xs ..
            calc = K.zeros(xs.shape)
            res.append(K.concatenate([x[0][:(sample * 5)], calc]))
            # fails as well :-(
            # InvalidArgumentError: slice index 1 of dimension 0 out of bounds. [[{{node lppl_layer_74/strided_slice_12}}]]
        return K.stack(res)
To fit the model, I just used some junk data:
import numpy as np
from keras.callbacks import EarlyStopping
from keras.models import Sequential
from keras.optimizers import SGD

model = Sequential([MyLayer()])
model.compile(loss='mse', optimizer=SGD(0.2, 0.01))
x = np.random.random((100,))
x = x.reshape(1, -1)
x2 = np.vstack([x, x])
model.fit(x2, x2, epochs=5000, verbose=0, batch_size=x.shape[0],
          callbacks=[EarlyStopping('loss')])
No answers yet.
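Since no answer was posted: one common way to avoid looping over the (symbolic, hence `None`) batch dimension is to express the per-sample slicing as a mask and apply it with an element-wise multiply. The sketch below is an assumption about the intent, reproducing what the loop above computes when the "verbose math" returns zeros: for sample `i`, keep the first `i * 5` features and zero the rest. It is shown in plain NumPy for clarity; in TensorFlow the same mask can be built with `tf.sequence_mask` and multiplied into the input tensor without any Python loop.

```python
import numpy as np


def keep_head_zero_tail(x, step=5):
    """For sample i, keep the first i * step features and zero the rest.

    Vectorized equivalent of the per-sample loop in MyLayer.call
    (with the simulated math returning zeros): no iteration over
    the batch dimension is needed.
    """
    batch, features = x.shape
    # lengths[i] = number of leading features to keep for sample i
    lengths = np.arange(batch) * step
    # mask[i, j] = 1 if j < lengths[i] else 0 -- a "sequence mask"
    mask = (np.arange(features)[None, :] < lengths[:, None]).astype(x.dtype)
    return x * mask


x = np.ones((2, 10))
out = keep_head_zero_tail(x)
# sample 0 keeps 0 features, sample 1 keeps the first 5
```

Inside a Keras layer, the TensorFlow version would look like `mask = tf.sequence_mask(tf.range(batch) * 5, maxlen=features, dtype=x.dtype)` followed by `x * mask`, where `batch` and `features` come from `tf.shape(x)`; this works even when the static batch size is `None`.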