Concatenating hidden units with Keras

Posted 2024-04-26 07:07:50


I am trying to concatenate hidden units. For example, I have 3 units, h1, h2, h3, and I want the new layer to contain all ordered pairs [h1;h1], [h1;h2], [h1;h3], [h2;h1], and so on.
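For concreteness, the desired ordered-pair concatenation can be sketched in plain NumPy (the array `h` and its sizes are made up for illustration):

```python
from itertools import product
import numpy as np

# Hypothetical example: 3 hidden units, each a vector of 4 features.
h = np.arange(12, dtype=float).reshape(3, 4)  # rows are h1, h2, h3

# All ordered pairs (hi, hj), concatenated along the feature axis.
pairs = np.stack([np.concatenate([h[i], h[j]])
                  for i, j in product(range(3), repeat=2)])
print(pairs.shape)  # (9, 8): 3**2 ordered pairs, each of length 2*4
```

The goal in the layer below is to do the same thing on a batched tensor of shape (batch_size, timesteps, features).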

So I tried:

from keras.layers import Layer
from keras import initializers, regularizers, constraints
import keras.backend as K

class MyLayer(Layer):
    def __init__(self, W_regularizer=None, W_constraint=None, **kwargs):
        self.init = initializers.get('glorot_uniform')
        self.W_regularizer = regularizers.get(W_regularizer)
        self.W_constraint = constraints.get(W_constraint)
        super(MyLayer, self).__init__(**kwargs)

    def build(self, input_shape):
        assert len(input_shape) == 3
        # Create a trainable weight variable for this layer.
        self.W = self.add_weight((input_shape[-1], input_shape[-1]),
                                 initializer=self.init,
                                 name='{}_W'.format(self.name),
                                 regularizer=self.W_regularizer,
                                 constraint=self.W_constraint,
                                 trainable=True)
        super(MyLayer, self).build(input_shape)

    def call(self, x):
        conc = K.concatenate([x[:, :-1, :], x[:, 1:, :]], axis=1)  # help needed here
        uit = K.dot(conc, self.W)  # W has shape (input_shape[-1], input_shape[-1])
        return uit

    def compute_output_shape(self, input_shape):
        return input_shape[0], input_shape[1], input_shape[-1]

I am not sure what I should return for the second dimension of the output shape.

from keras.layers import Input, LSTM
from keras.models import Model

input = Input(shape=(3, 100))  # hypothetical shape: 3 timesteps, 100 features
lstm = LSTM(64, return_sequences=True)(input)
something = MyLayer()(lstm)

2 Answers

You can implement the described concatenation by using itertools.product to compute the Cartesian product of the time-dimension indices. The call method would look like this:

from itertools import product

def call(self, x):
    prod = product(range(nb_timesteps), repeat=2)
    conc_prod = []
    for i, j in prod:
        c = K.concatenate([x[:, i, :], x[:, j, :]], axis=-1)  # Shape=(batch_size, 2*nb_features)
        c_expanded = c[:, None, :]  # Shape=(batch_size, 1, 2*nb_features)
        conc_prod.append(c_expanded)
    conc = K.concatenate(conc_prod, axis=1)  # Shape=(batch_size, nb_timesteps**2, 2*nb_features)
    uit = K.dot(conc, self.W)  # W has shape (2*input_shape[-1], input_shape[-1])
    return uit  # Shape=(batch_size, nb_timesteps**2, nb_features)

In the example you provided, nb_timesteps would be 3. Also note that the weights must have shape (2*input_shape[-1], input_shape[-1]) for the dot product to be valid.
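The shape reasoning above can be checked with a small NumPy sketch (batch_size, nb_timesteps, and nb_features are made-up values for illustration):

```python
from itertools import product
import numpy as np

batch_size, nb_timesteps, nb_features = 2, 3, 4
x = np.random.rand(batch_size, nb_timesteps, nb_features)
W = np.random.rand(2 * nb_features, nb_features)  # note the 2* on the input side

# Same logic as the call method: concatenate every ordered pair of timesteps.
conc = np.concatenate(
    [np.concatenate([x[:, i, :], x[:, j, :]], axis=-1)[:, None, :]
     for i, j in product(range(nb_timesteps), repeat=2)],
    axis=1,
)  # Shape=(batch_size, nb_timesteps**2, 2*nb_features)
uit = conc @ W  # Shape=(batch_size, nb_timesteps**2, nb_features)
print(conc.shape, uit.shape)  # (2, 9, 8) (2, 9, 4)
```

This also answers the question about compute_output_shape: the second dimension becomes nb_timesteps**2.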

Disclaimer: I'm not sure what you are trying to achieve.
