How do I load the trained weights of certain layers of one network into a new network in RETURNN?

I have the trained weights of the following network in the folder /to/modelFile:

network={
"conv_1" : {"class": "conv", "filter_size": (400,), "activation":"abs" , "padding": "valid", "strides": 10, "n_out": 64 },
"pad_conv_1_time_dim" : {"class": "pad", "axes": "time", "padding": 20, "from": ["conv_1"]},
"conv_2" : {"class": "conv", "input_add_feature_dim": True, "filter_size": (40, 64), "activation":"abs", "padding": "valid","strides": 16, "n_out": 128, "from": ["pad_conv_1_time_dim"]},
"flatten_conv": {"class": "merge_dims", "axes": "except_time","n_out": 128,  "from": ["conv_2"]},
"window_1": {"class": "window", "window_size": 17, "from": ["flatten_conv"]},
"flatten_window": {"class": "merge_dims", "axes":"except_time","from": ["window_1"]},
"lin_1" :   { "class" : "linear", "activation": None, "n_out": 512,"from" : ["flatten_window"] },
"ff_2" :   { "class" : "linear", "activation": "relu", "n_out": 2000, "from" : ["lin_1"] },
"output" :   { "class" : "softmax", "loss" : "ce", "from" : ["ff_2"] }
}

I want to load the trained weights of the "conv_1" and "conv_2" layers into the following network:

network={
"conv_1" : {"class": "conv", "filter_size": (400,), "activation": "abs" , "padding": "valid", "strides": 10, "n_out": 64 },
"pad_conv_1_time_dim" : {"class": "pad", "axes": "time", "padding": 20, "from": ["conv_1"]},
"conv_2" : {"class": "conv", "input_add_feature_dim": True, "filter_size": (40, 64), "activation":"abs", "padding": "valid", "strides": 16, "n_out": 128, "from": ["pad_conv_1_time_dim"]},
"flatten_conv": {"class": "merge_dims", "axes": "except_time", "n_out": 128,  "from": ["conv_2"]},
"lstm1_fw" : { "class": "rec", "unit": "lstmp", "n_out" : rnnLayerNodes, "direction": 1, "from" : ['flatten_conv'] },
"lstm1_bw" : { "class": "rec", "unit": "lstmp", "n_out" : rnnLayerNodes, "direction": -1, "from" : ['flatten_conv'] },
"lin_1" :   { "class" : "linear", "activation": None, "n_out": 512, "from" : ["lstm1_fw", "lstm1_bw"] },
"ff_2" :   { "class" : "linear", "activation": "relu", "n_out": 2000, "from" : ["lin_1"] },
"ff_3" :   { "class" : "linear", "activation": "relu", "n_out": 2000,"from" : ["ff_2"] },
"ff_4" :   { "class" : "linear", "activation": "relu", "n_out": 2000,"from" : ["ff_3"] },
"output" :   { "class" : "softmax", "loss" : "ce", "from" : ["ff_4"] }
}

How can this be done in RETURNN?


1 Answer

Using a SubnetworkLayer is one option. That would look like this:

trained_network_model_file = 'path/to/model_file'

trained_network = {
"conv_1" : {"class": "conv", "filter_size": (400,), "activation": "abs" , "padding": "valid", "strides": 10, "n_out": 64 },
"pad_conv_1_time_dim" : {"class": "pad", "axes": "time", "padding": 20, "from": ["conv_1"]},
"conv_2" : {"class": "conv", "input_add_feature_dim": True, "filter_size": (40, 64), "activation":"abs", "padding": "valid", "strides": 16, "n_out": 128, "from": ["pad_conv_1_time_dim"]},
"flatten_conv": {"class": "merge_dims", "axes": "except_time","n_out": 128,  "from": ["conv_2"]}
}

network = {
"conv_layers" : { "class" : "subnetwork", "subnetwork": trained_network, "load_on_init": trained_network_model_file, "n_out": 128},
"lstm1_fw" : { "class": "rec", "unit": "lstmp", "n_out" : rnnLayerNodes, "direction": 1, "from" : ['conv_layers'] },
"lstm1_bw" : { "class": "rec", "unit": "lstmp", "n_out" : rnnLayerNodes, "direction": -1, "from" : ['conv_layers'] },
"lin_1" :   { "class" : "linear", "activation": None, "n_out": 512, "from" : ["lstm1_fw", "lstm1_bw"] },
"ff_2" :   { "class" : "linear", "activation": "relu", "n_out": 2000, "from" : ["lin_1"] },
"ff_3" :   { "class" : "linear", "activation": "relu", "n_out": 2000, "from" : ["ff_2"] },
"ff_4" :   { "class" : "linear", "activation": "relu", "n_out": 2000, "from" : ["ff_3"] },
"output" :   { "class" : "softmax", "loss" : "ce", "from" : ["ff_4"] }
}

I guess this would be my preferred solution in your case.
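
One detail worth double-checking against the RETURNN documentation (this is an assumption, not something verified here): the SubnetworkLayer normally takes its output from a sub-layer named "output", so you may need to add an explicit copy layer on top of "flatten_conv", e.g.:

# Assumption: SubnetworkLayer resolves its output via the sub-layer named "output".
trained_network["output"] = {"class": "copy", "from": ["flatten_conv"]}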

Otherwise, every layer also has the custom_param_importer option, which you could use for this.
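
As a rough illustration only, a custom_param_importer callback could look roughly like the sketch below. The callback signature (layer, values_dict, session) and the assumption that values_dict maps parameter names to numpy arrays are guesses on my part; verify them against the RETURNN CustomCheckpointLoader source before using this.

def import_conv1_params(layer, values_dict, session, **kwargs):
    # Hypothetical callback: assign externally loaded values to the layer's params.
    # Assumed signature (layer, values_dict, session); check the RETURNN source.
    for param_name, param_var in layer.params.items():
        if param_name in values_dict:
            # TF1-style assignment of a numpy value to the variable.
            param_var.load(values_dict[param_name], session=session)

network["conv_1"]["custom_param_importer"] = import_conv1_params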

Then, for many layers you can define initializers for their parameters, e.g. forward_weights_init for the ConvLayer. There is a function like load_txt_file_initializer which you could use, or a similar function could be added to load directly from a TF checkpoint file.
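
A hedged sketch of that last idea (the file name is a placeholder, and it assumes the trained "conv_1" weights were first dumped to a plain text file in the format load_txt_file_initializer expects):

# Placeholder file name; the weights of the trained "conv_1" layer would have to be
# exported to this text file first.
network["conv_1"]["forward_weights_init"] = \
    "load_txt_file_initializer(filename='conv_1_weights.txt')"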
