Please note: this question also exists as an issue on GitHub.

Update:

I tried to implement a custom layer in Keras: a GRU layer that computes its gates via convolution. The code works, but only with the Theano backend.

On GitHub I was given the hint to switch the image dim ordering in keras.json to "tf". In fact, I had hard-coded the "th" image ordering, so I had to fix a few things in the code. It now supports both the "th" and the "tf" ordering. The model was modified as well: I now have two models, one for "th" and one for "tf".

This completely rewrites the question. The problem is the same, but the code is new.

With Theano as the backend, both image orderings run fine. The network learns from the data, and the new layer appears to work. I am fairly confident that both my "th" and my "tf" implementations are correct. But as soon as I switch to TensorFlow, it always crashes. The ordering setting seems to have no effect.
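For reference, the two orderings differ only in the position of the channel axis. A minimal sketch of the conversion between them (the helper names here are mine, not from the question's code):

```python
# 'th' (Theano) ordering:     (channels, rows, cols)
# 'tf' (TensorFlow) ordering: (rows, cols, channels)

def th_to_tf_shape(shape):
    """Move the leading channel axis of a 'th' shape to the end ('tf')."""
    channels, rows, cols = shape
    return (rows, cols, channels)

def tf_to_th_shape(shape):
    """Move the trailing channel axis of a 'tf' shape to the front ('th')."""
    rows, cols, channels = shape
    return (channels, rows, cols)

print(th_to_tf_shape((3, 40, 40)))  # (40, 40, 3)
print(tf_to_th_shape((40, 40, 3)))  # (3, 40, 40)
```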
There is one trick in the code: the GRU uses a convolution to compute its gates, but that convolution is not part of the layer itself. I simply assume that every use of this layer is preceded by a 2D convolution producing 3 output features (one per gate). The code follows:
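To make the "convolution computes the gates" idea concrete, here is a scalar sketch of one GRU step in which the three input contributions are assumed to have been precomputed by the preceding convolution. The names and the scalar simplification are mine; the actual layer operates on whole feature maps:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def gru_step(h_prev, conv_z, conv_r, conv_h, u_z, u_r, u_h):
    """One GRU step. conv_z, conv_r, conv_h are the three per-gate input
    contributions (assumed to come from the preceding convolution);
    u_z, u_r, u_h are the recurrent weights."""
    z = sigmoid(conv_z + u_z * h_prev)                # update gate
    r = sigmoid(conv_r + u_r * h_prev)                # reset gate
    h_cand = math.tanh(conv_h + u_h * (r * h_prev))   # candidate state
    return z * h_prev + (1.0 - z) * h_cand            # new hidden state

print(gru_step(0.0, 0.0, 0.0, 0.0, 1.0, 1.0, 1.0))  # 0.0
```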
Sorry for the large amount of code; I have commented and formatted it as well as I could.

Here is the setup and how I use the code:

The same code with the "tf" image ordering runs fine under Theano and crashes with TensorFlow.

The problem is: ValueError: Shapes (?, ?, 40, 40) and (40, ?, 40) are not compatible

Here is the full error message:
---------------------------------------------------------------------------
ValueError Traceback (most recent call last)
<ipython-input-8-295ff1bf414f> in <module>()
25 # now a model with the two GRUs which move in different directions
26 time_dist=Reshape((40,40,40, 3))(time_dist)
---> 27 up=CGRU(go_backwards=False, return_sequences=True, name="up", input_shape=[40,40,40,3])(time_dist)
28 down=CGRU(go_backwards=True, return_sequences=True, name="down", input_shape=[40,40,40,3])(time_dist)
29
/usr/local/lib/python2.7/dist-packages/keras/engine/topology.pyc in __call__(self, x, mask)
513 if inbound_layers:
514 # this will call layer.build() if necessary
--> 515 self.add_inbound_node(inbound_layers, node_indices, tensor_indices)
516 input_added = True
517
/usr/local/lib/python2.7/dist-packages/keras/engine/topology.pyc in add_inbound_node(self, inbound_layers, node_indices, tensor_indices)
571 # creating the node automatically updates self.inbound_nodes
572 # as well as outbound_nodes on inbound layers.
--> 573 Node.create_node(self, inbound_layers, node_indices, tensor_indices)
574
575 def get_output_shape_for(self, input_shape):
/usr/local/lib/python2.7/dist-packages/keras/engine/topology.pyc in create_node(cls, outbound_layer, inbound_layers, node_indices, tensor_indices)
148
149 if len(input_tensors) == 1:
--> 150 output_tensors = to_list(outbound_layer.call(input_tensors[0], mask=input_masks[0]))
151 output_masks = to_list(outbound_layer.compute_mask(input_tensors[0], input_masks[0]))
152 # TODO: try to auto-infer shape if exception is raised by get_output_shape_for
/usr/local/lib/python2.7/dist-packages/keras/layers/recurrent.pyc in call(self, x, mask)
211 constants=constants,
212 unroll=self.unroll,
--> 213 input_length=input_shape[1])
214 if self.stateful:
215 self.updates = []
/usr/local/lib/python2.7/dist-packages/keras/backend/tensorflow_backend.pyc in rnn(step_function, inputs, initial_states, go_backwards, mask, constants, unroll, input_length)
1193 parallel_iterations=32,
1194 swap_memory=True,
--> 1195 sequence_length=None)
1196
1197 if nb_states > 1:
/usr/local/lib/python2.7/dist-packages/tensorflow/python/ops/rnn.pyc in _dynamic_rnn_loop(cell, inputs, initial_state, parallel_iterations, swap_memory, sequence_length, dtype)
1023 shape = _state_size_with_prefix(
1024 output_size, prefix=[const_time_steps, const_batch_size])
-> 1025 output.set_shape(shape)
1026
1027 final_outputs = nest.pack_sequence_as(
/usr/local/lib/python2.7/dist-packages/tensorflow/python/framework/ops.pyc in set_shape(self, shape)
406 this tensor.
407 """
--> 408 self._shape = self._shape.merge_with(shape)
409
410 @property
/usr/local/lib/python2.7/dist-packages/tensorflow/python/framework/tensor_shape.pyc in merge_with(self, other)
568 except ValueError:
569 raise ValueError("Shapes %s and %s are not compatible" %
--> 570 (self, other))
571
572 def concatenate(self, other):
ValueError: Shapes (?, ?, 40, 40) and (40, ?, 40) are not compatible
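For what it's worth, the final ValueError comes from TensorFlow's shape-merging rule: two shapes merge only if they have the same rank and every pair of dimensions is equal or at least one side is unknown (`?`). A simplified sketch of that rule (this is my mimicry, not TensorFlow's actual TensorShape implementation):

```python
# None stands for an unknown dimension, printed as '?' by TensorFlow.
def shapes_compatible(a, b):
    """True if two shapes could be merged under TensorFlow's rule."""
    if len(a) != len(b):  # rank mismatch -> never compatible
        return False
    return all(x is None or y is None or x == y for x, y in zip(a, b))

# The two shapes from the traceback: rank 4 vs rank 3, so incompatible.
print(shapes_compatible((None, None, 40, 40), (40, None, 40)))  # False
```

Note that the two shapes above do not even have the same rank, which suggests the per-timestep output of my layer has one axis more (or fewer) than what the TensorFlow rnn loop expects.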