In a previous question I explored the purpose and structure of serving_input_receiver_fn, and in the answer:
def serving_input_receiver_fn():
    """For the sake of the example, let's assume your input to the network
    will be a 28x28 grayscale image that you'll then preprocess as needed."""
    input_images = tf.placeholder(dtype=tf.uint8,
                                  shape=[None, 28, 28, 1],
                                  name='input_images')
    # Here you do all the operations you need on the images before they can be
    # fed to the net (e.g., normalizing, reshaping, etc.). Let's assume
    # "images" is the resulting tensor.
    features = {'input_data': images}  # the dict passed as the "features" parameter to your model_fn
    receiver_tensors = {'input_data': input_images}  # as far as I understand, this maps the input to a name you can retrieve later
    return tf.estimator.export.ServingInputReceiver(features, receiver_tensors)
The author of that answer states (regarding receiver_tensors):

    As far as I understand this is needed to map the input to a name you can retrieve later
That distinction is unclear to me. In practice (see the colab), the same dictionary can be passed to both features and receiver_tensors.
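Concretely, the passthrough setup referred to above can be sketched like this (a minimal sketch, not from the original question: the names are illustrative, and tf.compat.v1 is used so it runs under TensorFlow 2, where tf.placeholder only exists in graph mode):

```python
import tensorflow as tf

tf.compat.v1.disable_eager_execution()  # placeholders require graph mode

def serving_input_receiver_fn():
    # No preprocessing at all: the very same dict is passed as both
    # "features" and "receiver_tensors".
    input_tensors = tf.compat.v1.placeholder(
        tf.float32, shape=[None, 28, 28, 1], name='input_tensors')
    tensors = {'input_data': input_tensors}
    return tf.estimator.export.ServingInputReceiver(
        features=tensors, receiver_tensors=tensors)
```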
From the source code of @estimator_export('estimator.export.ServingInputReceiver') (or the ServingInputReceiver docs):

- features: A Tensor, SparseTensor, or dict of string to Tensor or SparseTensor, specifying the features to be passed to the model. Note: if features passed is not a dict, it will be wrapped in a dict with a single entry, using 'feature' as the key. Consequently, the model must accept a feature dict of the form {'feature': tensor}. You may use TensorServingInputReceiver if you want the tensor to be passed as is.
- receiver_tensors: A Tensor, SparseTensor, or dict of string to Tensor or SparseTensor, specifying input nodes where this receiver expects to be fed by default. Typically, this is a single placeholder expecting serialized tf.Example protos.
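To make the "typical" case from the docs concrete, here is a hedged sketch (not from the question; the feature name, spec, and tf.compat.v1 usage are my assumptions) where receiver_tensors is a single placeholder of serialized tf.Example protos and features is the parsed result, so the two dicts genuinely differ:

```python
import tensorflow as tf

tf.compat.v1.disable_eager_execution()  # placeholders require graph mode

def serving_input_receiver_fn():
    # A single placeholder receiving a batch of serialized tf.Example protos.
    serialized = tf.compat.v1.placeholder(
        tf.string, shape=[None], name='input_example_tensor')
    receiver_tensors = {'examples': serialized}
    # Parse the raw protos into actual model inputs; this is where
    # receiver_tensors and features diverge.
    feature_spec = {'image': tf.io.FixedLenFeature([28 * 28], tf.float32)}
    features = tf.io.parse_example(serialized, feature_spec)
    return tf.estimator.export.ServingInputReceiver(features, receiver_tensors)
```

Here the keys of the two dicts need not match: 'examples' names the raw input node, while 'image' is what the model_fn sees.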
After reading that, the purpose of features is clear to me: features is the dict of inputs that is then sent through the graph. Many common models have just a single input, but you can of course have more.
So the statement about receiver_tensors, "Typically, this is a single placeholder expecting serialized tf.Example protos.", suggests to me that receiver_tensors wants a single batched placeholder for (Sequence)Examples parsed from TFRecords.
Why? If the TFRecords are fully preprocessed, isn't this redundant? And if they are not fully preprocessed, why pass them at all? Should the keys of the features and receiver_tensors dicts be the same?

Can someone give me a more concrete example of the difference and of what goes where, given that right now passing the same dict for both works... (even if perhaps it shouldn't...)
If you do preprocessing inside the serving input function, then the receiver tensors and the features will differ. The features are what gets passed to the model after preprocessing inside the serving input function. The receiver tensors are the inputs to the serving input function; they can be in tf.Example format.
The job of the serving input function is to convert the raw features it receives into the processed features that your model function accepts.

receiver_tensors: these are the input placeholders. This is where your graph opens up to receive your raw input features. After defining these placeholders, you perform transformations on the receiver tensors to convert them into features the model accepts. Such transformations can include preprocessing the received data, or parsing tf.Example protos out of TFRecords if that is what you feed the serving function.

features: once you have transformed the receiver tensors, you obtain the features that are fed directly to your model function during prediction.

In your case, the data provided to the serving input function does not need any preprocessing; hence features = receiver_tensors is working.

As far as I can tell, Swapnil's answer is correct. I would like to share one of my own examples.
Suppose the graph's input is a placeholder of shape [None, 64], but what we get from upstream is an array of 32 floats, which we need to process into shape [None, 64], for example by simply repeating it.

Of course, we could instead do this processing outside and feed the estimator data already shaped like the graph's input; in that case the upstream process concatenates the inputs itself, the raw input arrives as [None, 64], and the serving function only has to pass it through. If the repetition happens at serving time, however, the serving input function has to perform that transformation itself.
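A minimal sketch of such a serving function, assuming the repetition is a simple tf.tile and using tf.compat.v1 so it runs under TensorFlow 2 (names are illustrative, not from the original answer):

```python
import tensorflow as tf

tf.compat.v1.disable_eager_execution()  # placeholders require graph mode

def serving_input_receiver_fn():
    # Upstream sends arrays of 32 floats; the graph expects [None, 64].
    raw = tf.compat.v1.placeholder(tf.float32, shape=[None, 32], name='raw_input')
    # Simply repeat the 32 values along the feature axis: [None, 32] -> [None, 64].
    processed = tf.tile(raw, [1, 2])
    return tf.estimator.export.ServingInputReceiver(
        features={'input_data': processed},
        receiver_tensors={'raw_input': raw})
```

Here, as in the answers above, receiver_tensors holds the raw placeholder while features holds the transformed tensor that the model actually consumes.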