How to send a tf.Example into a TensorFlow Serving gRPC predict request

Published 2024-04-26 14:07:14


I have data in tf.Example form and am trying to make a request to a saved model in predict form (over gRPC). I cannot identify the method call to accomplish this.

I started from the well-known automobile pricing DNN regression model (https://github.com/tensorflow/models/blob/master/samples/cookbook/regression/dnn_regression.py), which I have exported and mounted via the TensorFlow Serving Docker container.

import grpc
import numpy as np
import tensorflow as tf
from tensorflow_serving.apis import predict_pb2, prediction_service_pb2_grpc

stub = prediction_service_pb2_grpc.PredictionServiceStub(grpc.insecure_channel("localhost:8500"))

tf_ex = tf.train.Example(
    features=tf.train.Features(
        feature={
            'curb-weight': tf.train.Feature(float_list=tf.train.FloatList(value=[5.1])),
            'highway-mpg': tf.train.Feature(float_list=tf.train.FloatList(value=[3.3])),
            'body-style': tf.train.Feature(bytes_list=tf.train.BytesList(value=[b"wagon"])),
            'make': tf.train.Feature(bytes_list=tf.train.BytesList(value=[b"Honda"])),
        }
    )
)

request = predict_pb2.PredictRequest()
request.model_spec.name = "regressor_test"

# Tried this:
request.inputs['inputs'].CopyFrom(tf_ex)

# Also tried this:
request.inputs['inputs'].CopyFrom(tf.contrib.util.make_tensor_proto(tf_ex))

# This doesn't work either:
request.input.example_list.examples.extend(tf_ex)

# If it did work, I would like to inference on it like this:
result = stub.Predict(request, 10.0)

Thanks for any suggestions.


2 Answers

@Hakunami's answer didn't work for me. But when I changed the last line to

# needs: from tensorflow.core.framework import types_pb2
request.inputs['inputs'].CopyFrom(
    tf.make_tensor_proto([tf_ex.SerializeToString()], dtype=types_pb2.DT_STRING, shape=[1]))

it worked. If shape is None, the resulting tensor proto represents the numpy array precisely.
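A minimal sketch of that corrected line, assuming TensorFlow 2 (where tf.make_tensor_proto is available at the top level and tf.string can stand in for types_pb2.DT_STRING); the feature keys are taken from the question:

```python
import tensorflow as tf
from tensorflow.core.framework import types_pb2

# Same kind of example as in the question (subset of features for brevity).
tf_ex = tf.train.Example(
    features=tf.train.Features(
        feature={
            'curb-weight': tf.train.Feature(float_list=tf.train.FloatList(value=[5.1])),
            'make': tf.train.Feature(bytes_list=tf.train.BytesList(value=[b"Honda"])),
        }
    )
)

# Serialize the proto to bytes, then wrap those bytes in a DT_STRING
# TensorProto of shape [1] (a batch of one serialized Example).
serialized = tf_ex.SerializeToString()
tensor_proto = tf.make_tensor_proto([serialized], dtype=tf.string, shape=[1])

print(tensor_proto.dtype == types_pb2.DT_STRING)  # True
```

This tensor_proto is what gets copied into request.inputs['inputs'] before calling Predict on the stub.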

I assume your SavedModel has a serving_input_receiver_fn that takes a string as input and parses it into a tf.Example (see "Using SavedModel with Estimators"):

def serving_example_input_receiver_fn():
    # Placeholder for a batch of serialized tf.Example protos.
    serialized_tf_example = tf.placeholder(dtype=tf.string)
    receiver_tensors = {'inputs': serialized_tf_example}
    # Parse the serialized protos into feature tensors.
    features = tf.parse_example(serialized_tf_example, YOUR_EXAMPLE_SCHEMA)
    return tf.estimator.export.ServingInputReceiver(features, receiver_tensors)

So the serving_input_receiver_fn accepts a string, which means you have to SerializeToString your tf.Example(). Besides, serving_input_receiver_fn works like an input_fn: data is dumped into the model in batches.

The code would then change to something like:

request = predict_pb2.PredictRequest()
request.model_spec.name = "regressor_test"
request.inputs['inputs'].CopyFrom(
    tf.make_tensor_proto([tf_ex.SerializeToString()], dtype=tf.string))
result = stub.Predict(request, 10.0)
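Since the receiver fn handles a batch, several examples can also be sent in one request by serializing each and setting the batch dimension accordingly. A sketch, using a hypothetical single-feature schema for illustration:

```python
import tensorflow as tf

def make_example(weight):
    # Hypothetical minimal schema with one float feature.
    return tf.train.Example(
        features=tf.train.Features(
            feature={
                'curb-weight': tf.train.Feature(
                    float_list=tf.train.FloatList(value=[weight])),
            }
        )
    )

# Serialize each example in the batch to its own string.
batch = [make_example(5.1).SerializeToString(),
         make_example(2.5).SerializeToString()]

# One string TensorProto holds the whole batch; shape [2] is the batch size.
tensor_proto = tf.make_tensor_proto(batch, dtype=tf.string, shape=[len(batch)])
print(tensor_proto.tensor_shape.dim[0].size)  # 2
```

The server then parses and scores both examples in a single Predict call.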
