I froze and exported a SavedModel that takes a batch of videos as input, per the saved_model_cli output:
The given SavedModel SignatureDef contains the following input(s):
inputs['ims_ph'] tensor_info:
dtype: DT_UINT8
shape: (1, 248, 224, 224, 3)
name: Placeholder:0
inputs['samples_ph'] tensor_info:
dtype: DT_FLOAT
shape: (1, 173774, 2)
name: Placeholder_1:0
The given SavedModel SignatureDef contains the following output(s):
... << OUTPUTS >> ......
Method name is: tensorflow/serving/predict

I have a TF Serving (HTTP/REST) server running successfully locally. In my Python client code, I have two populated numpy.ndarray objects: ims with shape (1, 248, 224, 224, 3) and samples with shape (1, 173774, 2).
I am trying to run inference against the TF model server (see the client code below), but I receive the following error: {u'error': u'JSON Parse error: Invalid value. at offset: 0'}
# I have tried the following combinations without success:
data = {"instances" : [{"ims_ph": ims.tolist()}, {"samples_ph": samples.tolist()} ]}
data = {"inputs" : { "ims_ph": ims, "samples_ph": samples} }
r = requests.post(url="http://localhost:9000/v1/models/multisensory:predict", data=data)

The TF Serving REST docs don't seem to indicate that these two input tensors need any extra escaping/encoding. Since this isn't binary data, I also don't think base64 encoding is the right approach. Any pointers to a working approach here would be much appreciated!
Posted on 2020-01-03 15:01:10
You should send your request like this, JSON-serializing the request body first:
r = requests.post(url="http://localhost:9000/v1/models/multisensory:predict", data=json.dumps(data))

https://stackoverflow.com/questions/59574255
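For completeness, a minimal sketch of a corrected request body. The input names, shapes, and endpoint come from the question; the tiny placeholder lists stand in for the real ims.tolist() / samples.tolist() values, which would be far too large to inline. It uses the columnar "inputs" format from the questioner's second attempt, with the ndarrays converted to nested lists (ndarrays are not JSON-serializable) and the whole body passed through json.dumps before posting:

```python
import json

# Placeholder stand-ins for the real arrays; in the actual client you
# would use ims.tolist() (shape (1, 248, 224, 224, 3)) and
# samples.tolist() (shape (1, 173774, 2)).
ims_list = [[[[[0, 0, 0]]]]]     # stand-in for ims.tolist()
samples_list = [[[0.0, 0.0]]]    # stand-in for samples.tolist()

# Columnar "inputs" format: both named tensors go in ONE dict, and the
# whole body is serialized to a JSON string before posting.
body = json.dumps({"inputs": {"ims_ph": ims_list,
                              "samples_ph": samples_list}})

# Sending (requests, as in the question):
# r = requests.post(
#     "http://localhost:9000/v1/models/multisensory:predict",
#     data=body)

print(json.loads(body)["inputs"].keys())
# -> dict_keys(['ims_ph', 'samples_ph'])
```

Passing a plain dict as data= makes requests form-encode it, which is why the server's JSON parser fails at offset 0; serializing with json.dumps fixes that.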