I have successfully trained, and locally predicted with, a DNNLinearCombinedClassifier from the AI Platform sample templates.
When I run pip freeze | grep tensorflow on my local PC, I get:
tensorflow==1.15.0
tensorflow-datasets==1.2.0
tensorflow-estimator==1.15.1
tensorflow-hub==0.6.0
tensorflow-io==0.8.0
tensorflow-metadata==0.15.1
tensorflow-model-analysis==0.15.4
tensorflow-probability==0.8.0
tensorflow-serving-api==1.15.0
When I run saved_model_cli show on my saved model, I get the following output:
The given SavedModel SignatureDef contains the following input(s):
  inputs['Sector'] tensor_info:
      dtype: DT_STRING
      shape: (-1)
      name: Placeholder_2:0
  inputs['announcement_type_simple'] tensor_info:
      dtype: DT_STRING
      shape: (-1)
      name: Placeholder_1:0
  inputs['market_cap'] tensor_info:
      dtype: DT_FLOAT
      shape: (-1)
      name: Placeholder_3:0
  inputs['sens_content'] tensor_info:
      dtype: DT_STRING
      shape: (-1)
      name: Placeholder:0
The given SavedModel SignatureDef contains the following output(s):
  outputs['all_class_ids'] tensor_info:
      dtype: DT_INT32
      shape: (-1, 3)
      name: head/predictions/Tile:0
  outputs['all_classes'] tensor_info:
      dtype: DT_STRING
      shape: (-1, 3)
      name: head/predictions/Tile_1:0
  outputs['class_ids'] tensor_info:
      dtype: DT_INT64
      shape: (-1, 1)
      name: head/predictions/ExpandDims_2:0
  outputs['classes'] tensor_info:
      dtype: DT_STRING
      shape: (-1, 1)
      name: head/predictions/str_classes:0
  outputs['logits'] tensor_info:
      dtype: DT_FLOAT
      shape: (-1, 3)
      name: dnn/logits/BiasAdd:0
  outputs['probabilities'] tensor_info:
      dtype: DT_FLOAT
      shape: (-1, 3)
      name: head/predictions/probabilities:0
Method name is: tensorflow/serving/predict
These inputs are consistent with what I put in my JSON file, shown below:
{"sens_content": "RFG 201411130005A Trading Statement Rhodes Food Group", "announcement_type_simple": "trade statement", "Sector": "Consumer, Non-cyclical", "market_cap": 4377615219.88}
The model infers correctly with gcloud ai-platform local predict.
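As a sanity check (this helper is my own, not part of the templates), each JSON line can be validated against the dtypes in the SignatureDef before it is sent:

```python
import json

# Expected input keys and Python types, taken from the saved_model_cli
# output: DT_STRING -> str, DT_FLOAT -> float.
EXPECTED = {
    "Sector": str,
    "announcement_type_simple": str,
    "market_cap": float,
    "sens_content": str,
}

def check_instance(instance):
    """Return a list of problems: missing keys, wrong types, unexpected keys."""
    problems = []
    for key, expected_type in EXPECTED.items():
        if key not in instance:
            problems.append("missing key: %s" % key)
        elif not isinstance(instance[key], expected_type):
            problems.append("wrong type for %s: %s"
                            % (key, type(instance[key]).__name__))
    for key in instance:
        if key not in EXPECTED:
            problems.append("unexpected key: %s" % key)
    return problems

line = ('{"sens_content": "RFG 201411130005A Trading Statement Rhodes Food Group", '
        '"announcement_type_simple": "trade statement", '
        '"Sector": "Consumer, Non-cyclical", "market_cap": 4377615219.88}')
print(check_instance(json.loads(line)))  # prints []
```

In my case this reports no problems, so the instance itself appears to match the signature.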
When I run gcloud ai-platform predict --model=${MODEL_NAME} --version=${MODEL_VERSION} --json-instances=data/new-data.json --verbosity debug --log-http, it creates the following POST request:
==== request start ====
uri: https://ml.googleapis.com/v1/projects/simon-teraflow-project/models/tensorflow_sens1/versions/v3:predict
method: POST
== headers start ==
Authorization: --- Token Redacted ---
Content-Type: application/json
user-agent: gcloud/270.0.0 command/gcloud.ai-platform.predict invocation-id/f01f2f4b8c494082abfc38e19499019b environment/GCE environment-version/None interactive/True from-script/False python/2.7.13 term/xterm (Linux 4.9.0-11-amd64)
== headers end ==
== body start ==
{"instances": [{"Sector": "Consumer, Non-cyclical", "announcement_type_simple": "trade statement", "market_cap": 4377615219.88, "sens_content": "RFG 201411130005A Trading Statement Rhodes Food Group"}]}
== body end ==
==== request end ====
You can see the input matches what is required. Here is the response:
Traceback (most recent call last):
File "/usr/lib/google-cloud-sdk/lib/googlecloudsdk/calliope/cli.py", line 984, in Execute
resources = calliope_command.Run(cli=self, args=args)
File "/usr/lib/google-cloud-sdk/lib/googlecloudsdk/calliope/backend.py", line 798, in Run
resources = command_instance.Run(args)
File "/usr/lib/google-cloud-sdk/lib/surface/ai_platform/predict.py", line 110, in Run
signature_name=args.signature_name)
File "/usr/lib/google-cloud-sdk/lib/googlecloudsdk/api_lib/ml_engine/predict.py", line 77, in Predict
response_body)
HttpRequestFailError: HTTP request failed. Response: {
"error": {
"code": 400,
"message": "Bad Request",
"status": "INVALID_ARGUMENT"
}
}
ERROR: (gcloud.ai-platform.predict) HTTP request failed. Response: {
"error": {
"code": 400,
"message": "Bad Request",
"status": "INVALID_ARGUMENT"
}
}
I tried the same thing via "Test your model" in the AI Platform console. Same result:
[Screenshot: prediction in the AI Platform GUI]

I have checked that the runtime version is 1.15, matching the one used for local prediction, and that the Python versions are also consistent.
I have searched for a similar case but found nothing. Any suggestions would be greatly appreciated.
Posted on 2019-12-27 00:30:07
You can try the following:
1) Save the model locally; you can use the Code snippet 1 example below, adapted to your model
2) Test it with Docker
3) Deploy the model to GCP and make requests to it (see 2), using the gcloud command instead of the GCP UI.
========Code snippet 1===============
import tensorflow as tf
import tensorflow_hub as hub

MODEL_NAME = <MODEL NAME>
VERSION = <MODEL VERSION>
SERVE_PATH = './models/{}/{}'.format(MODEL_NAME, VERSION)

use_model = "https://tfhub.dev/google/<MODEL NAME>/<MODEL VERSION>"

with tf.Graph().as_default():
    module = hub.Module(use_model, name=MODEL_NAME)
    text = tf.placeholder(tf.string, [None])
    embedding = module(text)

    init_op = tf.group([tf.global_variables_initializer(), tf.tables_initializer()])
    with tf.Session() as session:
        session.run(init_op)
        tf.saved_model.simple_save(
            session,
            SERVE_PATH,
            inputs={"text": text},
            outputs={"embedding": embedding},
            legacy_init_op=tf.tables_initializer()
        )
========/ Code snippet 1===============

Code snippet 2:
Replace <Project_name>, <model_name>, <bucket_name> and <model_version> with your own values:
$ gcloud ai-platform models create <model_name> --project <Project_name>
$ gcloud beta ai-platform versions create v1 --project <Project_name> --model <model_name> --origin=/location/of/model/dir/<model_name>/<model_version> --staging-bucket gs://<bucket_name> --runtime-version=1.15 --machine-type=n1-standard-8
$ echo '{"text": "cat"}' > instances.json
$ gcloud ai-platform predict --project <Project_name> --model <model_name> --version v1 --json-instances=instances.json
$ curl -X POST -v -k -H "Content-Type: application/json" -d '{"instances": [{"text": "cat"}]}' -H "Authorization: Bearer `gcloud auth print-access-token`" "https://ml.googleapis.com/v1/projects/<Project_name>/models/<model_name>/versions/v1:predict"

Source: https://stackoverflow.com/questions/59467089
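If you prefer to issue the same request from Python rather than curl, the URL and body can be built like this (a minimal sketch; the placeholder names match the commands above, and obtaining the access token and actually sending the request are left out):

```python
import json

def build_predict_request(project, model, version, instances):
    """Build the URL and JSON body for an AI Platform online-prediction call."""
    url = ("https://ml.googleapis.com/v1/projects/{}/models/{}"
           "/versions/{}:predict").format(project, model, version)
    body = json.dumps({"instances": instances})
    return url, body

# Same placeholders as in the commands above; substitute your own values.
url, body = build_predict_request("<Project_name>", "<model_name>", "v1",
                                  [{"text": "cat"}])
```

Sending it then only requires attaching the Content-Type header and a Bearer token, exactly as the curl command does.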