
Converting a graph proto (pb/pbtxt) to SavedModel for use in TensorFlow Serving or Cloud ML Engine
Stack Overflow user
Asked on 2017-06-02 20:38:38
2 answers · 18.5K views · 0 following · 10 votes

I have been following the TensorFlow for Poets 2 codelab on a model I trained, and have created a frozen, quantized graph with embedded weights. It is captured in a single file, e.g. my_quant_graph.pb.

Since I can use that graph for inference with the TensorFlow Android inference library just fine, I thought I could do the same with Cloud ML Engine, but it seems it only works on a SavedModel.

How can I simply convert a frozen/quantized graph in a single pb file to use on ML Engine?


2 Answers

Stack Overflow user

Accepted answer

Posted on 2017-06-02 20:39:25

It turns out a SavedModel provides some extra information about a saved graph. Assuming a frozen graph doesn't need assets, it only needs a serving signature to be specified.

Here's the Python code I ran to convert my graph to a format that Cloud ML Engine accepted. Note that I have only a single pair of input/output tensors.

import tensorflow as tf
from tensorflow.python.saved_model import signature_constants
from tensorflow.python.saved_model import tag_constants

export_dir = './saved'
graph_pb = 'my_quant_graph.pb'

builder = tf.saved_model.builder.SavedModelBuilder(export_dir)

with tf.gfile.GFile(graph_pb, "rb") as f:
    graph_def = tf.GraphDef()
    graph_def.ParseFromString(f.read())

sigs = {}

with tf.Session(graph=tf.Graph()) as sess:
    # name="" is important to ensure we don't get spurious prefixing
    tf.import_graph_def(graph_def, name="")
    g = tf.get_default_graph()
    inp = g.get_tensor_by_name("real_A_and_B_images:0")
    out = g.get_tensor_by_name("generator/Tanh:0")

    sigs[signature_constants.DEFAULT_SERVING_SIGNATURE_DEF_KEY] = \
        tf.saved_model.signature_def_utils.predict_signature_def(
            {"in": inp}, {"out": out})

    builder.add_meta_graph_and_variables(sess,
                                         [tag_constants.SERVING],
                                         signature_def_map=sigs)

builder.save()
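The `get_tensor_by_name` calls above use TensorFlow's `"op_name:output_index"` naming convention, where `:0` refers to the op's first output tensor. A minimal sketch of that convention in pure Python (`split_tensor_name` is a hypothetical helper, not a TensorFlow API):

```python
def split_tensor_name(name):
    """Split a tensor name like 'generator/Tanh:0' into (op_name, output_index).

    The output index defaults to 0 when no ':index' suffix is present,
    mirroring TensorFlow's tensor-naming convention.
    """
    op, sep, idx = name.rpartition(":")
    if not sep:  # no colon at all: bare op name
        return name, 0
    return op, int(idx)

print(split_tensor_name("generator/Tanh:0"))       # ('generator/Tanh', 0)
print(split_tensor_name("real_A_and_B_images:0"))  # ('real_A_and_B_images', 0)
```

This is why the code asks the imported graph for `"real_A_and_B_images:0"` rather than just `"real_A_and_B_images"`: the signature needs concrete tensors, not ops.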
Votes: 23

Stack Overflow user

Posted on 2021-04-10 04:47:09

Example with multiple output nodes:

# Convert a ProtoBuf model to saved_model, the format for TF Serving
# https://cloud.google.com/ai-platform/prediction/docs/exporting-savedmodel-for-prediction
import shutil
import tensorflow.compat.v1 as tf
from tensorflow.python.saved_model import signature_constants
from tensorflow.python.saved_model import tag_constants

export_dir = './1' # TF Serving supports run different versions of same model. So we put current model to '1' folder.
graph_pb = 'frozen_inference_graph.pb'

# Clear out folder
shutil.rmtree(export_dir, ignore_errors=True)

builder = tf.saved_model.builder.SavedModelBuilder(export_dir)

with tf.io.gfile.GFile(graph_pb, "rb") as f:
    graph_def = tf.GraphDef()
    graph_def.ParseFromString(f.read())

sigs = {}

with tf.Session(graph=tf.Graph()) as sess:
    # Prepare input and outputs of model
    tf.import_graph_def(graph_def, name="")
    g = tf.get_default_graph()
    image_tensor = g.get_tensor_by_name("image_tensor:0")
    num_detections = g.get_tensor_by_name("num_detections:0")
    detection_scores = g.get_tensor_by_name("detection_scores:0")
    detection_boxes = g.get_tensor_by_name("detection_boxes:0")
    detection_classes = g.get_tensor_by_name("detection_classes:0")

    sigs[signature_constants.DEFAULT_SERVING_SIGNATURE_DEF_KEY] = \
        tf.saved_model.signature_def_utils.predict_signature_def(
            {"input_image": image_tensor}, 
            {   "num_detections": num_detections,
                "detection_scores": detection_scores, 
                "detection_boxes": detection_boxes, 
                "detection_classes": detection_classes})

    builder.add_meta_graph_and_variables(sess,
                                         [tag_constants.SERVING],
                                         signature_def_map=sigs)

builder.save()
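The `export_dir = './1'` comment above refers to TF Serving's convention of serving the highest-numbered version subdirectory under a model's base path. A small sketch of that version-selection logic (`latest_version_dir` is a hypothetical helper for illustration, not part of TF Serving's API):

```python
import os
import tempfile


def latest_version_dir(base):
    """Return the subdirectory of `base` with the highest numeric name,
    mimicking how TF Serving picks the latest model version by default."""
    versions = [d for d in os.listdir(base) if d.isdigit()]
    if not versions:
        return None
    return os.path.join(base, max(versions, key=int))


# Simulate a model base path containing three exported versions.
base = tempfile.mkdtemp()
for v in ("1", "2", "10"):
    os.makedirs(os.path.join(base, v))

print(os.path.basename(latest_version_dir(base)))  # 10
```

Note that the comparison is numeric, not lexicographic: version "10" beats "2", which is why exporting a new model to the next integer folder is enough for TF Serving to pick it up.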
Votes: 0
The original page content is provided by Stack Overflow.
Original link:

https://stackoverflow.com/questions/44329185
