I installed the Intel Distribution of OpenVINO to run inference on a Neural Compute Stick 2. This worked for our in-house-trained TensorFlow SSD models. However, the stick could not handle any of our Faster R-CNN architectures. To resolve this, I tried converting a TensorFlow 1.13 Faster R-CNN ResNet101 model to the OpenVINO IR format using the Model Optimizer included in the Windows 10 installation. During the conversion I got the following error:
[ ERROR ] -------------------------------------------------
[ ERROR ] ----------------- INTERNAL ERROR ----------------
[ ERROR ] Unexpected exception happened.
[ ERROR ] Please contact Model Optimizer developers and forward the following information:
[ ERROR ] [Errno 13] Permission denied: 'C:\\Program Files (x86)\\IntelSWTools\\openvino\\deployment_tools\\model_optimizer\\.\\frozen_inference_graph.bin'
[ ERROR ] Traceback (most recent call last):
File "C:\Program Files (x86)\IntelSWTools\openvino\deployment_tools\model_optimizer\mo\main.py", line 309, in main
ret_code = driver(argv)
File "C:\Program Files (x86)\IntelSWTools\openvino\deployment_tools\model_optimizer\mo\main.py", line 270, in driver
ret_res = emit_ir(prepare_ir(argv), argv)
File "C:\Program Files (x86)\IntelSWTools\openvino\deployment_tools\model_optimizer\mo\main.py", line 254, in emit_ir
meta_info=get_meta_info(argv))
File "C:\Program Files (x86)\IntelSWTools\openvino\deployment_tools\model_optimizer\mo\pipeline\common.py", line 223, in prepare_emit_ir
serialize_constants(graph, bin_file)
File "C:\Program Files (x86)\IntelSWTools\openvino\deployment_tools\model_optimizer\mo\back\ie_ir_ver_2\emitter.py", line 43, in serialize_constants
with open(bin_file_name, 'wb') as bin_file:
PermissionError: [Errno 13] Permission denied: 'C:\\Program Files (x86)\\IntelSWTools\\openvino\\deployment_tools\\model_optimizer\\.\\frozen_inference_graph.bin'
[ ERROR ] ---------------- END OF BUG REPORT --------------
[ ERROR ] -------------------------------------------------
Can anyone help me? We want to run Faster R-CNN on Intel's Neural Compute Stick 2.
Posted on 2020-08-07 17:52:43
This is a write-permission problem on the directory where the model is being converted. Since you are on Windows, I suggest two approaches:

Open a Command Prompt and run it as Administrator.
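The underlying cause is that Model Optimizer, by default, writes the generated `.bin`/`.xml` files next to its own scripts under `C:\Program Files (x86)`, which a non-elevated user cannot write to. Besides elevating the prompt, pointing the output at a writable folder via Model Optimizer's `-o`/`--output_dir` flag also sidesteps the error. A minimal sketch of that check (the helper name and the fallback path are illustrative, not from the answer):

```python
import os
import tempfile

def pick_writable_output_dir(preferred: str) -> str:
    """Return `preferred` if the current user can write there, otherwise
    create and return a user-writable fallback directory.
    `preferred` mirrors the Program Files path from the traceback."""
    if os.path.isdir(preferred) and os.access(preferred, os.W_OK):
        return preferred
    # Fall back to a directory the current user can always write to.
    fallback = os.path.join(tempfile.gettempdir(), "mo_output")
    os.makedirs(fallback, exist_ok=True)
    return fallback

out_dir = pick_writable_output_dir(
    r"C:\Program Files (x86)\IntelSWTools\openvino"
    r"\deployment_tools\model_optimizer"
)
# Pass the writable directory to Model Optimizer, e.g.:
#   python mo_tf.py --input_model frozen_inference_graph.pb -o <out_dir> ...
```

With `-o` pointing at `out_dir`, the `frozen_inference_graph.bin`/`.xml` files land in a directory you own, so no elevation is needed.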
https://stackoverflow.com/questions/63168524