Build IR FE with plugins (#7593)

* Build IR FE with plugins

* Add paddlepaddle
Ilya Lavrenov 2021-09-23 10:41:33 +03:00 committed by GitHub
parent e253b5931c
commit bd29f64570
3 changed files with 14 additions and 10 deletions


@@ -89,8 +89,8 @@ function(ie_add_plugin)
         # fake dependencies to build in the following order:
         # IE -> IE readers -> IE inference plugins -> IE-based apps
-        if(TARGET inference_engine_ir_reader)
-            add_dependencies(${IE_PLUGIN_NAME} inference_engine_ir_reader)
+        if(TARGET ir_ngraph_frontend)
+            add_dependencies(${IE_PLUGIN_NAME} ir_ngraph_frontend)
         endif()
         if(TARGET inference_engine_ir_v7_reader)
             add_dependencies(${IE_PLUGIN_NAME} inference_engine_ir_v7_reader)


@@ -38,12 +38,14 @@ This library contains the classes to:

 ### Plugin Libraries to Read a Network Object

-Starting from 2020.4 release, Inference Engine introduced a concept of `CNNNetwork` reader plugins. Such plugins can be automatically dynamically loaded by Inference Engine in runtime depending on file format:
+Starting from 2022.1 release, OpenVINO Runtime introduced a concept of frontend plugins. Such plugins can be loaded automatically by OpenVINO Runtime at runtime, depending on the file format:
 * Linux* OS:
-    - `libinference_engine_ir_reader.so` to read a network from IR
-    - `onnx_ngraph_frontend.so` to read a network from ONNX model format
+    - `libir_ngraph_frontend.so` to read a network from IR
+    - `libpaddlepaddle_ngraph_frontend.so` to read a network from PaddlePaddle model format
+    - `libonnx_ngraph_frontend.so` to read a network from ONNX model format
 * Windows* OS:
-    - `inference_engine_ir_reader.dll` to read a network from IR
+    - `ir_ngraph_frontend.dll` to read a network from IR
+    - `paddlepaddle_ngraph_frontend.dll` to read a network from PaddlePaddle model format
     - `onnx_ngraph_frontend.dll` to read a network from ONNX model format

 ### Device-Specific Plugin Libraries
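
To illustrate the behavior the new text describes, here is a minimal C++ sketch against the stable `ov::Core` API that shipped with 2022.1 (the file names are placeholders, not part of this commit): each `read_model` call dispatches at runtime to whichever of the frontend libraries listed above handles the format.

#include <openvino/runtime/core.hpp>

int main() {
    ov::Core core;
    // Placeholder model paths; each call loads the matching frontend plugin.
    auto ir_model   = core.read_model("model.xml");      // IR frontend
    auto onnx_model = core.read_model("model.onnx");     // ONNX frontend
    auto pdpd_model = core.read_model("model.pdmodel");  // PaddlePaddle frontend
}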


@@ -43,12 +43,14 @@ This library contains the classes to:

 ### Plugin Libraries to read a network object ###

-Starting from 2020.4 release, Inference Engine introduced a concept of `CNNNetwork` reader plugins. Such plugins can be automatically dynamically loaded by Inference Engine in runtime depending on file format:
+Starting from 2022.1 release, OpenVINO Runtime introduced a concept of frontend plugins. Such plugins can be loaded automatically by OpenVINO Runtime at runtime, depending on the file format:
 * Unix* OS:
-    - `libinference_engine_ir_reader.so` to read a network from IR
-    - `onnx_ngraph_frontend.so` to read a network from ONNX model format
+    - `libir_ngraph_frontend.so` to read a network from IR
+    - `libpaddlepaddle_ngraph_frontend.so` to read a network from PaddlePaddle model format
+    - `libonnx_ngraph_frontend.so` to read a network from ONNX model format
 * Windows* OS:
-    - `inference_engine_ir_reader.dll` to read a network from IR
+    - `ir_ngraph_frontend.dll` to read a network from IR
+    - `paddlepaddle_ngraph_frontend.dll` to read a network from PaddlePaddle model format
     - `onnx_ngraph_frontend.dll` to read a network from ONNX model format

 ### Device-specific Plugin Libraries ###
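
The set of frontends actually discovered can also be queried. A hedged sketch using the frontend manager (shown with the 2022.1 `ov::frontend::FrontEndManager` naming; at the time of this commit the class still lived under `ngraph::frontend`), which lists the frontend plugins found at runtime:

#include <openvino/frontend/manager.hpp>
#include <iostream>

int main() {
    ov::frontend::FrontEndManager manager;
    // Prints the names of frontend plugins discovered at runtime,
    // e.g. "ir", "onnx", "paddle" once the libraries above are present.
    for (const auto& name : manager.get_available_front_ends())
        std::cout << name << '\n';
}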