Removed inference_engine, ngraph, transformations libraries (#9309)

Ilya Churaev 2021-12-21 05:27:53 +03:00 committed by GitHub
parent 3c93c3e766
commit b5238e55e1
15 changed files with 31 additions and 49 deletions

View File

@@ -23,11 +23,11 @@ Inference Engine uses a plugin architecture. Inference Engine plugin is a softwa
Your application must link to the core Inference Engine libraries:
* Linux* OS:
- `libinference_engine.so`, which depends on `libinference_engine_transformations.so`, `libtbb.so`, `libtbbmalloc.so` and `libngraph.so`
- `libov_runtime.so`, which depends on `libtbb.so`, `libtbbmalloc.so`
* Windows* OS:
- `inference_engine.dll`, which depends on `inference_engine_transformations.dll`, `tbb.dll`, `tbbmalloc.dll` and `ngraph.dll`
- `ov_runtime.dll`, which depends on `tbb.dll`, `tbbmalloc.dll`
* macOS*:
- `libinference_engine.dylib`, which depends on `libinference_engine_transformations.dylib`, `libtbb.dylib`, `libtbbmalloc.dylib` and `libngraph.dylib`
- `libov_runtime.dylib`, which depends on `libtbb.dylib`, `libtbbmalloc.dylib`
The required C++ header files are located in the `include` directory.
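The platform-specific naming above (a `lib` prefix and a `.so`/`.dll`/`.dylib` suffix around the common base name `ov_runtime`) can be sketched in Python. `runtime_library_name` is a hypothetical helper for illustration, not part of OpenVINO:

```python
import platform

def runtime_library_name(base="ov_runtime", system=None):
    """Return the platform-specific file name of a shared library.

    `base` is the library's base name; `system` defaults to the current OS
    (values as returned by platform.system()).
    """
    system = system or platform.system()
    if system == "Windows":
        return f"{base}.dll"        # e.g. ov_runtime.dll
    if system == "Darwin":
        return f"lib{base}.dylib"   # e.g. libov_runtime.dylib
    return f"lib{base}.so"          # Linux and other ELF platforms

print(runtime_library_name(system="Linux"))  # libov_runtime.so
```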
@@ -65,8 +65,8 @@ The table below shows the plugin libraries and additional dependencies for Linux
| Plugin | Library name for Linux | Dependency libraries for Linux | Library name for Windows | Dependency libraries for Windows | Library name for macOS | Dependency libraries for macOS |
|--------|-----------------------------|-------------------------------------------------------------|--------------------------|--------------------------------------------------------------------------------------------------------|------------------------------|---------------------------------------------|
| CPU | `libMKLDNNPlugin.so` | `libinference_engine_lp_transformations.so` | `MKLDNNPlugin.dll` | `inference_engine_lp_transformations.dll` | `libMKLDNNPlugin.so` | `inference_engine_lp_transformations.dylib` |
| GPU | `libov_intel_gpu_plugin.so` | `libinference_engine_lp_transformations.so`, `libOpenCL.so` | `ov_intel_gpu_plugin.dll` | `OpenCL.dll`, `inference_engine_lp_transformations.dll` | Is not supported | - |
| CPU | `libMKLDNNPlugin.so` | | `MKLDNNPlugin.dll` | | `libMKLDNNPlugin.so` | |
| GPU | `libov_intel_gpu_plugin.so` | `libOpenCL.so` | `ov_intel_gpu_plugin.dll` | `OpenCL.dll` | Is not supported | - |
| MYRIAD | `libmyriadPlugin.so` | `libusb.so`, | `myriadPlugin.dll` | `usb.dll` | `libmyriadPlugin.so` | `libusb.dylib` |
| HDDL | `libHDDLPlugin.so` | `libbsl.so`, `libhddlapi.so`, `libmvnc-hddl.so` | `HDDLPlugin.dll` | `bsl.dll`, `hddlapi.dll`, `json-c.dll`, `libcrypto-1_1-x64.dll`, `libssl-1_1-x64.dll`, `mvnc-hddl.dll` | Is not supported | - |
| GNA | `libov_intel_gna_plugin.so` | `libgna.so`, | `ov_intel_gna_plugin.dll` | `gna.dll` | Is not supported | - |

View File

@@ -7,7 +7,7 @@ for example of using the Inference Engine in applications.
## Use the Inference Engine API in Your Code
The core `libinference_engine.so` library implements loading and parsing a model Intermediate Representation (IR), and triggers inference using a specified device. The core library has the following API:
The core `libov_runtime.so` library implements loading and parsing a model Intermediate Representation (IR), and triggers inference using a specified device. The core library has the following API:
* `InferenceEngine::Core`
* `InferenceEngine::Blob`, `InferenceEngine::TBlob`,

View File

@@ -28,11 +28,11 @@ Modules in the Inference Engine component
Your application must link to the core Inference Engine libraries:
* Linux* OS:
- `libinference_engine.so`, which depends on `libinference_engine_transformations.so`, `libtbb.so`, `libtbbmalloc.so` and `libngraph.so`
- `libov_runtime.so`, which depends on `libtbb.so`, `libtbbmalloc.so`
* Windows* OS:
- `inference_engine.dll`, which depends on `inference_engine_transformations.dll`, `tbb.dll`, `tbbmalloc.dll` and `ngraph.dll`
- `ov_runtime.dll`, which depends on `tbb.dll`, `tbbmalloc.dll`
* macOS*:
- `libinference_engine.dylib`, which depends on `libinference_engine_transformations.dylib`, `libtbb.dylib`, `libtbbmalloc.dylib` and `libngraph.dylib`
- `libov_runtime.dylib`, which depends on `libtbb.dylib`, `libtbbmalloc.dylib`
The required C++ header files are located in the `include` directory.
@@ -70,8 +70,8 @@ The table below shows the plugin libraries and additional dependencies for Linux
| Plugin | Library name for Linux | Dependency libraries for Linux | Library name for Windows | Dependency libraries for Windows | Library name for macOS | Dependency libraries for macOS |
|--------|-----------------------------|-------------------------------------------------------------|--------------------------|--------------------------------------------------------------------------------------------------------|------------------------------|---------------------------------------------|
| CPU | `libMKLDNNPlugin.so` | `libinference_engine_lp_transformations.so` | `MKLDNNPlugin.dll` | `inference_engine_lp_transformations.dll` | `libMKLDNNPlugin.so` | `inference_engine_lp_transformations.dylib` |
| GPU | `libov_intel_gpu_plugin.so` | `libinference_engine_lp_transformations.so`, `libOpenCL.so` | `ov_intel_gpu_plugin.dll` | `OpenCL.dll`, `inference_engine_lp_transformations.dll` | Is not supported | - |
| CPU | `libMKLDNNPlugin.so` | | `MKLDNNPlugin.dll` | | `libMKLDNNPlugin.so` | |
| GPU | `libov_intel_gpu_plugin.so` | `libOpenCL.so` | `ov_intel_gpu_plugin.dll` | `OpenCL.dll` | Is not supported | - |
| MYRIAD | `libmyriadPlugin.so` | `libusb.so`, | `myriadPlugin.dll` | `usb.dll` | `libmyriadPlugin.so` | `libusb.dylib` |
| HDDL | `libHDDLPlugin.so` | `libbsl.so`, `libhddlapi.so`, `libmvnc-hddl.so` | `HDDLPlugin.dll` | `bsl.dll`, `hddlapi.dll`, `json-c.dll`, `libcrypto-1_1-x64.dll`, `libssl-1_1-x64.dll`, `mvnc-hddl.dll` | Is not supported | - |
| GNA | `libov_intel_gna_plugin.so` | `libgna.so`, | `ov_intel_gna_plugin.dll` | `gna.dll` | Is not supported | - |

View File

@@ -69,7 +69,7 @@ PREDEFINED = "INFERENCE_ENGINE_API=" \
"OPENVINO_CORE_EXPORTS=" \
"INFERENCE_ENGINE_DEPRECATED=" \
"OPENVINO_DEPRECATED=" \
"inference_engine_transformations_EXPORTS" \
"IMPLEMENT_OPENVINO_API" \
"TRANSFORMATIONS_API=" \
"NGRAPH_HELPER_DLL_EXPORT=" \
"NGRAPH_HELPER_DLL_IMPORT=" \

View File

@@ -10,8 +10,7 @@ Before creating a transformation, do the following:
* Understand where to put your transformation code
### Transformation Library Structure
Transformation library is independent from Inference Engine target library named as `inference_engine_transformations`
and is located in the `src/common/transformations` directory.
OpenVINO transformations are located in the `src/common/transformations` directory.
Transformations root directory contains two folders:
* `ngraph_ops` - Contains internal opset operations that are common for plugins.

View File

@@ -1,6 +1,6 @@
# Copyright (C) 2018-2021 Intel Corporation
# SPDX-License-Identifier: Apache-2.0
# ngraph.dll directory path visibility is needed to use _pyngraph module
# ov_runtime.dll directory path visibility is needed to use _pyngraph module
# import below causes adding this path to os.environ["PATH"]
import openvino # noqa: F401 'imported but unused'
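The side effect the comment above relies on can be sketched as a hypothetical helper that prepends a DLL directory to `PATH`, which is roughly what importing the package does on Windows (on Python 3.8+, `os.add_dll_directory` would be the preferred mechanism):

```python
import os

def expose_dll_directory(dll_dir, env=None):
    """Prepend dll_dir to PATH so the loader can resolve DLLs placed there.

    Hypothetical sketch, not OpenVINO's actual code; `env` defaults to
    os.environ and is injectable for testing.
    """
    env = os.environ if env is None else env
    current = env.get("PATH", "")
    # Only prepend if the directory is not already a PATH entry.
    if dll_dir not in current.split(os.pathsep):
        env["PATH"] = dll_dir + os.pathsep + current if current else dll_dir
    return env["PATH"]
```

A second call with the same directory is a no-op, so repeated imports do not grow `PATH`.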

View File

@@ -1,6 +1,6 @@
# Copyright (C) 2018-2021 Intel Corporation
# SPDX-License-Identifier: Apache-2.0
# ngraph.dll directory path visibility is needed to use _pyngraph module
# ov_runtime.dll directory path visibility is needed to use _pyngraph module
# import below causes adding this path to os.environ["PATH"]
import ngraph # noqa: F401 'imported but unused'

View File

@@ -18,5 +18,5 @@
# define LP_TRANSFORMATIONS_API OPENVINO_CORE_EXPORTS
# else
# define LP_TRANSFORMATIONS_API OPENVINO_CORE_IMPORTS
# endif // inference_engine_lp_transformations_EXPORTS
# endif // IMPLEMENT_OPENVINO_API
#endif // OPENVINO_STATIC_LIBRARY

View File

@@ -37,5 +37,5 @@
# define TRANSFORMATIONS_API OPENVINO_CORE_EXPORTS
# else
# define TRANSFORMATIONS_API OPENVINO_CORE_IMPORTS
# endif // inference_engine_transformations_EXPORTS
# endif // IMPLEMENT_OPENVINO_API
#endif // OPENVINO_STATIC_LIBRARY

View File

@@ -125,7 +125,7 @@ def test_verify(test_id, prepared_models, openvino_ref, artifacts, tolerance=1e-
@pytest.mark.dependency(depends=["cc_collect", "minimized_pkg"])
def test_libs_size(test_id, models, openvino_ref, artifacts): # pylint: disable=unused-argument
"""Test if libraries haven't increased in size after conditional compilation."""
libraries = ["inference_engine_transformations", "MKLDNNPlugin", "ngraph"]
libraries = ["ov_runtime", "MKLDNNPlugin"]
minimized_pkg = artifacts / test_id / "install_pkg"
ref_libs_size = get_lib_sizes(openvino_ref, libraries)
lib_sizes = get_lib_sizes(minimized_pkg, libraries)
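A minimal sketch of the size check above, assuming `get_lib_sizes` walks an install tree and returns a name-to-bytes mapping; the `resolve` callback and the comparison helper here are illustrative, not the repository's actual implementation:

```python
from pathlib import Path

def get_lib_sizes(install_root, libraries,
                  resolve=lambda name: Path(f"lib{name}.so")):
    """Map each library name to its binary's on-disk size under install_root.

    `resolve` stands in for the repository's get_lib_path helper: it maps a
    library name to its relative path inside the install tree.
    """
    root = Path(install_root)
    return {name: (root / resolve(name)).stat().st_size for name in libraries}

def check_libs_not_bigger(ref_sizes, new_sizes, tolerance=1e-6):
    """Assert no library grew beyond its reference size plus a tolerance."""
    for name, ref in ref_sizes.items():
        assert new_sizes[name] <= ref * (1 + tolerance), f"{name} grew in size"
```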

View File

@@ -75,7 +75,7 @@ To build coverage report after fuzz test execution run:
```
llvm-profdata merge -sparse *.profraw -o default.profdata && \
llvm-cov show ./read_network-fuzzer -object=lib/libinference_engine.so -instr-profile=default.profdata -format=html -output-dir=read_network-coverage
llvm-cov show ./read_network-fuzzer -object=lib/libov_runtime.so -instr-profile=default.profdata -format=html -output-dir=read_network-coverage
```
## Reproducing findings

View File

@@ -36,15 +36,12 @@ def get_lib_path(lib_name):
"""Function for getting absolute path in OpenVINO directory to specific lib"""
os_name = get_os_name()
all_libs = {
'inference_engine_transformations': {
'Windows': Path('runtime/bin/intel64/Release/inference_engine_transformations.dll'),
'Linux': Path('runtime/lib/intel64/libinference_engine_transformations.so')},
'MKLDNNPlugin': {
'Windows': Path('runtime/bin/intel64/Release/MKLDNNPlugin.dll'),
'Linux': Path('runtime/lib/intel64/libMKLDNNPlugin.so')},
'ngraph': {
'Windows': Path('runtime/bin/intel64/Release/ngraph.dll'),
'Linux': Path('runtime/lib/intel64/libngraph.so')}
'ov_runtime': {
'Windows': Path('runtime/bin/intel64/Release/ov_runtime.dll'),
'Linux': Path('runtime/lib/intel64/libov_runtime.so')}
}
return all_libs[lib_name][os_name]

View File

@@ -16,14 +16,11 @@
"ie_core": {
"group": ["ie"],
"files": [
"runtime/lib/intel64/libinference_engine.dylib",
"runtime/lib/intel64/libinference_engine_transformations.dylib",
"runtime/lib/intel64/libov_runtime.dylib",
"runtime/lib/intel64/libinference_engine_preproc.so",
"runtime/lib/intel64/libinference_engine_c_api.dylib",
"runtime/lib/intel64/libov_hetero_plugin.so",
"runtime/lib/intel64/libov_auto_plugin.so",
"runtime/lib/intel64/libngraph.dylib",
"runtime/lib/intel64/libfrontend_common.dylib",
"runtime/lib/intel64/libov_ir_frontend.dylib",
"runtime/lib/intel64/libov_onnx_frontend.dylib",
"runtime/lib/intel64/libov_paddlepaddle_frontend.dylib",
@@ -36,7 +33,6 @@
"group": ["ie"],
"dependencies" : ["ie_core"],
"files": [
"runtime/lib/intel64/libinference_engine_lp_transformations.dylib",
"runtime/lib/intel64/libMKLDNNPlugin.so"
]
},

View File

@@ -22,14 +22,11 @@
"ie_core": {
"group": ["ie"],
"files": [
"runtime/lib/intel64/libinference_engine.so",
"runtime/lib/intel64/libinference_engine_transformations.so",
"runtime/lib/intel64/libov_runtime.so",
"runtime/lib/intel64/libinference_engine_preproc.so",
"runtime/lib/intel64/libinference_engine_c_api.so",
"runtime/lib/intel64/libov_hetero_plugin.so",
"runtime/lib/intel64/libov_auto_plugin.so",
"runtime/lib/intel64/libngraph.so",
"runtime/lib/intel64/libfrontend_common.so",
"runtime/lib/intel64/libov_ir_frontend.so",
"runtime/lib/intel64/libov_onnx_frontend.so",
"runtime/lib/intel64/libov_paddlepaddle_frontend.so",
@@ -42,7 +39,6 @@
"group": ["ie"],
"dependencies" : ["ie_core"],
"files": [
"runtime/lib/intel64/libinference_engine_lp_transformations.so",
"runtime/lib/intel64/libMKLDNNPlugin.so"
]
},
@@ -53,7 +49,6 @@
"files": [
"runtime/lib/intel64/cache.json",
"runtime/lib/intel64/libov_intel_gpu_plugin.so",
"runtime/lib/intel64/libinference_engine_lp_transformations.so",
"install_dependencies/install_NEO_OCL_driver.sh"
]
},

View File

@@ -16,14 +16,11 @@
"ie_core": {
"group": ["ie"],
"files": [
"runtime/bin/intel64/Release/inference_engine.dll",
"runtime/bin/intel64/Release/inference_engine_transformations.dll",
"runtime/bin/intel64/Release/ov_runtime.dll",
"runtime/bin/intel64/Release/inference_engine_preproc.dll",
"runtime/bin/intel64/Release/inference_engine_c_api.dll",
"runtime/bin/intel64/Release/ov_hetero_plugin.dll",
"runtime/bin/intel64/Release/ov_auto_plugin.dll",
"runtime/bin/intel64/Release/ngraph.dll",
"runtime/bin/intel64/Release/frontend_common.dll",
"runtime/bin/intel64/Release/ov_ir_frontend.dll",
"runtime/bin/intel64/Release/ov_onnx_frontend.dll",
"runtime/bin/intel64/Release/ov_paddlepaddle_frontend.dll",
@@ -36,7 +33,6 @@
"group": ["ie"],
"dependencies" : ["ie_core"],
"files": [
"runtime/bin/intel64/Release/inference_engine_lp_transformations.dll",
"runtime/bin/intel64/Release/MKLDNNPlugin.dll"
]
},
@@ -46,7 +42,6 @@
"dependencies" : ["ie_core"],
"files": [
"runtime/bin/intel64/Release/cache.json",
"runtime/bin/intel64/Release/inference_engine_lp_transformations.dll",
"runtime/bin/intel64/Release/ov_intel_gpu_plugin.dll"
]
},