diff --git a/docs/IE_DG/Deep_Learning_Inference_Engine_DevGuide.md b/docs/IE_DG/Deep_Learning_Inference_Engine_DevGuide.md
index 8297a16d1f5..9381122313d 100644
--- a/docs/IE_DG/Deep_Learning_Inference_Engine_DevGuide.md
+++ b/docs/IE_DG/Deep_Learning_Inference_Engine_DevGuide.md
@@ -23,11 +23,11 @@ Inference Engine uses a plugin architecture. Inference Engine plugin is a softwa
 
 Your application must link to the core Inference Engine libraries:
 * Linux* OS:
-    - `libinference_engine.so`, which depends on `libinference_engine_transformations.so`, `libtbb.so`, `libtbbmalloc.so` and `libngraph.so`
+    - `libov_runtime.so`, which depends on `libtbb.so`, `libtbbmalloc.so`
 * Windows* OS:
-    - `inference_engine.dll`, which depends on `inference_engine_transformations.dll`, `tbb.dll`, `tbbmalloc.dll` and `ngraph.dll`
+    - `ov_runtime.dll`, which depends on `tbb.dll`, `tbbmalloc.dll`
 * macOS*:
-    - `libinference_engine.dylib`, which depends on `libinference_engine_transformations.dylib`, `libtbb.dylib`, `libtbbmalloc.dylib` and `libngraph.dylib`
+    - `libov_runtime.dylib`, which depends on `libtbb.dylib`, `libtbbmalloc.dylib`
 
 The required C++ header files are located in the `include` directory.
 
@@ -65,8 +65,8 @@ The table below shows the plugin libraries and additional dependencies for Linux
 
 | Plugin | Library name for Linux | Dependency libraries for Linux | Library name for Windows | Dependency libraries for Windows | Library name for macOS | Dependency libraries for macOS |
 |--------|-----------------------------|-------------------------------------------------------------|--------------------------|--------------------------------------------------------------------------------------------------------|------------------------------|---------------------------------------------|
-| CPU | `libMKLDNNPlugin.so` | `libinference_engine_lp_transformations.so` | `MKLDNNPlugin.dll` | `inference_engine_lp_transformations.dll` | `libMKLDNNPlugin.so` | `inference_engine_lp_transformations.dylib` |
-| GPU | `libov_intel_gpu_plugin.so` | `libinference_engine_lp_transformations.so`, `libOpenCL.so` | `ov_intel_gpu_plugin.dll` | `OpenCL.dll`, `inference_engine_lp_transformations.dll` | Is not supported | - |
+| CPU | `libMKLDNNPlugin.so` | | `MKLDNNPlugin.dll` | | `libMKLDNNPlugin.so` | |
+| GPU | `libov_intel_gpu_plugin.so` | `libOpenCL.so` | `ov_intel_gpu_plugin.dll` | `OpenCL.dll` | Is not supported | - |
 | MYRIAD | `libmyriadPlugin.so` | `libusb.so`, | `myriadPlugin.dll` | `usb.dll` | `libmyriadPlugin.so` | `libusb.dylib` |
 | HDDL | `libHDDLPlugin.so` | `libbsl.so`, `libhddlapi.so`, `libmvnc-hddl.so` | `HDDLPlugin.dll` | `bsl.dll`, `hddlapi.dll`, `json-c.dll`, `libcrypto-1_1-x64.dll`, `libssl-1_1-x64.dll`, `mvnc-hddl.dll` | Is not supported | - |
 | GNA | `libov_intel_gna_plugin.so` | `libgna.so`, | `ov_intel_gna_plugin.dll` | `gna.dll` | Is not supported | - |
diff --git a/docs/IE_DG/Integrate_with_customer_application_new_API.md b/docs/IE_DG/Integrate_with_customer_application_new_API.md
index 4d543a9e891..b7d3b69735b 100644
--- a/docs/IE_DG/Integrate_with_customer_application_new_API.md
+++ b/docs/IE_DG/Integrate_with_customer_application_new_API.md
@@ -7,7 +7,7 @@ for example of using the Inference Engine in applications.
 
 ## Use the Inference Engine API in Your Code
 
-The core `libinference_engine.so` library implements loading and parsing a model Intermediate Representation (IR), and triggers inference using a specified device. The core library has the following API:
+The core `libov_runtime.so` library implements loading and parsing a model Intermediate Representation (IR), and triggers inference using a specified device. The core library has the following API:
 
 * `InferenceEngine::Core`
 * `InferenceEngine::Blob`, `InferenceEngine::TBlob`,
diff --git a/docs/IE_DG/inference_engine_intro.md b/docs/IE_DG/inference_engine_intro.md
index f0fae908d51..cc6ae5254c0 100644
--- a/docs/IE_DG/inference_engine_intro.md
+++ b/docs/IE_DG/inference_engine_intro.md
@@ -28,11 +28,11 @@ Modules in the Inference Engine component
 
 Your application must link to the core Inference Engine libraries:
 * Linux* OS:
-    - `libinference_engine.so`, which depends on `libinference_engine_transformations.so`, `libtbb.so`, `libtbbmalloc.so` and `libngraph.so`
+    - `libov_runtime.so`, which depends on `libtbb.so`, `libtbbmalloc.so`
 * Windows* OS:
-    - `inference_engine.dll`, which depends on `inference_engine_transformations.dll`, `tbb.dll`, `tbbmalloc.dll` and `ngraph.dll`
+    - `ov_runtime.dll`, which depends on `tbb.dll`, `tbbmalloc.dll`
 * macOS*:
-    - `libinference_engine.dylib`, which depends on `libinference_engine_transformations.dylib`, `libtbb.dylib`, `libtbbmalloc.dylib` and `libngraph.dylib`
+    - `libov_runtime.dylib`, which depends on `libtbb.dylib`, `libtbbmalloc.dylib`
 
 The required C++ header files are located in the `include` directory.
 
@@ -70,8 +70,8 @@ The table below shows the plugin libraries and additional dependencies for Linux
 
 | Plugin | Library name for Linux | Dependency libraries for Linux | Library name for Windows | Dependency libraries for Windows | Library name for macOS | Dependency libraries for macOS |
 |--------|-----------------------------|-------------------------------------------------------------|--------------------------|--------------------------------------------------------------------------------------------------------|------------------------------|---------------------------------------------|
-| CPU | `libMKLDNNPlugin.so` | `libinference_engine_lp_transformations.so` | `MKLDNNPlugin.dll` | `inference_engine_lp_transformations.dll` | `libMKLDNNPlugin.so` | `inference_engine_lp_transformations.dylib` |
-| GPU | `libov_intel_gpu_plugin.so` | `libinference_engine_lp_transformations.so`, `libOpenCL.so` | `ov_intel_gpu_plugin.dll` | `OpenCL.dll`, `inference_engine_lp_transformations.dll` | Is not supported | - |
+| CPU | `libMKLDNNPlugin.so` | | `MKLDNNPlugin.dll` | | `libMKLDNNPlugin.so` | |
+| GPU | `libov_intel_gpu_plugin.so` | `libOpenCL.so` | `ov_intel_gpu_plugin.dll` | `OpenCL.dll` | Is not supported | - |
 | MYRIAD | `libmyriadPlugin.so` | `libusb.so`, | `myriadPlugin.dll` | `usb.dll` | `libmyriadPlugin.so` | `libusb.dylib` |
 | HDDL | `libHDDLPlugin.so` | `libbsl.so`, `libhddlapi.so`, `libmvnc-hddl.so` | `HDDLPlugin.dll` | `bsl.dll`, `hddlapi.dll`, `json-c.dll`, `libcrypto-1_1-x64.dll`, `libssl-1_1-x64.dll`, `mvnc-hddl.dll` | Is not supported | - |
 | GNA | `libov_intel_gna_plugin.so` | `libgna.so`, | `ov_intel_gna_plugin.dll` | `gna.dll` | Is not supported | - |
diff --git a/docs/doxygen/ie_plugin_api.config b/docs/doxygen/ie_plugin_api.config
index b6c72a139e6..4304ffaa9a4 100644
--- a/docs/doxygen/ie_plugin_api.config
+++ b/docs/doxygen/ie_plugin_api.config
@@ -69,7 +69,7 @@ PREDEFINED = "INFERENCE_ENGINE_API=" \
                          "OPENVINO_CORE_EXPORTS=" \
                          "INFERENCE_ENGINE_DEPRECATED=" \
                          "OPENVINO_DEPRECATED=" \
-                         "inference_engine_transformations_EXPORTS" \
+                         "IMPLEMENT_OPENVINO_API" \
                          "TRANSFORMATIONS_API=" \
                          "NGRAPH_HELPER_DLL_EXPORT=" \
                          "NGRAPH_HELPER_DLL_IMPORT=" \
diff --git a/docs/nGraph_DG/nGraphTransformation.md b/docs/nGraph_DG/nGraphTransformation.md
index b27a2876b98..2dd3e0d73ad 100644
--- a/docs/nGraph_DG/nGraphTransformation.md
+++ b/docs/nGraph_DG/nGraphTransformation.md
@@ -10,8 +10,7 @@ Before creating a transformation, do the following:
 * Understand where to put your transformation code
 
 ### Transformation Library Structure
-Transformation library is independent from Inference Engine target library named as `inference_engine_transformations`
-and is located in the `src/common/transformations` directory.
+OpenVINO transformations are located in the `src/common/transformations` directory.
 
 Transformations root directory contains two folders:
 * `ngraph_ops` - Contains internal opset operations that are common for plugins.
diff --git a/src/bindings/python/tests/test_ngraph/__init__.py b/src/bindings/python/tests/test_ngraph/__init__.py
index 4417c13d097..127e415364b 100644
--- a/src/bindings/python/tests/test_ngraph/__init__.py
+++ b/src/bindings/python/tests/test_ngraph/__init__.py
@@ -1,6 +1,6 @@
 # Copyright (C) 2018-2021 Intel Corporation
 # SPDX-License-Identifier: Apache-2.0
 
-# ngraph.dll directory path visibility is needed to use _pyngraph module
+# ov_runtime.dll directory path visibility is needed to use _pyngraph module
 # import below causes adding this path to os.environ["PATH"]
 import openvino  # noqa: F401 'imported but unused'
diff --git a/src/bindings/python/tests_compatibility/test_ngraph/__init__.py b/src/bindings/python/tests_compatibility/test_ngraph/__init__.py
index b274453fb17..e5dacbda431 100644
--- a/src/bindings/python/tests_compatibility/test_ngraph/__init__.py
+++ b/src/bindings/python/tests_compatibility/test_ngraph/__init__.py
@@ -1,6 +1,6 @@
 # Copyright (C) 2018-2021 Intel Corporation
 # SPDX-License-Identifier: Apache-2.0
 
-# ngraph.dll directory path visibility is needed to use _pyngraph module
+# ov_runtime.dll directory path visibility is needed to use _pyngraph module
 # import below causes adding this path to os.environ["PATH"]
 import ngraph  # noqa: F401 'imported but unused'
diff --git a/src/common/low_precision_transformations/include/low_precision/lpt_visibility.hpp b/src/common/low_precision_transformations/include/low_precision/lpt_visibility.hpp
index b000b5fc56f..913f9b06548 100644
--- a/src/common/low_precision_transformations/include/low_precision/lpt_visibility.hpp
+++ b/src/common/low_precision_transformations/include/low_precision/lpt_visibility.hpp
@@ -18,5 +18,5 @@
 #        define LP_TRANSFORMATIONS_API OPENVINO_CORE_EXPORTS
 #    else
 #        define LP_TRANSFORMATIONS_API OPENVINO_CORE_IMPORTS
-#    endif // inference_engine_lp_transformations_EXPORTS
+#    endif // IMPLEMENT_OPENVINO_API
 #endif // OPENVINO_STATIC_LIBRARY
diff --git a/src/common/transformations/include/transformations_visibility.hpp b/src/common/transformations/include/transformations_visibility.hpp
index 77dc8f85f84..692447a19d4 100644
--- a/src/common/transformations/include/transformations_visibility.hpp
+++ b/src/common/transformations/include/transformations_visibility.hpp
@@ -31,11 +31,11 @@
  */
 
 #ifdef OPENVINO_STATIC_LIBRARY
-#define TRANSFORMATIONS_API
+#    define TRANSFORMATIONS_API
 #else
-#ifdef IMPLEMENT_OPENVINO_API
-#define TRANSFORMATIONS_API OPENVINO_CORE_EXPORTS
-#else
-#define TRANSFORMATIONS_API OPENVINO_CORE_IMPORTS
-#endif // inference_engine_transformations_EXPORTS
-#endif // OPENVINO_STATIC_LIBRARY
+#    ifdef IMPLEMENT_OPENVINO_API
+#        define TRANSFORMATIONS_API OPENVINO_CORE_EXPORTS
+#    else
+#        define TRANSFORMATIONS_API OPENVINO_CORE_IMPORTS
+#    endif // IMPLEMENT_OPENVINO_API
+#endif // OPENVINO_STATIC_LIBRARY
diff --git a/tests/conditional_compilation/test_cc.py b/tests/conditional_compilation/test_cc.py
index f42a4a1a877..48ebc7c9294 100644
--- a/tests/conditional_compilation/test_cc.py
+++ b/tests/conditional_compilation/test_cc.py
@@ -125,7 +125,7 @@ def test_verify(test_id, prepared_models, openvino_ref, artifacts, tolerance=1e-
 @pytest.mark.dependency(depends=["cc_collect", "minimized_pkg"])
 def test_libs_size(test_id, models, openvino_ref, artifacts):  # pylint: disable=unused-argument
     """Test if libraries haven't increased in size after conditional compilation."""
-    libraries = ["inference_engine_transformations", "MKLDNNPlugin", "ngraph"]
+    libraries = ["ov_runtime", "MKLDNNPlugin"]
     minimized_pkg = artifacts / test_id / "install_pkg"
     ref_libs_size = get_lib_sizes(openvino_ref, libraries)
     lib_sizes = get_lib_sizes(minimized_pkg, libraries)
diff --git a/tests/fuzz/README.md b/tests/fuzz/README.md
index 97a3bcca621..ad70db95d47 100644
--- a/tests/fuzz/README.md
+++ b/tests/fuzz/README.md
@@ -75,7 +75,7 @@ To build coverage report after fuzz test execution run:
 
 ```
 llvm-profdata merge -sparse *.profraw -o default.profdata && \
-llvm-cov show ./read_network-fuzzer -object=lib/libinference_engine.so -instr-profile=default.profdata -format=html -output-dir=read_network-coverage
+llvm-cov show ./read_network-fuzzer -object=lib/libov_runtime.so -instr-profile=default.profdata -format=html -output-dir=read_network-coverage
 ```
 
 ## Reproducing findings
diff --git a/tests/utils/path_utils.py b/tests/utils/path_utils.py
index 8e9864059ad..bd678681de5 100644
--- a/tests/utils/path_utils.py
+++ b/tests/utils/path_utils.py
@@ -36,15 +36,12 @@ def get_lib_path(lib_name):
     """Function for getting absolute path in OpenVINO directory to specific lib"""
     os_name = get_os_name()
     all_libs = {
-        'inference_engine_transformations': {
-            'Windows': Path('runtime/bin/intel64/Release/inference_engine_transformations.dll'),
-            'Linux': Path('runtime/lib/intel64/libinference_engine_transformations.so')},
         'MKLDNNPlugin': {
             'Windows': Path('runtime/bin/intel64/Release/MKLDNNPlugin.dll'),
             'Linux': Path('runtime/lib/intel64/libMKLDNNPlugin.so')},
-        'ngraph': {
-            'Windows': Path('runtime/bin/intel64/Release/ngraph.dll'),
-            'Linux': Path('runtime/lib/intel64/libngraph.so')}
+        'ov_runtime': {
+            'Windows': Path('runtime/bin/intel64/Release/ov_runtime.dll'),
+            'Linux': Path('runtime/lib/intel64/libov_runtime.so')}
     }
     return all_libs[lib_name][os_name]
 
diff --git a/tools/deployment_manager/configs/darwin.json b/tools/deployment_manager/configs/darwin.json
index 98e1c3c6b67..a8b07728099 100644
--- a/tools/deployment_manager/configs/darwin.json
+++ b/tools/deployment_manager/configs/darwin.json
@@ -16,14 +16,11 @@
     "ie_core": {
       "group": ["ie"],
       "files": [
-        "runtime/lib/intel64/libinference_engine.dylib",
-        "runtime/lib/intel64/libinference_engine_transformations.dylib",
+        "runtime/lib/intel64/libov_runtime.dylib",
         "runtime/lib/intel64/libinference_engine_preproc.so",
         "runtime/lib/intel64/libinference_engine_c_api.dylib",
         "runtime/lib/intel64/libov_hetero_plugin.so",
         "runtime/lib/intel64/libov_auto_plugin.so",
-        "runtime/lib/intel64/libngraph.dylib",
-        "runtime/lib/intel64/libfrontend_common.dylib",
         "runtime/lib/intel64/libov_ir_frontend.dylib",
         "runtime/lib/intel64/libov_onnx_frontend.dylib",
         "runtime/lib/intel64/libov_paddlepaddle_frontend.dylib",
@@ -36,7 +33,6 @@
       "group": ["ie"],
       "dependencies" : ["ie_core"],
       "files": [
-        "runtime/lib/intel64/libinference_engine_lp_transformations.dylib",
        "runtime/lib/intel64/libMKLDNNPlugin.so"
       ]
     },
diff --git a/tools/deployment_manager/configs/linux.json b/tools/deployment_manager/configs/linux.json
index 5e71b7153e9..2d17f1a788f 100644
--- a/tools/deployment_manager/configs/linux.json
+++ b/tools/deployment_manager/configs/linux.json
@@ -22,14 +22,11 @@
     "ie_core": {
       "group": ["ie"],
       "files": [
-        "runtime/lib/intel64/libinference_engine.so",
-        "runtime/lib/intel64/libinference_engine_transformations.so",
+        "runtime/lib/intel64/libov_runtime.so",
         "runtime/lib/intel64/libinference_engine_preproc.so",
         "runtime/lib/intel64/libinference_engine_c_api.so",
         "runtime/lib/intel64/libov_hetero_plugin.so",
         "runtime/lib/intel64/libov_auto_plugin.so",
-        "runtime/lib/intel64/libngraph.so",
-        "runtime/lib/intel64/libfrontend_common.so",
         "runtime/lib/intel64/libov_ir_frontend.so",
         "runtime/lib/intel64/libov_onnx_frontend.so",
         "runtime/lib/intel64/libov_paddlepaddle_frontend.so",
@@ -42,7 +39,6 @@
       "group": ["ie"],
       "dependencies" : ["ie_core"],
       "files": [
-        "runtime/lib/intel64/libinference_engine_lp_transformations.so",
        "runtime/lib/intel64/libMKLDNNPlugin.so"
       ]
     },
@@ -53,7 +49,6 @@
       "files": [
         "runtime/lib/intel64/cache.json",
         "runtime/lib/intel64/libov_intel_gpu_plugin.so",
-        "runtime/lib/intel64/libinference_engine_lp_transformations.so",
         "install_dependencies/install_NEO_OCL_driver.sh"
       ]
     },
diff --git a/tools/deployment_manager/configs/windows.json b/tools/deployment_manager/configs/windows.json
index 776a666755a..9cb8d68b379 100644
--- a/tools/deployment_manager/configs/windows.json
+++ b/tools/deployment_manager/configs/windows.json
@@ -16,14 +16,11 @@
     "ie_core": {
      "group": ["ie"],
       "files": [
-        "runtime/bin/intel64/Release/inference_engine.dll",
-        "runtime/bin/intel64/Release/inference_engine_transformations.dll",
+        "runtime/bin/intel64/Release/ov_runtime.dll",
         "runtime/bin/intel64/Release/inference_engine_preproc.dll",
         "runtime/bin/intel64/Release/inference_engine_c_api.dll",
         "runtime/bin/intel64/Release/ov_hetero_plugin.dll",
         "runtime/bin/intel64/Release/ov_auto_plugin.dll",
-        "runtime/bin/intel64/Release/ngraph.dll",
-        "runtime/bin/intel64/Release/frontend_common.dll",
         "runtime/bin/intel64/Release/ov_ir_frontend.dll",
         "runtime/bin/intel64/Release/ov_onnx_frontend.dll",
         "runtime/bin/intel64/Release/ov_paddlepaddle_frontend.dll",
@@ -36,7 +33,6 @@
       "group": ["ie"],
       "dependencies" : ["ie_core"],
       "files": [
-        "runtime/bin/intel64/Release/inference_engine_lp_transformations.dll",
        "runtime/bin/intel64/Release/MKLDNNPlugin.dll"
       ]
     },
@@ -46,7 +42,6 @@
       "dependencies" : ["ie_core"],
       "files": [
         "runtime/bin/intel64/Release/cache.json",
-        "runtime/bin/intel64/Release/inference_engine_lp_transformations.dll",
         "runtime/bin/intel64/Release/ov_intel_gpu_plugin.dll"
       ]
     },