Replaced DLDT with OpenVINO (#12580)

* Replaced DLDT with OpenVINO

* Fixed samples build dir

* Fixed build folders
Ilya Churaev 2022-08-17 12:46:07 +04:00 committed by GitHub
parent ac9a80e6c9
commit 8a027f4e42
13 changed files with 30 additions and 30 deletions
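A rename like this one is easy to leave incomplete, so it is worth sweeping the tree for leftovers of the old name. A minimal sketch (the pattern and exclusions are illustrative, not part of this commit); run from the repository root:

```shell
# List any text files that still mention the old "dldt" name
# (case-insensitive), skipping git metadata and binary files.
grep -rIl --exclude-dir=.git -i 'dldt' . || echo "no stale references found"
```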

@@ -5,15 +5,15 @@ Inference Engine build infrastructure provides the Inference Engine Developer Pa
Inference Engine Developer Package
------------------------
-To automatically generate the Inference Engine Developer Package, run the `cmake` tool during a DLDT build:
+To automatically generate the Inference Engine Developer Package, run the `cmake` tool during an OpenVINO build:
```bash
-$ mkdir dldt-release-build
-$ cd dldt-release-build
-$ cmake -DCMAKE_BUILD_TYPE=Release ../dldt
+$ mkdir openvino-release-build
+$ cd openvino-release-build
+$ cmake -DCMAKE_BUILD_TYPE=Release ../openvino
```
-Once the commands above are executed, the Inference Engine Developer Package is generated in the `dldt-release-build` folder. It consists of several files:
+Once the commands above are executed, the Inference Engine Developer Package is generated in the `openvino-release-build` folder. It consists of several files:
- `InferenceEngineDeveloperPackageConfig.cmake` - the main CMake script which imports targets and provides compilation flags and CMake options.
- `InferenceEngineDeveloperPackageConfig-version.cmake` - a file with a package version.
- `targets_developer.cmake` - an automatically generated file which contains all targets exported from the OpenVINO build tree. This file is included by `InferenceEngineDeveloperPackageConfig.cmake` to import the following targets:
@@ -46,7 +46,7 @@ To build a plugin source tree using the Inference Engine Developer Package, run
```cmake
$ mkdir template-plugin-release-build
$ cd template-plugin-release-build
-$ cmake -DInferenceEngineDeveloperPackage_DIR=../dldt-release-build ../template-plugin
+$ cmake -DInferenceEngineDeveloperPackage_DIR=../openvino-release-build ../template-plugin
```
A common plugin consists of the following components:
@@ -83,10 +83,10 @@ if(ENABLE_TESTS)
endif()
```
-> **NOTE**: The default values of the `ENABLE_TESTS`, `ENABLE_FUNCTIONAL_TESTS` options are shared via the Inference Engine Developer Package and they are the same as for the main DLDT build tree. You can override them during plugin build using the command below:
+> **NOTE**: The default values of the `ENABLE_TESTS`, `ENABLE_FUNCTIONAL_TESTS` options are shared via the Inference Engine Developer Package and they are the same as for the main OpenVINO build tree. You can override them during plugin build using the command below:
```bash
-$ cmake -DENABLE_FUNCTIONAL_TESTS=OFF -DInferenceEngineDeveloperPackage_DIR=../dldt-release-build ../template-plugin
+$ cmake -DENABLE_FUNCTIONAL_TESTS=OFF -DInferenceEngineDeveloperPackage_DIR=../openvino-release-build ../template-plugin
```
- `src/CMakeLists.txt` to build a plugin shared library from sources:

@@ -48,7 +48,7 @@ Inference Engine plugin dynamic library consists of several main components:
> **NOTE**: This documentation is written based on the `Template` plugin, which demonstrates plugin
development details. Find the complete code of the `Template`, which is fully compilable and up-to-date,
-at `<dldt source dir>/docs/template_plugin`.
+at `<openvino source dir>/docs/template_plugin`.
Detailed guides
-----------------------

@@ -44,7 +44,7 @@ To build test binaries together with other build artifacts, use the `make all` c
### How to Extend Inference Engine Plugin Tests
Inference Engine Plugin tests are open for contribution.
-Add common test case definitions applicable for all plugins to the `IE::funcSharedTests` target within the DLDT repository. Then, any other plugin supporting corresponding functionality can instantiate the new test.
+Add common test case definitions applicable for all plugins to the `IE::funcSharedTests` target within the OpenVINO repository. Then, any other plugin supporting corresponding functionality can instantiate the new test.
All Inference Engine per-layer tests check layer functionality. They are developed using ov::Model
as input graphs used by tests. In this case, to test a new layer with layer tests, extend

@@ -97,8 +97,8 @@ build_samples.sh
```
Once the build is completed, you can find sample binaries in the following folders:
-* C samples: `~/inference_engine_c_samples_build/intel64/Release`
-* C++ samples: `~/inference_engine_cpp_samples_build/intel64/Release`
+* C samples: `~/openvino_c_samples_build/intel64/Release`
+* C++ samples: `~/openvino_cpp_samples_build/intel64/Release`
You can also build the sample applications manually:
@@ -108,7 +108,7 @@ You can also build the sample applications manually:
```sh
mkdir build
```
-> **NOTE**: If you run the Image Classification verification script during the installation, the C++ samples build directory is created in your home directory: `~/inference_engine_cpp_samples_build/`
+> **NOTE**: If you run the Image Classification verification script during the installation, the C++ samples build directory is created in your home directory: `~/openvino_cpp_samples_build/`
2. Go to the created directory:
```sh
@@ -149,11 +149,11 @@ build_samples_msvc.bat
By default, the script automatically detects the highest Microsoft Visual Studio version installed on the machine and uses it to create and build a solution for a sample code
Once the build is completed, you can find sample binaries in the following folders:
-* C samples: `C:\Users\<user>\Documents\Intel\OpenVINO\inference_engine_c_samples_build\intel64\Release`
-* C++ samples: `C:\Users\<user>\Documents\Intel\OpenVINO\inference_engine_cpp_samples_build\intel64\Release`
+* C samples: `C:\Users\<user>\Documents\Intel\OpenVINO\openvino_c_samples_build\intel64\Release`
+* C++ samples: `C:\Users\<user>\Documents\Intel\OpenVINO\openvino_cpp_samples_build\intel64\Release`
You can also build a generated solution manually. For example, if you want to build C++ sample binaries in Debug configuration, run the appropriate version of the
-Microsoft Visual Studio and open the generated solution file from the `C:\Users\<user>\Documents\Intel\OpenVINO\inference_engine_cpp_samples_build\Samples.sln`
+Microsoft Visual Studio and open the generated solution file from the `C:\Users\<user>\Documents\Intel\OpenVINO\openvino_cpp_samples_build\Samples.sln`
directory.
### <a name="build_samples_macos"></a>Build the Sample Applications on macOS*
@@ -172,8 +172,8 @@ build_samples.sh
```
Once the build is completed, you can find sample binaries in the following folders:
-* C samples: `~/inference_engine_c_samples_build/intel64/Release`
-* C++ samples: `~/inference_engine_cpp_samples_build/intel64/Release`
+* C samples: `~/openvino_c_samples_build/intel64/Release`
+* C++ samples: `~/openvino_cpp_samples_build/intel64/Release`
You can also build the sample applications manually:
@@ -189,7 +189,7 @@ source setupvars.sh
```sh
mkdir build
```
-> **NOTE**: If you ran the Image Classification verification script during the installation, the C++ samples build directory was already created in your home directory: `~/inference_engine_cpp_samples_build/`
+> **NOTE**: If you ran the Image Classification verification script during the installation, the C++ samples build directory was already created in your home directory: `~/openvino_cpp_samples_build/`
2. Go to the created directory:
```sh

@@ -333,19 +333,19 @@ To run the **Image Classification** code sample with an input image using the IR
.. code-block:: sh
-cd ~/inference_engine_cpp_samples_build/intel64/Release
+cd ~/openvino_cpp_samples_build/intel64/Release
.. tab:: Windows
.. code-block:: bat
-cd %USERPROFILE%\Documents\Intel\OpenVINO\inference_engine_samples_build\intel64\Release
+cd %USERPROFILE%\Documents\Intel\OpenVINO\openvino_samples_build\intel64\Release
.. tab:: macOS
.. code-block:: sh
-cd ~/inference_engine_cpp_samples_build/intel64/Release
+cd ~/openvino_cpp_samples_build/intel64/Release
@endsphinxdirective

@@ -15,7 +15,7 @@ usage() {
}
samples_type=$(basename "$( dirname "${BASH_SOURCE[0]-$0}" )" )
-build_dir="$HOME/inference_engine_${samples_type}_samples_build"
+build_dir="$HOME/openvino_${samples_type}_samples_build"
sample_install_dir=""
# parse command line options
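For context on the hunk above: the script derives `${samples_type}` from the name of the directory containing the script itself, so a copy living under a `cpp/` samples directory now builds into `$HOME/openvino_cpp_samples_build`. A minimal sketch of that derivation (the path below is hypothetical):

```shell
# Hypothetical script location; in the real tree the script sits in a
# directory named after the sample language ("c" or "cpp").
script_path="/opt/intel/openvino/samples/cpp/build_samples.sh"
# basename of the script's parent directory yields the samples type
samples_type=$(basename "$(dirname "$script_path")")
# ...which becomes the infix of the renamed build directory
build_dir="$HOME/openvino_${samples_type}_samples_build"
echo "$build_dir"
```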

@@ -8,7 +8,7 @@ SETLOCAL EnableDelayedExpansion
set "ROOT_DIR=%~dp0"
FOR /F "delims=\" %%i IN ("%ROOT_DIR%") DO set SAMPLES_TYPE=%%~nxi
-set "SAMPLE_BUILD_DIR=%USERPROFILE%\Documents\Intel\OpenVINO\inference_engine_%SAMPLES_TYPE%_samples_build"
+set "SAMPLE_BUILD_DIR=%USERPROFILE%\Documents\Intel\OpenVINO\openvino_%SAMPLES_TYPE%_samples_build"
set SAMPLE_INSTALL_DIR=
:: command line arguments parsing

@@ -205,7 +205,7 @@ enum colorformat_e {
RGB, ///< RGB color format
-BGR, ///< BGR color format, default in DLDT
+BGR, ///< BGR color format, default in OpenVINO
RGBX, ///< RGBX color format with X ignored during inference

@@ -213,7 +213,7 @@ typedef struct tensor_desc {
typedef enum {
RAW = 0u, //!< Plain blob (default), no extra color processing required
RGB, //!< RGB color format
-BGR, //!< BGR color format, default in DLDT
+BGR, //!< BGR color format, default in OpenVINO
RGBX, //!< RGBX color format with X ignored during inference
BGRX, //!< BGRX color format with X ignored during inference
NV12, //!< NV12 color format represented as compound Y+UV blob

@@ -142,7 +142,7 @@ inline std::ostream& operator<<(std::ostream& out, const Layout& p) {
enum ColorFormat : uint32_t {
RAW = 0u, ///< Plain blob (default), no extra color processing required
RGB, ///< RGB color format
-BGR, ///< BGR color format, default in DLDT
+BGR, ///< BGR color format, default in OpenVINO
RGBX, ///< RGBX color format with X ignored during inference
BGRX, ///< BGRX color format with X ignored during inference
NV12, ///< NV12 color format represented as compound Y+UV blob

@@ -20,7 +20,7 @@ struct convert_color : public primitive_base<convert_color> {
enum color_format : uint32_t {
RGB, ///< RGB color format
-BGR, ///< BGR color format, default in DLDT
+BGR, ///< BGR color format, default in OpenVINO
RGBX, ///< RGBX color format with X ignored during inference
BGRX, ///< BGRX color format with X ignored during inference
NV12, ///< NV12 color format represented as compound Y+UV blob

@@ -594,7 +594,7 @@ enum class BoxEncodingType {
////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////
enum class color_format : uint32_t {
RGB, ///< RGB color format
-BGR, ///< BGR color format, default in DLDT
+BGR, ///< BGR color format, default in OpenVINO
RGBX, ///< RGBX color format with X ignored during inference
BGRX, ///< BGRX color format with X ignored during inference
NV12, ///< NV12 color format represented as compound Y+UV blob

@@ -52,7 +52,7 @@ This is OpenVINO Inference Engine testing framework. OpenVINO Inference Engine t
developers just add the required test instantiations, based on the linked test definitions, to their own test binary. This should
be done to keep all the **shared** test cases always visible and available for other plugins to instantiate.
> **NOTE**: Any new plugin test case should be added to the common test definitions library
-(`funcSharedTests`) within the DLDT repository first. And then this test case can be instantiated with the
+(`funcSharedTests`) within the OpenVINO repository first. And then this test case can be instantiated with the
required parameters inside the plugin's own test binary, which links this shared tests library.
> **NOTE**: `funcSharedTests` library is added to the developer package and available for closed source