[80085] New images for docs (#11114)

* change doc structure

* fix manager tools

* fix manager tools 3 step

* fix manager tools 3 step

* new img

* new img for OV Runtime

* fix steps

* steps

* fix intendents

* change list

* fix space

* fix space

* code snippets fix

* change display
This commit is contained in:
Tatiana Savina
2022-03-22 19:34:45 +03:00
committed by GitHub
parent 26d3895331
commit 856575939d
11 changed files with 87 additions and 82 deletions

View File

@@ -1,4 +1,4 @@
# Converting an ONNX Model {#openvino_docs_MO_DG_prepare_model_convert_model_Convert_Model_From_ONNX}
## Introduction to ONNX
[ONNX*](https://github.com/onnx/onnx) is a representation format for deep learning models. ONNX allows AI developers to easily transfer models between different frameworks and choose the best combination of tools for their needs. Today, PyTorch\*, Caffe2\*, Apache MXNet\*, Microsoft Cognitive Toolkit\*, and other tools are developing ONNX support.
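Converting such a model with Model Optimizer can be sketched as follows (the model path is a placeholder; the `mo` command must be available in your environment):

```sh
# Convert an ONNX model to OpenVINO IR; <INPUT_MODEL> is illustrative.
mo --input_model <INPUT_MODEL>.onnx
```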

View File

@@ -37,52 +37,55 @@ There are two ways to create a deployment package that includes inference-relate
Interactive mode provides a user-friendly command-line interface that will guide you through the process with text prompts.
To launch the Deployment Manager in interactive mode, open a new terminal window, go to the Deployment Manager tool directory and run the tool script without parameters:

@sphinxdirective

.. tab:: Linux

   .. code-block:: sh

      cd <INSTALL_DIR>/tools/deployment_manager
      ./deployment_manager.py

.. tab:: Windows

   .. code-block:: bat

      cd <INSTALL_DIR>\deployment_tools\tools\deployment_manager
      .\deployment_manager.py

.. tab:: macOS

   .. code-block:: sh

      cd <INSTALL_DIR>/tools/deployment_manager
      ./deployment_manager.py

@endsphinxdirective
The target device selection dialog is displayed:

![Deployment Manager selection dialog](../img/selection_dialog.png)
Use the options provided on the screen to complete the selection of target devices and press **Enter** to proceed to the package generation dialog. If you want to interrupt the generation process and exit the program, type **q** and press **Enter**.
Once you accept the selection, the package generation dialog is displayed:

![Deployment Manager configuration dialog](../img/configuration_dialog.png)
The target devices you have selected at the previous step appear on the screen. To go back and change the selection, type **b** and press **Enter**. Use the options provided to configure the generation process, or use the default settings.
* `o. Change output directory` (optional): Path to the output directory. By default, it is set to your home directory.
* `u. Provide (or change) path to folder with user data` (optional): Path to a directory with user data (IRs, models, datasets, etc.) files and subdirectories required for inference, which will be added to the deployment archive. By default, it is set to `None`, which means you will separately copy the user data to the target system.
* `t. Change archive name` (optional): Deployment archive name without extension. By default, it is set to `openvino_deployment_package`.

Once all the parameters are set, type **g** and press **Enter** to generate the package for the selected target devices. To interrupt the generation process and exit the program, type **q** and press **Enter**.

When the script completes successfully, the deployment package is generated in the specified output directory.
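The same package can also be produced without the interactive prompts by passing your selections as command-line arguments in the standard CLI mode mentioned above. A minimal sketch; the option names here are assumptions, so check `./deployment_manager.py --help` for your version:

```sh
cd <INSTALL_DIR>/tools/deployment_manager
# Generate a package for CPU and GPU targets in one call (assumed options:
# --targets, --output_dir, --archive_name).
./deployment_manager.py --targets cpu gpu --output_dir <output_dir> --archive_name openvino_deployment_package
```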
@sphinxdirective
@@ -189,36 +192,38 @@ To deploy the OpenVINO Runtime components from the development machine to the ta
* `install_dependencies` — Snapshot of the `install_dependencies` directory from the OpenVINO installation directory.
* `<user_data>` — The directory with the user data (IRs, datasets, etc.) you specified while configuring the package.
For Linux, to run inference on a target Intel® GPU, Intel® Movidius™ VPU, or Intel® Vision Accelerator Design with Intel® Movidius™ VPUs, you need to install additional dependencies by running the `install_openvino_dependencies.sh` script on the target machine:

```sh
cd <destination_dir>/openvino/install_dependencies
sudo -E ./install_openvino_dependencies.sh
```

Set up the environment variables:

@sphinxdirective

.. tab:: Linux

   .. code-block:: sh

      cd <destination_dir>/openvino/
      source ./setupvars.sh

.. tab:: Windows

   .. code-block:: bat

      cd <destination_dir>\openvino\
      .\setupvars.bat

.. tab:: macOS

   .. code-block:: sh

      cd <destination_dir>/openvino/
      source ./setupvars.sh

@endsphinxdirective
You have now finished the deployment of the OpenVINO Runtime components to the target system.
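After sourcing the script, you can sanity-check that the environment was initialized for the current shell. A minimal sketch, assuming the `INTEL_OPENVINO_DIR` variable that `setupvars.sh` exports:

```sh
cd <destination_dir>/openvino/
source ./setupvars.sh
# Fails with an error message if setupvars.sh did not set the variable:
echo "${INTEL_OPENVINO_DIR:?setupvars.sh did not run}"
```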

View File

@@ -1,3 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:25ed719bdd525dc0b606ef17a3fec5303ea032dfe6b2d167e1b19b6100b6fb37
size 16516
oid sha256:9ba2a85ae6c93405f9b6e11c3c41ab20ffe13e8ae64403fa9802af6d96b314b1
size 35008

View File

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:50c191b2949e981811fbdd009b4138d88d2731432f308de010757058061cacbe
size 37171

View File

@@ -1,3 +0,0 @@
version https://git-lfs.github.com/spec/v1
oid sha256:6a3d820b43de20a74d857cb720783c1b579624afde26a2bb4bb097ba8fd0bd79
size 2052

View File

@@ -1,3 +0,0 @@
version https://git-lfs.github.com/spec/v1
oid sha256:04e954b5e1501f958ea2c03303760786ca7a57aaf6de335cb936750c675e6107
size 1626

View File

@@ -43,8 +43,8 @@
workbench_docs_Workbench_DG_Introduction
workbench_docs_Workbench_DG_Install
workbench_docs_Workbench_DG_Work_with_Models_and_Sample_Datasets
workbench_docs_Workbench_DG_User_Guide
workbench_docs_security_Workbench
Tutorials <workbench_docs_Workbench_DG_Tutorials>
User Guide <workbench_docs_Workbench_DG_User_Guide>
workbench_docs_Workbench_DG_Troubleshooting
.. toctree::

View File

@@ -1,3 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:a9a30b2cc5ca8ebe2da122247e292a9b415beb7bb6fbfd88f6843061d81a9e83
size 29381
oid sha256:2d6db31aee32fc54a0c58fff77aca191070da87a85148998ed837e81cd3b708e
size 42540

View File

@@ -1,3 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:25ed719bdd525dc0b606ef17a3fec5303ea032dfe6b2d167e1b19b6100b6fb37
size 16516
oid sha256:9ba2a85ae6c93405f9b6e11c3c41ab20ffe13e8ae64403fa9802af6d96b314b1
size 35008

View File

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:cfa6b834abf8d7add9877791f5340126c77ba6df6f7b026ecd96576af2e16816
size 53871

View File

@@ -1,3 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:aee73cd3275e6aaeb13a3df843ce23889cadc6e7e4d031349de7c4dfe851c2f5
size 25629
oid sha256:0812f173a2fca3a3fce86d5b1df36e4d956c35bb09fcadbab0f26f17ccc97b5e
size 43417