[docs] update linux/win installation guide (#9720)
* CVS-71745 update linux installation guide
* lfs

Co-authored-by: CCR\ntyukaev <nikolay.tyukaev@intel.com>
This commit is contained in:
parent 5e8f997262, commit 56759d9cdc
@@ -2,6 +2,8 @@
 @sphinxdirective
 
+.. _code samples:
+
 .. toctree::
    :maxdepth: 1
    :hidden:
docs/_static/images/openvino-install.png (vendored, new file)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:ef87640e224de61f41e76541e22a1392c84827dd0b7f70f3c616d86e75456aef
size 8508
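The three-line files in this commit are Git LFS pointers, not the images themselves. As an illustrative aside (not part of the commit), materializing the actual binaries requires the `git-lfs` extension:

```sh
# Fetch the real image binaries referenced by the LFS pointer files.
git lfs install   # one-time setup of the LFS filters
git lfs pull      # download objects for the current checkout
```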
@@ -5,40 +5,33 @@
 .. toctree::
    :maxdepth: 1
    :hidden:
-   :caption: Install From Release Packages
+   :caption: Install Intel® Distribution of OpenVINO™ Toolkit
 
-   Linux <openvino_docs_install_guides_installing_openvino_linux>
-   Windows <openvino_docs_install_guides_installing_openvino_windows>
-   macOS <openvino_docs_install_guides_installing_openvino_macos>
-   Raspbian OS <openvino_docs_install_guides_installing_openvino_raspbian>
-   Uninstalling <openvino_docs_install_guides_uninstalling_openvino>
-
-.. toctree::
-   :maxdepth: 1
-   :hidden:
-   :caption: Install From Images and Repositories
-
    Overview <openvino_docs_install_guides_installing_openvino_images>
-   PIP <openvino_docs_install_guides_installing_openvino_pip>
-   Docker for Linux <openvino_docs_install_guides_installing_openvino_docker_linux>
-   Docker for Windows <openvino_docs_install_guides_installing_openvino_docker_windows>
+   Linux <openvino_docs_install_guides_installing_openvino_linux_header>
+   Windows <openvino_docs_install_guides_installing_openvino_windows_header>
+   macOS <openvino_docs_install_guides_installing_openvino_macos>
+   Raspbian OS <openvino_docs_install_guides_installing_openvino_raspbian>
+   PIP <openvino_docs_install_guides_installing_openvino_pip>
    Docker with DL Workbench <workbench_docs_Workbench_DG_Run_Locally>
-   APT <openvino_docs_install_guides_installing_openvino_apt>
-   YUM <openvino_docs_install_guides_installing_openvino_yum>
    Conda <openvino_docs_install_guides_installing_openvino_conda>
    Yocto <openvino_docs_install_guides_installing_openvino_yocto>
+   Install OpenVINO Model Development Tools <installing_model_dev_tools>
    Build from Source <https://github.com/openvinotoolkit/openvino/wiki/BuildingCode>
+   Uninstalling <openvino_docs_install_guides_uninstalling_openvino>
 
 .. toctree::
    :maxdepth: 1
    :hidden:
-   :caption: Configuration for Hardware
+   :caption: Configure Intel® Distribution of OpenVINO™ Toolkit
 
    Configure Intel® Vision Accelerator Design with Intel® Movidius™ VPUs on Linux* <openvino_docs_install_guides_installing_openvino_linux_ivad_vpu>
+   Configure Intel® Vision Accelerator Design with Intel® Movidius™ VPUs on Windows* <openvino_docs_install_guides_installing_openvino_windows_ivad_vpu>
    Intel® Movidius™ VPUs Setup Guide <openvino_docs_install_guides_movidius_setup_guide>
    Intel® Movidius™ VPUs Programming Guide <openvino_docs_install_guides_movidius_programming_guide>
+   Intel® Movidius™ VPUs Demos <openvino_docs_install_guides_movidius_demos>
+   Intel® GPU Setup Guide <openvino_docs_install_guides_gpu_setup_guide>
+   Intel® Neural Compute Stick 2 Setup Guide <openvino_docs_install_guides_ncs2_setup_guide>
 
 .. toctree::
    :maxdepth: 1
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:2809bbcc91bd2495c670c401bcb0a1e55c80e2075af78444169df3c8b1c86a64
size 532493

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:57a70b183a12fe35a7c9aa6532af6e17d8690d2516d92b2a76a081f8b172de99
size 643430
docs/install_guides/gpu-setup-guide.md (new file)
@@ -0,0 +1,65 @@
# Set Up Intel® GPU for Use with Intel® Distribution of OpenVINO™ toolkit {#openvino_docs_install_guides_gpu_setup_guide}

@sphinxdirective

.. _gpu guide:

@endsphinxdirective

## Linux

Once you have the Intel® Distribution of OpenVINO™ Toolkit installed, follow these steps to enable working on a GPU:

1. Go to the `install_dependencies` directory:
   ```sh
   cd <INSTALL_DIR>/intel/openvino_2022/install_dependencies/
   ```

2. Install the **Intel® Graphics Compute Runtime for OpenCL™** driver components required to use the GPU plugin and write custom layers for Intel® Integrated Graphics. The drivers are not included in the package. To install them, run this script:
   ```sh
   sudo -E ./install_NEO_OCL_driver.sh
   ```
   > **NOTE**: To use the **Intel® Iris® Xe MAX Graphics**, see the [Intel® Iris® Xe MAX Graphics with Linux*](https://dgpu-docs.intel.com/devices/iris-xe-max-graphics/index.html) page for driver installation instructions.

   The script compares the driver version on the system to the current version. If the driver version on the system is higher than or equal to the current version, the script does not install a new driver. If the version of the driver is lower than the current version, the script uninstalls the lower version and installs the current version with your permission:
   

   Higher hardware versions require a higher driver version, namely 20.35 instead of 19.41. If the script fails to uninstall the driver, uninstall it manually. During the script execution, you may see the following command line output:
   ```sh
   Add OpenCL user to video group
   ```
   Ignore this suggestion and continue.<br>
   You can also find the most recent version of the driver, the installation procedure, and other information on the [Intel® software for general purpose GPU capabilities](https://dgpu-docs.intel.com/index.html) site.
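One optional way to confirm afterwards that the GPU is visible to the OpenCL runtime is the `clinfo` utility. This check is not part of the guide above and assumes `clinfo` is installed (for example via `sudo apt install clinfo` on Ubuntu):

```sh
# List the detected OpenCL platforms/devices and the driver version in use.
clinfo | grep -E "Platform Name|Device Name|Driver Version"
```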
3. **Optional:** Install header files to allow compilation of new code. You can find the header files at [Khronos OpenCL™ API Headers](https://github.com/KhronosGroup/OpenCL-Headers.git); a fetch sketch follows below.
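One possible way to fetch and expose those headers (a sketch only; the `/usr/local/include` destination is an assumption, adjust it to your build system):

```sh
# Clone the Khronos OpenCL headers and make them visible to the compiler.
git clone https://github.com/KhronosGroup/OpenCL-Headers.git
sudo cp -r OpenCL-Headers/CL /usr/local/include/
```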
You've completed all required configuration steps to perform inference on processor graphics.
Proceed to the <a href="#get-started">Start Using the Toolkit</a> section to learn the basic OpenVINO™ toolkit workflow and run code samples and demo applications.

@sphinxdirective

.. _gpu guide windows:

@endsphinxdirective

## Windows

This section helps you check whether you need to install a GPU driver. Install the indicated version or higher.

If your applications offload computation to **Intel® Integrated Graphics**, you must have the Intel Graphics Driver for Windows installed on your hardware.
[Download and install the recommended version](https://downloadcenter.intel.com/download/30079/Intel-Graphics-Windows-10-DCH-Drivers).

To check if you have this driver installed:

1. Type **device manager** in your **Search Windows** box and press Enter. The **Device Manager** opens.

2. Click the drop-down arrow to view the **Display adapters**. You can see the adapter that is installed in your computer:
   

3. Right-click the adapter name and select **Properties**.

4. Click the **Driver** tab to see the driver version.
   

You are done updating your device driver and are ready to use your GPU. Proceed to the <a href="#get-started">Start Using the Toolkit</a> section to learn the basic OpenVINO™ toolkit workflow and run code samples and demo applications.
docs/install_guides/installing-model-dev-tools.md (new file)
@@ -0,0 +1 @@
# Placeholder for Installing Model Dev Tools {#installing_model_dev_tools}
docs/install_guides/installing-openvino-linux-header.md (new file)
@@ -0,0 +1,23 @@
# Install and Configure Intel® Distribution of OpenVINO™ toolkit for Linux* {#openvino_docs_install_guides_installing_openvino_linux_header}

@sphinxdirective

.. toctree::
   :maxdepth: 2
   :hidden:

   Installer <openvino_docs_install_guides_installing_openvino_linux>
   APT <openvino_docs_install_guides_installing_openvino_apt>
   YUM <openvino_docs_install_guides_installing_openvino_yum>
   Docker <openvino_docs_install_guides_installing_openvino_docker_linux>

@endsphinxdirective

If you want to install Intel® Distribution of OpenVINO™ toolkit on your Linux machine, there are a few ways to accomplish this. We prepared the following options for you:

* [Install Intel® Distribution of OpenVINO™ with an Installer](installing-openvino-linux.md)
* [Install Intel® Distribution of OpenVINO™ with yum](installing-openvino-yum.md)
* [Install Intel® Distribution of OpenVINO™ with apt](installing-openvino-apt.md)
* [Install Intel® Distribution of OpenVINO™ with Docker](installing-openvino-docker-linux.md)

Enjoy your journey with OpenVINO™.
@@ -1,4 +1,10 @@
-# Configuration Guide for the Intel® Distribution of OpenVINO™ toolkit and the Intel® Vision Accelerator Design with Intel® Movidius™ VPUs on Linux* {#openvino_docs_install_guides_installing_openvino_linux_ivad_vpu}
+# Set Up Intel® Vision Accelerator Design with Intel® Movidius™ VPUs for Use with Intel® Distribution of OpenVINO™ toolkit {#openvino_docs_install_guides_installing_openvino_linux_ivad_vpu}
+
+@sphinxdirective
+
+.. _vpu guide:
+
+@endsphinxdirective
 
 > **NOTES**:
 > - These steps are only required if you want to perform inference on Intel® Vision Accelerator Design with Intel® Movidius™ VPUs.
@@ -1,52 +1,40 @@
-# Install and Configure Intel® Distribution of OpenVINO™ toolkit for Linux* {#openvino_docs_install_guides_installing_openvino_linux}
+# Install and Configure Intel® Distribution of OpenVINO™ toolkit for Linux {#openvino_docs_install_guides_installing_openvino_linux}
 
-> **NOTE**: These steps apply to Ubuntu\* and, with some modifications shown below, also Red Hat\* Enterprise Linux\*.
+> **NOTE**: Since the OpenVINO™ 2022.1 release, the following development tools: Model Optimizer, Post-Training Optimization Tool, Model Downloader and other Open Model Zoo tools, Accuracy Checker, and Annotation Converter are not part of the installer. These tools are now only available on [pypi.org](https://pypi.org/project/openvino-dev/).
 
-## Introduction
-
-By default, the [OpenVINO™ Toolkit](https://docs.openvino.ai/latest/index.html) installation on this page installs the following components:
-
-| Component | Description |
-|-----------|-------------|
-| [Model Optimizer](../MO_DG/Deep_Learning_Model_Optimizer_DevGuide.md) | This tool imports, converts, and optimizes models that were trained in popular frameworks to a format usable by Intel tools, especially the Inference Engine. <br> Popular frameworks include Caffe\*, TensorFlow\*, MXNet\*, and ONNX\*. |
-| [Inference Engine](../IE_DG/Deep_Learning_Inference_Engine_DevGuide.md) | This is the engine that runs the deep learning model. It includes a set of libraries for easy inference integration into your applications. |
-| Intel® Media SDK | Offers access to hardware-accelerated video codecs and frame processing. |
-| [OpenCV\*](https://docs.opencv.org/master/) | OpenCV\* community version compiled for Intel® hardware. |
-| [Inference Engine Code Samples](../IE_DG/Samples_Overview.md) | A set of simple command-line applications demonstrating how to use specific OpenVINO capabilities in an application and how to perform specific tasks, such as loading a model, running inference, querying specific device capabilities, and more. |
-| [Demo Applications](@ref omz_demos) | A set of command-line applications that serve as robust templates to help you implement multi-stage pipelines and specific deep learning scenarios. |
-| Additional Tools | A set of tools to work with your models, including the [Accuracy Checker utility](@ref omz_tools_accuracy_checker), [Post-Training Optimization Tool](@ref pot_README), [Model Downloader](@ref omz_tools_downloader), and others. |
-| [Documentation for Pre-Trained Models](@ref omz_models_group_intel) | Documentation for the pre-trained models available in the [Open Model Zoo repo](https://github.com/openvinotoolkit/open_model_zoo). |
-| Deep Learning Streamer (DL Streamer) | Streaming analytics framework, based on GStreamer, for constructing graphs of media analytics components. For the DL Streamer documentation, see [DL Streamer Samples](@ref gst_samples_README), [API Reference](https://openvinotoolkit.github.io/dlstreamer_gst/), [Elements](https://github.com/openvinotoolkit/dlstreamer_gst/wiki/Elements), and [Tutorial](https://github.com/openvinotoolkit/dlstreamer_gst/wiki/DL-Streamer-Tutorial). |
-
 ## System Requirements
 
-**Hardware**
-
-Optimized for these processors:
-* 6th to 12th generation Intel® Core™ processors and Intel® Xeon® processors
-* 3rd generation Intel® Xeon® Scalable processor (formerly code named Cooper Lake)
-* Intel® Xeon® Scalable processor (formerly Skylake and Cascade Lake)
-* Intel Atom® processor with support for Intel® Streaming SIMD Extensions 4.1 (Intel® SSE4.1)
-* Intel Pentium® processor N4200/5, N3350/5, or N3450/5 with Intel® HD Graphics
-* Intel® Iris® Xe MAX Graphics
-* Intel® Neural Compute Stick 2
-* Intel® Vision Accelerator Design with Intel® Movidius™ VPUs
-
-> **NOTE**: Since the OpenVINO™ 2020.4 release, Intel® Movidius™ Neural Compute Stick is not supported.
-
-**Processor Notes**
-
-- Processor graphics are not included in all processors. See [Product Specifications](https://ark.intel.com/) for information about your processor.
-
-**Operating Systems**
-
-- Ubuntu 18.04.x long-term support (LTS), 64-bit
-- Ubuntu 20.04.0 long-term support (LTS), 64-bit
-- CentOS 7.6, 64-bit (for deployment only, not development)
-- For deployment on Red Hat* Enterprise Linux* 8.2, 64-bit, you can use the Intel® Distribution of OpenVINO™ toolkit runtime package that includes the Inference Engine core libraries, nGraph, OpenCV, Python bindings, and CPU and GPU plugins. The package is available as:
-  - [Downloadable archive](https://storage.openvinotoolkit.org/repositories/openvino/packages/2021.4.1/l_openvino_toolkit_runtime_rhel8_p_2021.4.689.tgz)
-  - [PyPi package](https://pypi.org/project/openvino/)
-  - [Docker image](https://catalog.redhat.com/software/containers/intel/openvino-runtime/606ff4d7ecb5241699188fb3)
+@sphinxdirective
+
+.. tab:: Operating Systems
+
+   * Ubuntu 18.04.x long-term support (LTS), 64-bit
+   * Ubuntu 20.04.x long-term support (LTS), 64-bit
+   * Red Hat Enterprise Linux 8.x, 64-bit
+
+   .. note::
+      Since the OpenVINO™ 2022.1 release, CentOS 7.6, 64-bit is no longer supported.
+
+.. tab:: Hardware
+
+   Optimized for these processors:
+
+   * 6th to 12th generation Intel® Core™ processors and Intel® Xeon® processors
+   * 3rd generation Intel® Xeon® Scalable processor (formerly code named Cooper Lake)
+   * Intel® Xeon® Scalable processor (formerly Skylake and Cascade Lake)
+   * Intel Atom® processor with support for Intel® Streaming SIMD Extensions 4.1 (Intel® SSE4.1)
+   * Intel Pentium® processor N4200/5, N3350/5, or N3450/5 with Intel® HD Graphics
+   * Intel® Iris® Xe MAX Graphics
+   * Intel® Neural Compute Stick 2
+   * Intel® Vision Accelerator Design with Intel® Movidius™ VPUs
+
+.. tab:: Processor Notes
+
+   Processor graphics are not included in all processors.
+   See `Product Specifications`_ for information about your processor.
+
+   .. _Product Specifications: https://ark.intel.com/
+
+@endsphinxdirective
 
 ## Overview
@@ -55,94 +43,79 @@ This guide provides step-by-step instructions on how to install the Intel® Dist
 1. <a href="#install-openvino">Install the Intel® Distribution of OpenVINO™ Toolkit</a>
 2. <a href="#install-external-dependencies">Install External Software Dependencies</a>
 3. <a href="#set-the-environment-variables">Configure the Environment</a>
-4. <a href="#model-optimizer">Configure the Model Optimizer</a>
-5. <a href="#optional-steps">Configure Inference on non-CPU Devices (Optional):</a>
-   - <a href="#install-gpu">Steps for Intel® Processor Graphics (GPU)</a>
-   - <a href="#install-ncs2">Steps for Intel® Neural Compute Stick 2</a>
-   - <a href="#install-vpu">Steps for Intel® Vision Accelerator Design with Intel® Movidius™ VPU</a><br>
-     After installing your Intel® Movidius™ VPU, you will return to this guide to complete OpenVINO™ installation.
-6. <a href="#get-started">Start Using the Toolkit</a>
-
-- [Steps to uninstall the Intel® Distribution of OpenVINO™ Toolkit](uninstalling-openvino.md)
-
-## <a name="install-openvino"></a>Step 1: Install the Intel® Distribution of OpenVINO™ Toolkit Core Components
-
-1. Download the Intel® Distribution of OpenVINO™ toolkit package file from [Intel® Distribution of OpenVINO™ toolkit for Linux*](https://software.intel.com/en-us/openvino-toolkit/choose-download).
-   Select the Intel® Distribution of OpenVINO™ toolkit for Linux package from the dropdown menu.
+4. <a href="#model-optimizer">Download additional components (Optional)</a>
+5. <a href="#optional-steps">Configure Inference on non-CPU Devices (Optional)</a>
+6. <a href="#get-started">What's next?</a>
+
+@sphinxdirective
+
+.. important::
+   Before you start installing the Intel® Distribution of OpenVINO™, we encourage you to check out our :ref:`code samples <code samples>` in C, C++, and Python and the :ref:`notebook tutorials <notebook tutorials>` we prepared for you, so you can see what you can achieve with the toolkit.
+
+@endsphinxdirective
+
+## <a name="install-openvino"></a>Step 1: Install the Intel® Distribution of OpenVINO™ Toolkit
+
+1. Select and download the Intel® Distribution of OpenVINO™ toolkit installer file from [Intel® Distribution of OpenVINO™ toolkit for Linux*](https://software.intel.com/en-us/openvino-toolkit/choose-download).
 2. Open a command prompt terminal window. You can use the keyboard shortcut: Ctrl+Alt+T
-3. Change directories to where you downloaded the Intel Distribution of OpenVINO toolkit for Linux\* package file.<br>
-   If you downloaded the package file to the current user's `Downloads` directory:
+3. Change directories to where you downloaded the Intel Distribution of OpenVINO™ toolkit for Linux file.<br>
+   If you downloaded the starter script to the current user's `Downloads` directory:
    ```sh
    cd ~/Downloads/
    ```
-   By default, the file is saved as `l_openvino_toolkit_p_<version>.tgz`, e.g., `l_openvino_toolkit_p_2021.4.689.tgz`.
-4. Unpack the .tgz file:
+   There you should find a bootstrapper script `l_openvino_toolkit_p_<version>.sh`.
+4. Add executable rights for the current user:
    ```sh
-   tar -xvzf l_openvino_toolkit_p_<version>.tgz
+   chmod +x l_openvino_toolkit_p_<version>.sh
    ```
-   The files are unpacked to the `l_openvino_toolkit_p_<version>` directory.
-5. Go to the `l_openvino_toolkit_p_<version>` directory:
+5. If you want to use the graphical user interface (GUI) installation wizard, run the script without any parameters:
    ```sh
-   cd l_openvino_toolkit_p_<version>
+   ./l_openvino_toolkit_p_<version>.sh
    ```
-   If you have a previous version of the Intel Distribution of OpenVINO toolkit installed, rename or delete these two directories:
-   - `~/inference_engine_samples_build`
-   - `~/openvino_models`
-6. Choose your installation option and run the related script as root to use either a graphical user interface (GUI) installation wizard or command line instructions (CLI).<br>
-   Screenshots are provided for the GUI, but not for CLI. The following information also applies to CLI and will be helpful to your installation, where you will be presented with the same choices and tasks.
-   - **Option 1:** GUI Installation Wizard:
-     ```sh
-     sudo ./install_GUI.sh
-     ```
-   - **Option 2:** Command Line Instructions:
-     ```sh
-     sudo ./install.sh
-     ```
-   - **Option 3:** Command Line Silent Instructions:
-     ```sh
-     sudo sed -i 's/decline/accept/g' silent.cfg
-     sudo ./install.sh -s silent.cfg
-     ```
-     In Option 3 you can select which OpenVINO components will be installed by modifying the `COMPONENTS` parameter in the `silent.cfg` file. For example, to install only CPU runtime for the Inference Engine, set `COMPONENTS=intel-openvino-ie-rt-cpu__x86_64` in `silent.cfg`. To get a full list of available components for installation, run the `./install.sh --list_components` command from the unpacked OpenVINO™ toolkit package.
-7. Follow the instructions on your screen. Watch for informational messages such as the following in case you must complete additional steps:
-   
+   <br>You should see the following dialog box open up:
+
+   @sphinxdirective
+
+   .. image:: _static/images/openvino-install.png
+      :width: 400px
+      :align: center
+
+   @endsphinxdirective
+
+   Otherwise, you can add the parameter `-a` for additional arguments and `--cli` to run the installation in command line (CLI) mode:
+   ```sh
+   ./l_openvino_toolkit_p_<version>.sh -a --cli
+   ```
-By default, the Intel® Distribution of OpenVINO™ is installed to the following directory:
+@sphinxdirective
+
+.. note::
+   To see all the parameters that can be used, check the help option: `--help`. Among others, you will find the `-s` option, which offers silent mode; together with `--eula approve` it allows you to run the whole installation with default values and without any user interaction. A sketch follows this note.
+
+@endsphinxdirective
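A sketch of that unattended path, assuming the flags combine as the note describes (the exact spelling is worth confirming against the script's own help output):

```sh
# Show every supported installer option first.
./l_openvino_toolkit_p_<version>.sh --help
# Then run silently with default values, approving the EULA up front.
sudo ./l_openvino_toolkit_p_<version>.sh -a -s --eula approve
```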
+6. Follow the instructions on your screen. During the installation you will be asked to accept the license agreement. Acceptance is required to continue. Check out the installation process in the image below:<br>
+
+   
+   Click on the image to see the details.
+   <br>
+   <br>By default, the Intel® Distribution of OpenVINO™ is installed to the following directory:
   * For root or administrator: `/opt/intel/openvino_<version>/`
   * For regular users: `/home/<USER>/intel/openvino_<version>/`
-   For simplicity, a symbolic link to the latest installation is also created: `/opt/intel/openvino_2022/` or `/home/<USER>/intel/openvino_2022/`
-8. **Optional**: You can choose **Customize** to change the installation directory or the components you want to install.
-   > **NOTE**: If there is an OpenVINO™ toolkit version previously installed on your system, the installer will use the same destination directory for the next installation. If you want to install a newer version to a different directory, you need to uninstall the previously installed versions.
-   > **NOTE**: The Intel® Media SDK component is always installed in the `/opt/intel/mediasdk` directory regardless of the OpenVINO installation path chosen.
-9. The **Finish** screen indicates that the core components have been installed:
-   
-   Once you click **Finish** to close the installation wizard, a new browser window will open with this documentation. It jumps to the section with your next installation steps.
+   <br>For simplicity, a symbolic link to the latest installation is also created: `/opt/intel/openvino_2022/` or `/home/<USER>/intel/openvino_2022/`
+   To check the **Release Notes**, please visit: [Release Notes](https://software.intel.com/en-us/articles/OpenVINO-RelNotes)
 
 The core components are now installed. Continue to the next section to install additional dependencies.
 
 ## <a name="install-external-dependencies"></a>Step 2: Install External Software Dependencies
 
-> **NOTE**: If you installed the Intel® Distribution of OpenVINO™ to a non-default directory, replace `/opt/intel` with the directory in which you installed the software.
-
-These dependencies are required for:
-
-- Intel-optimized build of OpenCV library
-- Deep Learning Inference Engine
-- Deep Learning Model Optimizer tools
+This script installs Linux platform development tools and components needed to work with the product.
 
 1. Go to the `install_dependencies` directory:
    ```sh
-   cd /opt/intel/openvino_2022/install_dependencies
+   cd <INSTALL_DIR>/intel/openvino_2022/install_dependencies
    ```
 2. Run a script to download and install the external software dependencies:
    ```sh
@@ -153,184 +126,114 @@
 
 ## <a name="set-the-environment-variables"></a>Step 3: Configure the Environment
 
-You must update several environment variables before you can compile and run OpenVINO™ applications. Set persistent environment variables as follows, using vi (as below) or your preferred editor:
-
-1. Open the `.bashrc` file in `/home/<USER>`:
-   ```sh
-   vi ~/.bashrc
-   ```
-2. Press the **i** key to switch to insert mode.
-3. Add this line to the end of the file:
-   ```sh
-   source /opt/intel/openvino_2022/bin/setupvars.sh
-   ```
-4. Save and close the file: press the **Esc** key and type `:wq`.
+You must update several environment variables before you can compile and run OpenVINO™ applications. Set the environment variables as follows:
+
+```sh
+source <INSTALL_DIR>/intel/openvino_2022/bin/setupvars.sh
+```
+
+If you have more than one OpenVINO™ version on your machine, you can easily switch between them by sourcing the `setupvars.sh` of your choice.
+
+> **NOTE**: You can also run this script every time you start a new terminal session; see the one-liner below. Open `~/.bashrc` in your favorite editor and add `source <INSTALL_DIR>/intel/openvino_2022/bin/setupvars.sh`. The next time you open a terminal, you will see `[setupvars.sh] OpenVINO™ environment initialized`. Changing `.bashrc` is not recommended when you have many OpenVINO™ versions on your machine and want to switch among them, as each may require a different setup.
+
+The environment variables are set. Next, you can download some additional tools.
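A one-liner that applies the note above without opening an editor (a sketch; `<INSTALL_DIR>` stays a placeholder for your actual install root):

```sh
# Append the source line to ~/.bashrc so every new shell initializes OpenVINO™.
echo "source <INSTALL_DIR>/intel/openvino_2022/bin/setupvars.sh" >> ~/.bashrc
# Open a new terminal; it should print:
# [setupvars.sh] OpenVINO™ environment initialized
```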
-5. To verify the change, open a new terminal. You will see `[setupvars.sh] OpenVINO environment initialized`.
-
-**Optional:** If you don't want to change your shell profile, you can run the following script to temporarily set your environment variables for each terminal instance when working with OpenVINO™:
+## <a name="model-optimizer"></a>Step 4 (Optional): Download additional components
+
+> **NOTE**: Since the OpenVINO™ 2022.1 release, the following development tools: Model Optimizer, Post-Training Optimization Tool, Model Downloader and other Open Model Zoo tools, Accuracy Checker, and Annotation Converter are not part of the installer. The OpenVINO™ Model Development Tools can now only be installed via PyPI. See [Install OpenVINO™ Model Development Tools](@ref installing_model_dev_tools) for detailed steps, or the sketch below.
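A minimal sketch of that PyPI route (the virtual environment is an assumption, not a requirement stated here; see the linked guide for the exact steps):

```sh
# Create an isolated environment and install the development tools from PyPI.
python3 -m venv openvino_env
source openvino_env/bin/activate
pip install openvino-dev
```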
-```sh
-source /opt/intel/openvino_2022/bin/setupvars.sh
-```
-
-The environment variables are set. Next, you will configure the Model Optimizer.
-
-## <a name="model-optimizer"></a>Step 4: Configure the Model Optimizer
-
-> **NOTE**: Since the TensorFlow framework is not officially supported on CentOS*, the Model Optimizer for TensorFlow can't be configured and run on that operating system.
-
-The Model Optimizer is a Python\*-based command line tool for importing trained models from popular deep learning frameworks such as Caffe\*, TensorFlow\*, Apache MXNet\*, ONNX\* and Kaldi\*.
-
-The Model Optimizer is a key component of the Intel Distribution of OpenVINO toolkit. Performing inference on a model (with the exception of ONNX and nGraph models) requires running the model through the Model Optimizer. When you run a pre-trained model through the Model Optimizer, your output is an Intermediate Representation (IR) of the network. The Intermediate Representation is a pair of files that describe the whole model:
-
-- `.xml`: Describes the network topology
-- `.bin`: Contains the weights and biases binary data
-
-For more information about the Model Optimizer, refer to the [Model Optimizer Developer Guide](../MO_DG/Deep_Learning_Model_Optimizer_DevGuide.md).
-
-1. Go to the Model Optimizer prerequisites directory:
+## <a name="optional-steps"></a>Step 5 (Optional): Configure Inference on non-CPU Devices
+
+@sphinxdirective
+
+.. tab:: GPU
+
+   Only if you want to enable the toolkit components to use processor graphics (GPU) on your system, follow the steps in the :ref:`GPU Setup Guide <gpu guide>`.
+
+.. tab:: NCS 2
+
+   Only if you want to perform inference on an Intel® Neural Compute Stick 2 powered by the Intel® Movidius™ Myriad™ X VPU, follow the steps in the :ref:`NCS2 Setup Guide <ncs guide>`.
+   For more details, see the `Get Started page for Intel® Neural Compute Stick 2 <https://software.intel.com/en-us/neural-compute-stick/get-started>`_.
+
+.. tab:: VPU
+
+   To install and configure your Intel® Vision Accelerator Design with Intel® Movidius™ VPUs, see the :ref:`VPUs Configuration Guide <vpu guide>`.
+   After configuration is done, you are ready to run the verification scripts with the HDDL Plugin for your Intel® Vision Accelerator Design with Intel® Movidius™ VPUs. Check out our :ref:`Movidius VPU demos <vpu demos>`.
+
+.. warning::
+   While working with either HDDL or NCS, choose one of them, as they cannot run simultaneously on the same machine.
+
+@endsphinxdirective
-   ```sh
-   cd /opt/intel/openvino_2022/tools/model_optimizer/install_prerequisites
-   ```
-2. Run the script to configure the Model Optimizer for Caffe, TensorFlow, MXNet, Kaldi, and ONNX:
-   ```sh
-   sudo ./install_prerequisites.sh
-   ```
-3. **Optional:** You can choose to install Model Optimizer support for only certain frameworks. In the same directory are individual scripts for Caffe, TensorFlow, MXNet, Kaldi, and ONNX (`install_prerequisites_caffe.sh`, etc.). If you see error messages, make sure you installed all dependencies.
-
-The Model Optimizer is configured for one or more frameworks.
-
-You have now completed all required installation, configuration, and build steps in this guide to use your CPU to work with your trained models.
+## <a name="get-started"></a>Step 6: What's next?
+
+Now you are ready to try out the toolkit.
+
+Developing in Python:
+* [Start with TensorFlow models with OpenVINO™](https://docs.openvino.ai/latest/notebooks/101-tensorflow-to-openvino-with-output.html)
+* [Start with ONNX and PyTorch models with OpenVINO™](https://docs.openvino.ai/latest/notebooks/102-pytorch-onnx-to-openvino-with-output.html)
+* [Start with PaddlePaddle models with OpenVINO™](https://docs.openvino.ai/latest/notebooks/103-paddle-onnx-to-openvino-classification-with-output.html)
-To enable inference on other hardware, see below:
-- <a href="#install-gpu">Steps for Intel® Processor Graphics (GPU)</a>
-- <a href="#install-ncs2">Steps for Intel® Neural Compute Stick 2</a>
-- <a href="#install-vpu">Steps for Intel® Vision Accelerator Design with Intel® Movidius™ VPUs</a><br>
-
-Or proceed to the <a href="#get-started">Start Using the Toolkit</a> section to learn the basic OpenVINO™ toolkit workflow and run code samples and demo applications.
+Developing in C++:
+* [Image Classification Async C++ Sample](@ref openvino_inference_engine_samples_classification_sample_async_README)
+* [Hello Classification C++ Sample](@ref openvino_inference_engine_samples_hello_classification_README)
+* [Hello Reshape SSD C++ Sample](@ref openvino_inference_engine_samples_hello_reshape_ssd_README)
-## <a name="optional-steps"></a>Step 5 (Optional): Configure Inference on non-CPU Devices:
-
-### <a name="install-gpu"></a>Optional: Steps for Intel® Processor Graphics (GPU)
-
-The steps in this section are required only if you want to enable the toolkit components to use processor graphics (GPU) on your system.
-
-1. Go to the install_dependencies directory:
-   ```sh
-   cd /opt/intel/openvino_2022/install_dependencies/
-   ```
-2. Install the **Intel® Graphics Compute Runtime for OpenCL™** driver components required to use the GPU plugin and write custom layers for Intel® Integrated Graphics. The drivers are not included in the package. To install, run this script:
-   ```sh
-   sudo -E ./install_NEO_OCL_driver.sh
-   ```
-   > **NOTE**: To use the **Intel® Iris® Xe MAX Graphics**, see the [Intel® Iris® Xe MAX Graphics with Linux*](https://dgpu-docs.intel.com/devices/iris-xe-max-graphics/index.html) page for driver installation instructions.
-
-   The script compares the driver version on the system to the current version. If the driver version on the system is higher or equal to the current version, the script does not install a new driver. If the version of the driver is lower than the current version, the script uninstalls the lower version and installs the current version with your permission:
-   
-
-   Higher hardware versions require a higher driver version, namely 20.35 instead of 19.41. If the script fails to uninstall the driver, uninstall it manually. During the script execution, you may see the following command line output:
-   ```sh
-   Add OpenCL user to video group
-   ```
-   Ignore this suggestion and continue.<br>
-   You can also find the most recent version of the driver, installation procedure and other information on the [Intel® software for general purpose GPU capabilities](https://dgpu-docs.intel.com/index.html) site.
-
-3. **Optional:** Install header files to allow compilation of new code. You can find the header files at [Khronos OpenCL™ API Headers](https://github.com/KhronosGroup/OpenCL-Headers.git).
-
-You've completed all required configuration steps to perform inference on processor graphics.
-Proceed to the <a href="#get-started">Start Using the Toolkit</a> section to learn the basic OpenVINO™ toolkit workflow and run code samples and demo applications.
-
-### <a name="install-ncs2"></a>Optional: Steps for Intel® Neural Compute Stick 2
-
-These steps are only required if you want to perform inference on Intel® Movidius™ NCS powered by the Intel® Movidius™ Myriad™ 2 VPU or Intel® Neural Compute Stick 2 powered by the Intel® Movidius™ Myriad™ X VPU. For more details, see the [Get Started page for Intel® Neural Compute Stick 2](https://software.intel.com/en-us/neural-compute-stick/get-started).
-
-1. Go to the install_dependencies directory:
-   ```sh
-   cd /opt/intel/openvino_2022/install_dependencies/
-   ```
-2. Run the `install_NCS_udev_rules.sh` script:
-   ```sh
-   ./install_NCS_udev_rules.sh
-   ```
-3. You may need to reboot your machine for this to take effect.
-
-You've completed all required configuration steps to perform inference on Intel® Neural Compute Stick 2.
-Proceed to the <a href="#get-started">Start Using the Toolkit</a> section to learn the basic OpenVINO™ toolkit workflow and run code samples and demo applications.
-
-### <a name="install-vpu"></a>Optional: Steps for Intel® Vision Accelerator Design with Intel® Movidius™ VPUs
-
-To install and configure your Intel® Vision Accelerator Design with Intel® Movidius™ VPUs, see the [Intel® Vision Accelerator Design with Intel® Movidius™ VPUs Configuration Guide](installing-openvino-linux-ivad-vpu.md).
-
-> **NOTE**: After installing your Intel® Movidius™ VPU, you will return to this guide to complete the Intel® Distribution of OpenVINO™ installation.
-
-After configuration is done, you are ready to run the verification scripts with the HDDL Plugin for your Intel® Vision Accelerator Design with Intel® Movidius™ VPUs:
-
-1. Go to the **Inference Engine demo** directory:
-   ```sh
-   cd /opt/intel/openvino_2022/samples/scripts
-   ```
-2. Run the **Image Classification verification script**. If you have access to the Internet through the proxy server only, please make sure that it is configured in your OS environment.
-   ```sh
-   ./run_sample_squeezenet.sh -d HDDL
-   ```
-
-You've completed all required configuration steps to perform inference on Intel® Vision Accelerator Design with Intel® Movidius™ VPUs.
-Proceed to the <a href="#get-started">Start Using the Toolkit</a> section to learn the basic OpenVINO™ toolkit workflow and run code samples and demo applications.
-
-## <a name="get-started"></a>Step 6: Start Using the Toolkit
-
-Now you are ready to try out the toolkit. To continue, see the [Get Started Guide](../get_started.md) section to learn the basic OpenVINO™ toolkit workflow and run code samples and demo applications with pre-trained models on different inference devices.
-
-## Troubleshooting
-
-PRC developers might encounter pip errors during OpenVINO™ installation. To resolve the issues, try one of the following options:
-* Add the download source using the `-i` parameter with the Python `pip` command. For example:
-  ```
-  pip install numpy.py -i https://mirrors.aliyun.com/pypi/simple/
-  ```
-  Use the `--trusted-host` parameter if the URL above is `http` instead of `https`.
-
-* Modify or create the `~/.pip/pip.conf` file to change the default download source with the content below:
-  ```
-  [global]
-  index-url = http://mirrors.aliyun.com/pypi/simple/
-  [install]
-  trusted-host = mirrors.aliyun.com
-  ```
-
 ## <a name="uninstall"></a>Uninstall the Intel® Distribution of OpenVINO™ Toolkit
 
 To uninstall the toolkit, follow the steps on the [Uninstalling page](uninstalling-openvino.md).
+## Troubleshooting
+
+@sphinxdirective
+
+.. raw:: html
+
+   <div class="collapsible-section">
+
+@endsphinxdirective
+
+PRC developers might encounter pip errors during Intel® Distribution of OpenVINO™ installation. To resolve the issues, try one of the following options:
+* Add the download source using the `-i` parameter with the Python `pip` command. For example:
+  ```
+  pip install openvino-dev -i https://mirrors.aliyun.com/pypi/simple/
+  ```
+  Use the `--trusted-host` parameter if the URL above is `http` instead of `https`.
+
+* If you run into incompatibility issues between components after installing a new Intel® Distribution of OpenVINO™ version, try installing the packages listed in `requirements.txt` with the following command:
+  ```
+  pip install -r <INSTALL_DIR>/intel/openvino_2022/tools/requirements.txt
+  ```
+
+@sphinxdirective
+
+.. raw:: html
+
+   </div>
+
+@endsphinxdirective
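A persistent alternative to passing `-i` and `--trusted-host` on every call is pip's own config command (a sketch; it writes the same settings the removed `~/.pip/pip.conf` edit used to set by hand):

```sh
# Point pip at the mirror permanently and trust it even over plain http.
pip config set global.index-url https://mirrors.aliyun.com/pypi/simple/
pip config set install.trusted-host mirrors.aliyun.com
```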
 ## Additional Resources
 
-- Get started with samples and demos: [Get Started Guide](../get_started.md)
-- Intel® Distribution of OpenVINO™ toolkit home page: [https://software.intel.com/en-us/openvino-toolkit](https://software.intel.com/en-us/openvino-toolkit)
-- Convert models for use with OpenVINO™: [Model Optimizer Developer Guide](../MO_DG/Deep_Learning_Model_Optimizer_DevGuide.md)
-- Write your own OpenVINO™ applications: [Inference Engine Developer Guide](../IE_DG/Deep_Learning_Inference_Engine_DevGuide.md)
-- Information on sample applications: [Inference Engine Samples Overview](../IE_DG/Samples_Overview.md)
-- Information on a supplied set of models: [Overview of OpenVINO™ Toolkit Pre-Trained Models](@ref omz_models_group_intel)
-- IoT libraries and code samples: [Intel® IoT Developer Kit](https://github.com/intel-iot-devkit)
-
-To learn more about converting models from specific frameworks, go to:
-
-- [Convert Your Caffe* Model](../MO_DG/prepare_model/convert_model/Convert_Model_From_Caffe.md)
-- [Convert Your TensorFlow* Model](../MO_DG/prepare_model/convert_model/Convert_Model_From_TensorFlow.md)
-- [Convert Your MXNet* Model](../MO_DG/prepare_model/convert_model/Convert_Model_From_MxNet.md)
-- [Convert Your Kaldi* Model](../MO_DG/prepare_model/convert_model/Convert_Model_From_Kaldi.md)
-- [Convert Your ONNX* Model](../MO_DG/prepare_model/convert_model/Convert_Model_From_ONNX.md)
+@sphinxdirective
+
+.. raw:: html
+
+   <div class="collapsible-section">
+
+@endsphinxdirective
+
+- Convert models for use with OpenVINO™: [Model Optimizer Developer Guide](../MO_DG/Deep_Learning_Model_Optimizer_DevGuide.md)
+- Write your own OpenVINO™ applications: [Inference Engine Developer Guide](../IE_DG/Deep_Learning_Inference_Engine_DevGuide.md)
+- Information on sample applications: [Inference Engine Samples Overview](../IE_DG/Samples_Overview.md)
+- Information on a supplied set of models: [Overview of OpenVINO™ Toolkit Pre-Trained Models](../model_zoo.md)
+- IoT libraries and code samples in the GitHub repository: [Intel® IoT Developer Kit](https://github.com/intel-iot-devkit)
+
+To learn more about converting models from specific frameworks, go to:
+
+- [Convert Your Caffe Model](../MO_DG/prepare_model/convert_model/Convert_Model_From_Caffe.md)
+- [Convert Your TensorFlow Model](../MO_DG/prepare_model/convert_model/Convert_Model_From_TensorFlow.md)
+- [Convert Your MXNet Model](../MO_DG/prepare_model/convert_model/Convert_Model_From_MxNet.md)
+- [Convert Your Kaldi Model](../MO_DG/prepare_model/convert_model/Convert_Model_From_Kaldi.md)
+- [Convert Your ONNX Model](../MO_DG/prepare_model/convert_model/Convert_Model_From_ONNX.md)
+
+@sphinxdirective
+
+.. raw:: html
+
+   </div>
+
+@endsphinxdirective
docs/install_guides/installing-openvino-windows-header.md (new file)
@@ -0,0 +1,19 @@
# Install and Configure Intel® Distribution of OpenVINO™ toolkit for Windows* {#openvino_docs_install_guides_installing_openvino_windows_header}

@sphinxdirective

.. toctree::
   :maxdepth: 2
   :hidden:

   Installer <openvino_docs_install_guides_installing_openvino_windows>
   Docker <openvino_docs_install_guides_installing_openvino_docker_windows>

@endsphinxdirective

If you want to install Intel® Distribution of OpenVINO™ toolkit on your Windows machine, there are a few ways to accomplish this. We prepared the following options for you:

* [Install Intel® Distribution of OpenVINO™ toolkit with an Installer](installing-openvino-windows.md)
* [Install Intel® Distribution of OpenVINO™ toolkit with Docker](installing-openvino-docker-windows.md)

Enjoy your journey with OpenVINO™.
docs/install_guides/installing-openvino-windows-ivad-vpu.md (new file)
@@ -0,0 +1,25 @@
# Configuration Guide for the Intel® Distribution of OpenVINO™ toolkit and the Intel® Vision Accelerator Design with Intel® Movidius™ VPUs on Windows* {#openvino_docs_install_guides_installing_openvino_windows_ivad_vpu}

@sphinxdirective

.. _vpu guide windows:

@endsphinxdirective

To enable inference on Intel® Vision Accelerator Design with Intel® Movidius™ VPUs, the following additional installation steps are required:

1. Download and install the <a href="https://www.microsoft.com/en-us/download/details.aspx?id=48145">Visual C++ Redistributable for Visual Studio 2017</a>.
2. Check with a support engineer whether your Intel® Vision Accelerator Design with Intel® Movidius™ VPUs card requires an SMBUS connection to the PCIe slot (this is unlikely). Install the SMBUS driver only if confirmed (by default, it is not required):
   1. Go to the `<INSTALL_DIR>\deployment_tools\inference-engine\external\hddl\drivers\SMBusDriver` directory, where `<INSTALL_DIR>` is the directory in which the Intel Distribution of OpenVINO™ toolkit is installed.
   2. Right-click the `hddlsmbus.inf` file and choose **Install** from the pop-up menu.

You are done installing your device driver and are ready to use your Intel® Vision Accelerator Design with Intel® Movidius™ VPUs.

See also:

* For advanced configuration steps for your IEI Mustang-V100-MX8 accelerator, see [Intel® Movidius™ VPUs Setup Guide for Use with Intel® Distribution of OpenVINO™ toolkit](movidius-setup-guide.md).

* After you've configured your Intel® Vision Accelerator Design with Intel® Movidius™ VPUs, see [Intel® Movidius™ VPUs Programming Guide for Use with Intel® Distribution of OpenVINO™ toolkit](movidius-programming-guide.md) to learn how to distribute a model across all 8 VPUs to maximize performance.

After configuration is done, you are ready to go to the <a href="#get-started">Start Using the Toolkit</a> section to learn the basic OpenVINO™ toolkit workflow and run code samples and demo applications.
@ -1,256 +1,158 @@
|
|||||||
# Install and Configure Intel® Distribution of OpenVINO™ toolkit for Windows* 10 {#openvino_docs_install_guides_installing_openvino_windows}
|
# Install and Configure Intel® Distribution of OpenVINO™ toolkit for Windows 10 {#openvino_docs_install_guides_installing_openvino_windows}
|
||||||
|
|
||||||
## Introduction
|
> **NOTE**: Since the OpenVINO™ 2022.1 release, the following development tools: Model Optimizer, Post-Training Optimization Tool, Model Downloader and other Open Model Zoo tools, Accuracy Checker, and Annotation Converter are not part of the installer. These tools are now only available on [pypi.org](https://pypi.org/project/openvino-dev/).
|
||||||
|
|
||||||

## System Requirements

@sphinxdirective

.. tab:: Operating Systems

   Microsoft Windows 10, 64-bit

.. tab:: Hardware

   Optimized for these processors:

   * 6th to 12th generation Intel® Core™ processors and Intel® Xeon® processors
   * 3rd generation Intel® Xeon® Scalable processor (formerly code named Cooper Lake)
   * Intel® Xeon® Scalable processor (formerly Skylake and Cascade Lake)
   * Intel Atom® processor with support for Intel® Streaming SIMD Extensions 4.1 (Intel® SSE4.1)
   * Intel Pentium® processor N4200/5, N3350/5, or N3450/5 with Intel® HD Graphics
   * Intel® Iris® Xe MAX Graphics
   * Intel® Neural Compute Stick 2
   * Intel® Vision Accelerator Design with Intel® Movidius™ VPUs

.. tab:: Processor Notes

   Processor graphics are not included in all processors.
   See `Product Specifications`_ for information about your processor.

   .. _Product Specifications: https://ark.intel.com/

.. tab:: Software

   * `Microsoft Visual Studio 2019 with MSBuild <http://visualstudio.microsoft.com/downloads/>`_
   * `CMake 3.14 or higher, 64-bit <https://cmake.org/download/>`_
   * `Python 3.6 - 3.9, 64-bit <https://www.python.org/downloads/windows/>`_

.. note::
   You can choose to download the Community edition of Visual Studio. Use the `Microsoft Visual Studio installation guide <https://docs.microsoft.com/en-us/visualstudio/install/install-visual-studio?view=vs-2019>`_ to walk through the installation. During installation, in the **Workloads** tab, choose **Desktop development with C++**.

.. note::
   For CMake, you can use either `cmake<version>.msi`, which is an installation wizard, or `cmake<version>.zip`, in which case you have to add the path to the `bin` folder to your environment variables manually.

.. important::
   As part of the Python installation, make sure you select the option **Add Python 3.x to PATH** to `add Python <https://docs.python.org/3/using/windows.html#installation-steps>`_ to your `PATH` environment variable.

@endsphinxdirective
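Once the software prerequisites are installed, you can quickly confirm that they are reachable from a newly opened Command Prompt. This is a convenience check, not an official installation step:

```sh
:: Both commands should print a version matching the requirements above.
python --version
cmake --version
```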

## Overview

This guide provides step-by-step instructions on how to install the Intel® Distribution of OpenVINO™ toolkit. Links are provided for each type of compatible hardware, including downloads, initialization, and configuration steps. The following steps will be covered:

1. <a href="#install-external-dependencies">Install External Software Dependencies</a>
|
1. <a href="#install-openvino">Install the Intel® Distribution of OpenVINO™ Toolkit</a>
|
||||||
2. <a href="#install-openvino">Install the Intel® Distribution of OpenVINO™ Toolkit</a>
|
2. <a href="#set-the-environment-variables">Configure the Environment</a>
|
||||||
3. <a href="#set-the-environment-variables">Configure the Environment</a>
|
3. <a href="#model-optimizer">Download additional components (Optional)</a>
|
||||||
4. <a href="#model-optimizer">Configure the Model Optimizer</a>
|
4. <a href="#optional-steps">Configure Inference on non-CPU Devices (Optional)</a>
|
||||||
5. <a href="#optional-steps">Configure Inference on non-CPU Devices (Optional):</a>
|
5. <a href="#get-started">What's next?</a>
|
||||||
- <a href="#install-gpu">Steps for Intel® Processor Graphics (GPU)</a>
|
|
||||||
- <a href="#hddl-myriad">Steps for Intel® Vision Accelerator Design with Intel® Movidius™ VPU</a><br>
|
|
||||||
After installing your Intel® Movidius™ VPU, you will return to this guide to complete OpenVINO™ installation.<br>
|
|
||||||
5. <a href="#get-started">Start Using the Toolkit</a>
|
|
||||||
|
|
||||||
- [Steps to uninstall the Intel® Distribution of OpenVINO™ Toolkit](../install_guides/uninstalling-openvino.md)
|
## <a name="install-openvino"></a>Step 1: Install the Intel® Distribution of OpenVINO™ toolkit Core Components
|
||||||
|
|
||||||
## <a name="install-external-dependencies"></a>Step 1: Install External Software Dependencies
|
|
||||||
|
|
||||||
Install these dependencies:
|
|
||||||
|
|
||||||
1. [Microsoft Visual Studio* 2019 with MSBuild](http://visualstudio.microsoft.com/downloads/)
|
|
||||||
> **NOTE**: You can choose to download Community version. Use [Microsoft Visual Studio installation guide](https://docs.microsoft.com/en-us/visualstudio/install/install-visual-studio?view=vs-2019) to walk you through the installation. During installation in the **Workloads** tab, choose **Desktop development with C++**.
|
|
||||||
|
|
||||||
2. [CMake 3.14 or higher 64-bit](https://cmake.org/download/)
|
|
||||||
> **NOTE**: You can either use `cmake<version>.msi` which is the installation wizard or `cmake<version>.zip` where you have to go into the `bin` folder and then manually add the path to environmental variables.
|
|
||||||
|
|
||||||
3. [Python **3.6** - **3.8** 64-bit](https://www.python.org/downloads/windows/)
|
|
||||||
|
|
||||||
> **IMPORTANT**: As part of this installation, make sure you click the option **Add Python 3.x to PATH** to [add Python](https://docs.python.org/3/using/windows.html#installation-steps) to your `PATH` environment variable.
|
|
||||||
|
|
||||||
## <a name="install-openvino"></a>Step 2: Install the Intel® Distribution of OpenVINO™ toolkit Core Components
|
|
||||||
|
|
||||||
1. Download the Intel® Distribution of OpenVINO™ toolkit package file from [Intel® Distribution of OpenVINO™ toolkit for Windows*](https://software.intel.com/en-us/openvino-toolkit/choose-download).
|
1. Download the Intel® Distribution of OpenVINO™ toolkit package file from [Intel® Distribution of OpenVINO™ toolkit for Windows*](https://software.intel.com/en-us/openvino-toolkit/choose-download).
|
||||||
Select the Intel® Distribution of OpenVINO™ toolkit for Windows* package from the dropdown menu.
|
Select the Intel® Distribution of OpenVINO™ toolkit for Windows package from the dropdown menu.
|
||||||
|
|
||||||
2. Go to the `Downloads` folder and double-click `w_openvino_toolkit_p_<version>.exe`. A window opens to let you choose your installation directory and components.
|
2. Go to the `Downloads` folder and double-click `w_openvino_toolkit_p_<version>.exe`. In the opened window, you can select the folder where installer files will be placed. The directory will be referred to as <INSTALL_DIR> elsewhere in the documentation. Once the files are extracted, you should see the following dialog box open up:
|
||||||

|
|
||||||
|
@sphinxdirective
|
||||||
|
|
||||||
3. Follow the instructions on your screen. Watch for informational messages such as the following in case you must complete additional steps:
|
.. image:: _static/images/openvino-install.png
|
||||||

|
:width: 400px
|
||||||
|
:align: center
|
||||||
|
|
||||||
|
@endsphinxdirective
|
||||||
|
|
||||||
|
3. Follow the instructions on your screen. During the installation you will be asked to accept the license agreement. The acceptance is required to continue. Check out the installation process in the image below:<br>
|
||||||
|

|
||||||
|
Click on the image to see the details.
|
||||||
|
|
||||||
4. By default, the Intel® Distribution of OpenVINO™ is installed to the following directory, referred to as `<INSTALL_DIR>` elsewhere in the documentation: `C:\Program Files (x86)\Intel\openvino_<version>`. For simplicity, a shortcut to the latest installation is also created: `C:\Program Files (x86)\Intel\openvino_2022`.
|
The core components are now installed. Continue to the next section to configure environment.
|
||||||
|
|
||||||
5. **Optional**: You can choose **Customize** to change the installation directory or the components you want to install.
|
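As a quick sanity check (not an official step), you can list the installation directory from a Command Prompt. The default path below is an assumption that holds only if you did not change the destination during installation:

```sh
:: Default destination for the 2022 releases; adjust if you chose another folder.
dir "C:\Program Files (x86)\Intel\openvino_2022"
```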
## <a name="set-the-environment-variables">Step 2: Configure the Environment
|
||||||
> **NOTE**: If there is an OpenVINO™ toolkit version previously installed on your system, the installer will use the same destination directory for next installations. If you want to install a newer version to a different directory, you need to uninstall the previously installed versions.
|
|
||||||
|
|
||||||
6. The **Finish** screen indicates that the core components have been installed:
|
|
||||||

|
|
||||||
|
|
||||||
7. Click **Finish** to close the installation wizard.
|
|
||||||
|
|
||||||
Once you click **Finish** to close the installation wizard, a new browser window opens with the document you’re reading now (in case you installed without it) and jumps to the section with the next installation steps.
|
|
||||||
|
|
||||||
The core components are now installed. Continue to the next section to install additional dependencies.
|
|
||||||
|
|
||||||
## <a name="set-the-environment-variables">Step 3: Configure the Environment
|
|
||||||
|
|
||||||
> **NOTE**: If you installed the Intel® Distribution of OpenVINO™ to a non-default install directory, replace `C:\Program Files (x86)\Intel` with that directory in this guide's instructions.
|
> **NOTE**: If you installed the Intel® Distribution of OpenVINO™ to a non-default install directory, replace `C:\Program Files (x86)\Intel` with that directory in this guide's instructions.
|
||||||
|
|
||||||
You must update several environment variables before you can compile and run OpenVINO™ applications. Open the Command Prompt, and run the `setupvars.bat` batch file to temporarily set your environment variables:

```sh
"<INSTALL_DIR>\openvino_2022\setupvars.bat"
```

**Optional**: OpenVINO™ toolkit environment variables are removed when you close the command prompt window. You can permanently set the environment variables manually.
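If you do want the variables to persist, a minimal sketch using the built-in `setx` command is shown below. The variable name is an assumption based on what `setupvars.bat` sets on a typical installation; run `setupvars.bat` first and check its output to see the full list of variables it configures:

```sh
:: Persist one variable for the current user; it takes effect only in newly
:: opened Command Prompt windows. INTEL_OPENVINO_DIR is assumed to match
:: the value printed by setupvars.bat on your machine.
setx INTEL_OPENVINO_DIR "C:\Program Files (x86)\Intel\openvino_2022"
```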

> **NOTE**: If you see an error indicating Python is not installed when you know you installed it, your computer might not be able to find the program. Check your system environment variables, and add Python if necessary.
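A quick way to see whether the shell can resolve Python at all (a troubleshooting aid, not a required step):

```sh
:: Prints every python.exe found on PATH; no output means PATH needs updating.
where python
```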

The environment variables are set. Next, you can download some additional tools.
## <a name="model-optimizer">Step 3 (Optional): Download additional components
|
||||||
|
|
||||||
## <a name="model-optimizer">Step 4: Configure the Model Optimizer
|
> **NOTE**: Since the OpenVINO™ 2022.1 release, the following development tools: Model Optimizer, Post-Training Optimization Tool, Model Downloader and other Open Model Zoo tools, Accuracy Checker, and Annotation Converter are not part of the installer. The OpenVINO™ Model Development Tools can only be installed via PyPI now. See [Install OpenVINO™ Model Development Tools](@ref installing_model_dev_tools) for detailed steps.
|
||||||
|
|
||||||
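For reference, a minimal sketch of installing the development tools from PyPI into a virtual environment follows. The framework extras in brackets are an assumption; pick the ones matching the frameworks you actually use:

```sh
:: Create and activate a clean virtual environment, then install the dev tools.
python -m venv openvino_env
openvino_env\Scripts\activate
python -m pip install --upgrade pip
:: Extras such as tensorflow2 and onnx pull in framework-specific dependencies.
pip install openvino-dev[tensorflow2,onnx]
```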

## <a name="optional-steps"></a>Step 4 (Optional): Configure Inference on non-CPU Devices

@sphinxdirective

.. tab:: GPU

   Only do this if you want to enable the toolkit components to use processor graphics (GPU) on your system. Follow the steps in :ref:`GPU Setup Guide <gpu guide windows>`.

.. tab:: VPU

   To install and configure your Intel® Vision Accelerator Design with Intel® Movidius™ VPUs, see the :ref:`VPUs Configuration Guide <vpu guide windows>`.

@endsphinxdirective

## <a name="get-started"></a>Step 5: What's next?
|
||||||
```sh
|
|
||||||
cd C:\Program Files (x86)\Intel\openvino_2022\tools\model_optimizer\install_prerequisites
|
|
||||||
```
|
|
||||||
|
|
||||||
3. Run this batch file to configure the Model Optimizer for Caffe, TensorFlow, MXNet, Kaldi\*, and ONNX:<br>
|
Now you are ready to try out the toolkit.
|
||||||
```sh
|
|
||||||
install_prerequisites.bat
|
|
||||||
```
|
|
||||||
|
|
||||||
3. **Optional:** You can choose to install Model Optimizer support for only certain frameworks. In the same directory are individual batch files for Caffe, TensorFlow, MXNet, Kaldi, and ONNX (`install_prerequisites_caffe.bat`, etc.). If you see error messages, make sure you installed all dependencies.
|
Developing in Python:
|
||||||
|
* [Start with tensorflow models with OpenVINO™](https://docs.openvino.ai/latest/notebooks/101-tensorflow-to-openvino-with-output.html)
|
||||||
The Model Optimizer is configured for one or more frameworks.
|
* [Start with ONNX and PyTorch models with OpenVINO™](https://docs.openvino.ai/latest/notebooks/102-pytorch-onnx-to-openvino-with-output.html)
|
||||||
|
* [Start with PaddlePaddle models with OpenVINO™](https://docs.openvino.ai/latest/notebooks/103-paddle-onnx-to-openvino-classification-with-output.html)
|
||||||

Developing in C++:
* [Image Classification Async C++ Sample](@ref openvino_inference_engine_samples_classification_sample_async_README)
* [Hello Classification C++ Sample](@ref openvino_inference_engine_samples_hello_classification_README)
* [Hello Reshape SSD C++ Sample](@ref openvino_inference_engine_samples_hello_reshape_ssd_README)
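Before diving into the samples, a one-line sanity check (an informal sketch, not an official step) can confirm that the runtime is importable and can see your inference devices. Run it in the same Command Prompt where `setupvars.bat` was executed, or where the PyPI package was installed:

```sh
:: Should print a list such as ['CPU'] or ['CPU', 'GPU'], depending on hardware.
python -c "from openvino.runtime import Core; print(Core().available_devices)"
```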
## <a name="uninstall"></a>Uninstall the Intel® Distribution of OpenVINO™ Toolkit
|
## <a name="uninstall"></a>Uninstall the Intel® Distribution of OpenVINO™ Toolkit
|
||||||
|
|
||||||
To uninstall the toolkit, follow the steps on the [Uninstalling](uninstalling-openvino.md) page.
|
To uninstall the toolkit, follow the steps on the [Uninstalling page](uninstalling-openvino.md).
|
||||||
|
|
||||||

## Additional Resources

@sphinxdirective

.. raw:: html

   <div class="collapsible-section">

@endsphinxdirective

- Get started with samples and demos: [Get Started Guide](../get_started.md)
- Intel® Distribution of OpenVINO™ toolkit home page: [https://software.intel.com/en-us/openvino-toolkit](https://software.intel.com/en-us/openvino-toolkit)
- Convert models for use with OpenVINO™: [Model Optimizer Developer Guide](../MO_DG/Deep_Learning_Model_Optimizer_DevGuide.md)
- Write your own OpenVINO™ applications: [Inference Engine Developer Guide](../IE_DG/Deep_Learning_Inference_Engine_DevGuide.md)
- Information on sample applications: [Inference Engine Samples Overview](../IE_DG/Samples_Overview.md)
- Information on a supplied set of models: [Overview of OpenVINO™ Toolkit Pre-Trained Models](../model_zoo.md)
- IoT libraries and code samples: [Intel® IoT Developer Kit](https://github.com/intel-iot-devkit)

To learn more about converting models from specific frameworks, go to:

- [Convert Your Caffe Model](../MO_DG/prepare_model/convert_model/Convert_Model_From_Caffe.md)
- [Convert Your TensorFlow Model](../MO_DG/prepare_model/convert_model/Convert_Model_From_TensorFlow.md)
- [Convert Your MXNet Model](../MO_DG/prepare_model/convert_model/Convert_Model_From_MxNet.md)
- [Convert Your Kaldi Model](../MO_DG/prepare_model/convert_model/Convert_Model_From_Kaldi.md)
- [Convert Your ONNX Model](../MO_DG/prepare_model/convert_model/Convert_Model_From_ONNX.md)

@sphinxdirective

.. raw:: html

   </div>

@endsphinxdirective
27
docs/install_guides/movidius-demos.md
Normal file
@ -0,0 +1,27 @@

# Intel® Movidius™ VPU Demos for Use with Intel® Distribution of OpenVINO™ toolkit {#openvino_docs_install_guides_movidius_demos}

@sphinxdirective

.. _vpu demos:

@endsphinxdirective

Once you have installed the Intel® Distribution of OpenVINO™ Toolkit and configured your Intel® Vision Accelerator Design with Intel® Movidius™ VPUs by following the [Intel® Vision Accelerator Design with Intel® Movidius™ VPUs Configuration Guide](installing-openvino-linux-ivad-vpu.md), you can run the demos:

1. Go to the **Inference Engine demo** directory:
```sh
cd <INSTALL_DIR>/intel/openvino_2022/deployment_tools/demo
```

2. Run the **Image Classification verification script**. If you have access to the Internet only through a proxy server, make sure it is configured in your OS environment (a sample proxy setup follows this step):
```sh
./demo_squeezenet_download_convert_run.sh -d HDDL
```
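A minimal sketch of such a proxy setup, in case the verification script cannot reach the Internet directly (the host and port are placeholders; substitute your organization's proxy):

```sh
# Placeholder proxy address; replace proxy.example.com:8080 with your actual proxy.
export http_proxy=http://proxy.example.com:8080
export https_proxy=http://proxy.example.com:8080
```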

3. Run the **Inference Pipeline verification script**:
```sh
./demo_security_barrier_camera.sh -d HDDL
```

You've completed all required configuration steps to perform inference on Intel® Vision Accelerator Design with Intel® Movidius™ VPUs.
Proceed to the <a href="#get-started">Start Using the Toolkit</a> section to learn the basic OpenVINO™ toolkit workflow and run code samples and demo applications.
22
docs/install_guides/ncs2-setup-guide.md
Normal file
@ -0,0 +1,22 @@

# Intel® Neural Compute Stick 2 (NCS2) Setup Guide for Use with Intel® Distribution of OpenVINO™ toolkit {#openvino_docs_install_guides_ncs2_setup_guide}

@sphinxdirective

.. _ncs guide:

@endsphinxdirective

Once you have the Intel® Distribution of OpenVINO™ toolkit installed, follow these steps to be able to work with the NCS2:

1. Go to the `install_dependencies` directory:
```sh
cd <INSTALL_DIR>/intel/openvino_2022/install_dependencies/
```

2. Run the `install_NCS_udev_rules.sh` script:
```sh
./install_NCS_udev_rules.sh
```

3. You may need to reboot your machine for the rules to take effect.
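After re-plugging the stick, you can check that the device is visible on the USB bus. This is a convenience check, not part of the official steps; `03e7` is assumed to be the Intel® Movidius™ USB vendor ID:

```sh
# Look for the Movidius device among the attached USB devices.
lsusb | grep -i "03e7"
```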

You've completed all required configuration steps to perform inference on Intel® Neural Compute Stick 2.
Proceed to the <a href="#get-started">Start Using the Toolkit</a> section to learn the basic OpenVINO™ toolkit workflow and run code samples and demo applications.

@ -2,6 +2,8 @@

@sphinxdirective

.. _notebook tutorials:

.. toctree::
   :maxdepth: 2
   :caption: Notebooks