diff --git a/docs/get_started.md b/docs/get_started.md index 2045fdea4c3..aa86fd8e9e8 100644 --- a/docs/get_started.md +++ b/docs/get_started.md @@ -6,7 +6,7 @@ :maxdepth: 1 :hidden: :caption: Install OpenVINO - + Overview Install OpenVINO Runtime Install OpenVINO Development Tools @@ -18,24 +18,17 @@ :maxdepth: 1 :hidden: :caption: Additional Configurations - + Configurations for GPU Configurations for NCS2 Configurations for VPU Configurations for GNA - -.. toctree:: - :maxdepth: 1 - :hidden: - :caption: Troubleshooting - - Troubleshooting Guide - + .. toctree:: :maxdepth: 1 :hidden: :caption: Get Started Guides - + Get Started with Step-by-step Demo Get Started with Tutorials @@ -46,35 +39,37 @@ openvino_docs_IE_DG_Samples_Overview - +.. toctree:: + :maxdepth: 1 + :hidden: + :caption: Troubleshooting + + Installation & Configuration Issues + @endsphinxdirective - + @sphinxdirective .. raw:: html - + - +

To get started with OpenVINO, the first step is to install it. You can get an overview of the available installation options and start from there.

- +

If you already know which installation type suits your needs, you can choose one of the options below:
Install OpenVINO Runtime
Install OpenVINO Development Tools
Build from source
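Of these options, installation from PyPI is usually the quickest to try. A minimal sketch, assuming Python 3 is on your `PATH` (the environment name `openvino_env` follows the install guides; the install commands are left commented out so you can review them before running):

```shell
# Create and activate an isolated environment (Linux/macOS shown;
# on Windows use "python -m venv" and "openvino_env\Scripts\activate").
python3 -m venv openvino_env
. openvino_env/bin/activate
# Inside the environment (network access required):
# python -m pip install --upgrade pip
# pip install openvino              # OpenVINO Runtime only
# pip install "openvino-dev[onnx]"  # Runtime plus Development Tools
ls openvino_env/bin/activate        # confirms the environment was created
```

Working in a virtual environment keeps OpenVINO's dependencies from conflicting with globally installed packages, which is why the install guides recommend it.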

-
- -

If you are using Intel® Processor Graphics, Intel® Vision Accelerator Design with Intel® Movidius™ VPUs or Intel® Neural Compute Stick 2, please check the additional configurations for them accordingly: Configurations for GPU, Configurations for VPU or Configurations for NCS2. +

+ +

If you are using Intel® Processor Graphics, Intel® Vision Accelerator Design with Intel® Movidius™ VPUs, Intel® Neural Compute Stick 2, or Intel® Gaussian & Neural Accelerator (GNA), see the corresponding configuration guide: Configurations for GPU, Configurations for VPU, Configurations for NCS2, or Configurations for GNA.

- +

With OpenVINO installed, you are ready to run your first inference and learn the workflow.
Here is a set of hands-on demonstrations of various complexity levels to guide you through the process: from performing sample inference with just one command, to running code samples, demo applications, or Jupyter notebooks. If you prefer working with a GUI, you can also get started with the DL Workbench application. This way you can choose the level that is right for you.

- +

Choose how you want to progress:

- +
diff --git a/docs/install_guides/installing-model-dev-tools.md b/docs/install_guides/installing-model-dev-tools.md index 352b8b5483c..77300869886 100644 --- a/docs/install_guides/installing-model-dev-tools.md +++ b/docs/install_guides/installing-model-dev-tools.md @@ -10,6 +10,88 @@ If you want to download, convert, optimize and tune pre-trained deep learning mo > **NOTE**: From the 2022.1 release, the OpenVINO™ Development Tools can only be installed via PyPI. +## For Python Developers + +If you are a Python developer, you can find the main steps below to install OpenVINO Development Tools. For more details, see . + +While installing OpenVINO Development Tools, OpenVINO Runtime will also be installed as a dependency, so you don't need to install OpenVINO Runtime separately. + +### Step 1. Set Up Python Virtual Environment + +To avoid dependency conflicts, use a virtual environment. Skip this step only if you do want to install all dependencies globally. + +Use the following command to create a virtual environment: + +@sphinxdirective + +.. tab:: Linux and macOS + + .. code-block:: sh + + python3 -m venv openvino_env + +.. tab:: Windows + + .. code-block:: sh + + python -m venv openvino_env + + +@endsphinxdirective + + +### Step 2. Activate Virtual Environment + +@sphinxdirective + +.. tab:: Linux and macOS + + .. code-block:: sh + + source openvino_env/bin/activate + +.. tab:: Windows + + .. code-block:: sh + + openvino_env\Scripts\activate + + +@endsphinxdirective + + +### Step 3. Set Up and Update PIP to the Highest Version + +Use the following command: +```sh +python -m pip install --upgrade pip +``` + +### Step 4. 
Install the Package + +To install and configure the components of the development package for working with specific frameworks, use the following command: +``` +pip install openvino-dev[extras] +``` +where the `extras` parameter specifies one or more deep learning frameworks via these values: `caffe`, `kaldi`, `mxnet`, `onnx`, `pytorch`, `tensorflow`, `tensorflow2`. Make sure that you install the corresponding frameworks for your models. + +For example, to install and configure the components for working with TensorFlow 2.x and ONNX, use the following command: +``` +pip install openvino-dev[tensorflow2,onnx] +``` + +> **NOTE**: For TensorFlow, use the `tensorflow2` value as much as possible. The `tensorflow` value is provided only for compatibility reasons. + + +### Step 5. Verify the Installation + +To verify if the package is properly installed, run the command below (this may take a few seconds): +```sh +mo -h +``` +You will see the help message for Model Optimizer if installation finished successfully. + + ## For C++ Developers Note the following things: @@ -42,27 +124,24 @@ where the EXTRAS parameter specifies one or more deep learning frameworks via th If you have installed OpenVINO Runtime via the installer, to avoid version conflicts, specify your version in the command. For example: ``` -pip install openvino-dev[tensorflow2,mxnet,caffe]==2022.1 +pip install openvino-dev[tensorflow2,onnx]==2022.1 ``` > **NOTE**: For TensorFlow, use the `tensorflow2` value as much as possible. The `tensorflow` value is provided only for compatibility reasons. For more details, see . - -## For Python Developers +## What's Next? -You can use the following command to install the latest package version available in the index: -``` -pip install openvino-dev[EXTRAS] -``` -where the EXTRAS parameter specifies one or more deep learning frameworks via these values: `caffe`, `kaldi`, `mxnet`, `onnx`, `pytorch`, `tensorflow`, `tensorflow2`. 
Make sure that you install the corresponding frameworks for your models. +Now you may continue with the following tasks: -For example, to install and configure the components for working with TensorFlow 2.x, MXNet and Caffe, use the following command: -``` -pip install openvino-dev[tensorflow2,mxnet,caffe] -``` +* To convert models for use with OpenVINO, see [Model Optimizer Developer Guide](../MO_DG/Deep_Learning_Model_Optimizer_DevGuide.md). +* See pre-trained deep learning models in our [Open Model Zoo](../model_zoo.md). +* Try out OpenVINO via [OpenVINO Notebooks](https://docs.openvino.ai/latest/notebooks/notebooks.html). +* To write your own OpenVINO™ applications, see [OpenVINO Runtime User Guide](../OV_Runtime_UG/openvino_intro.md). +* See sample applications in [OpenVINO™ Toolkit Samples Overview](../OV_Runtime_UG/Samples_Overview.md). -> **NOTE**: For TensorFlow, use the `tensorflow2` value as much as possible. The `tensorflow` value is provided only for compatibility reasons. +## Additional Resources -For more details, see . +- Intel® Distribution of OpenVINO™ toolkit home page: +- For IoT Libraries & Code Samples, see [Intel® IoT Developer Kit](https://github.com/intel-iot-devkit). diff --git a/docs/install_guides/installing-openvino-apt.md b/docs/install_guides/installing-openvino-apt.md index 5c8d1dcca52..452979dc0e4 100644 --- a/docs/install_guides/installing-openvino-apt.md +++ b/docs/install_guides/installing-openvino-apt.md @@ -32,16 +32,22 @@ The complete list of supported hardware is available in the [Release Notes](http > **NOTE**: You might need to install GnuPG: `sudo apt-get install gnupg` 2. Add the repository via the following command: + @sphinxdirective + + .. tab:: Ubuntu 18 + + .. code-block:: sh + + echo "deb https://apt.repos.intel.com/openvino/2022 bionic main" | sudo tee /etc/apt/sources.list.d/intel-openvino-2022.list + + .. tab:: Ubuntu 20 + + .. 
code-block:: sh + + echo "deb https://apt.repos.intel.com/openvino/2022 focal main" | sudo tee /etc/apt/sources.list.d/intel-openvino-2022.list + + @endsphinxdirective - * On Ubuntu 18 - ```sh - echo "deb https://apt.repos.intel.com/openvino/2022 bionic main" | sudo tee /etc/apt/sources.list.d/intel-openvino-2022.list - ``` - - * On Ubuntu 20 - ```sh - echo "deb https://apt.repos.intel.com/openvino/2022 focal main" | sudo tee /etc/apt/sources.list.d/intel-openvino-2022.list - ``` 3. Update the list of packages via the update command: ```sh @@ -106,7 +112,7 @@ sudo apt autoremove openvino-.. ### Step 3 (Optional): Install OpenCV from APT -OpenCV is necessary to run C++ demos from Open Model Zoo. Some C++ samples and demos also use OpenCV as a dependency. OpenVINO provides a package to install OpenCV from APT: +OpenCV is necessary to run C++ demos from Open Model Zoo. Some OpenVINO samples can also extend their capabilities when compiled with OpenCV as a dependency. OpenVINO provides a package to install OpenCV from APT: #### To Install the Latest Version of OpenCV @@ -126,19 +132,47 @@ sudo apt install openvino-opencv-.. After you have installed OpenVINO Runtime, if you decided to [install OpenVINO Development Tools](installing-model-dev-tools.md), make sure that you install external software dependencies first. -Refer to Install External Software Dependencies for detailed steps. +Refer to Install External Software Dependencies for detailed steps. -## Configurations for Non-CPU Devices +### Step 5 (Optional): Configure Inference on Non-CPU Devices -If you are using Intel® Processor Graphics, Intel® Vision Accelerator Design with Intel® Movidius™ VPUs or Intel® Neural Compute Stick 2, please follow the configuration steps in [Configurations for GPU](configurations-for-intel-gpu.md), [Configurations for VPU](installing-openvino-config-ivad-vpu.md) or [Configurations for NCS2](configurations-for-ncs2.md) accordingly. +@sphinxdirective +.. 
tab:: GNA + + To enable the toolkit components to use Intel® Gaussian & Neural Accelerator (GNA) on your system, follow the steps in :ref:`GNA Setup Guide `. + +.. tab:: GPU + + To enable the toolkit components to use processor graphics (GPU) on your system, follow the steps in :ref:`GPU Setup Guide `. + +.. tab:: NCS 2 + + To perform inference on Intel® Neural Compute Stick 2 powered by the Intel® Movidius™ Myriad™ X VPU, follow the steps on :ref:`NCS2 Setup Guide `. + + +.. tab:: VPU + + To install and configure your Intel® Vision Accelerator Design with Intel® Movidius™ VPUs, see the :ref:`VPU Configuration Guide `. + After configuration is done, you are ready to run the verification scripts with the HDDL Plugin for your Intel® Vision Accelerator Design with Intel® Movidius™ VPUs. + + .. warning:: + While working with either HDDL or NCS, choose one of them as they cannot run simultaneously on the same machine. + +@endsphinxdirective + +## What's Next? + +Now you may continue with the following tasks: + +* To convert models for use with OpenVINO, see [Model Optimizer Developer Guide](../MO_DG/Deep_Learning_Model_Optimizer_DevGuide.md). +* See pre-trained deep learning models in our [Open Model Zoo](../model_zoo.md). +* Try out OpenVINO via [OpenVINO Notebooks](https://docs.openvino.ai/latest/notebooks/notebooks.html). +* To write your own OpenVINO™ applications, see [OpenVINO Runtime User Guide](../OV_Runtime_UG/openvino_intro.md). +* See sample applications in [OpenVINO™ Toolkit Samples Overview](../OV_Runtime_UG/Samples_Overview.md). ## Additional Resources - Intel® Distribution of OpenVINO™ toolkit home page: . -- OpenVINO™ toolkit online documentation: . -- [Model Optimizer Developer Guide](../MO_DG/Deep_Learning_Model_Optimizer_DevGuide.md). -- [OpenVINO Runtime User Guide](../OV_Runtime_UG/OpenVINO_Runtime_User_Guide). -- For more information on Sample Applications, see the [OpenVINO Samples Overview](../OV_Runtime_UG/Samples_Overview.md). 
- For IoT Libraries & Code Samples see the [Intel® IoT Developer Kit](https://github.com/intel-iot-devkit). diff --git a/docs/install_guides/installing-openvino-conda.md b/docs/install_guides/installing-openvino-conda.md index 91cbfdd7bc0..fe0d5299afa 100644 --- a/docs/install_guides/installing-openvino-conda.md +++ b/docs/install_guides/installing-openvino-conda.md @@ -57,11 +57,18 @@ This guide provides installation steps for Intel® Distribution of OpenVINO™ t Now you can start developing your application. +## What's Next? + +Now you may continue with the following tasks: + +* To convert models for use with OpenVINO, see [Model Optimizer Developer Guide](../MO_DG/Deep_Learning_Model_Optimizer_DevGuide.md). +* See pre-trained deep learning models in our [Open Model Zoo](../model_zoo.md). +* Try out OpenVINO via [OpenVINO Notebooks](https://docs.openvino.ai/latest/notebooks/notebooks.html). +* To write your own OpenVINO™ applications, see [OpenVINO Runtime User Guide](../OV_Runtime_UG/openvino_intro.md). +* See sample applications in [OpenVINO™ Toolkit Samples Overview](../OV_Runtime_UG/Samples_Overview.md). + ## Additional Resources - Intel® Distribution of OpenVINO™ toolkit home page: . -- OpenVINO™ toolkit online documentation: . -- [Model Optimizer Developer Guide](../MO_DG/Deep_Learning_Model_Optimizer_DevGuide.md). -- [OpenVINO Runtime User Guide](../OV_Runtime_UG/OpenVINO_Runtime_User_Guide). -- For more information on Sample Applications, see the [OpenVINO Samples Overview](../OV_Runtime_UG/Samples_Overview.md). +- For IoT Libraries & Code Samples see the [Intel® IoT Developer Kit](https://github.com/intel-iot-devkit). 
- Intel® Distribution of OpenVINO™ toolkit Anaconda home page: [https://anaconda.org/intel/openvino-ie4py](https://anaconda.org/intel/openvino-ie4py) diff --git a/docs/install_guides/installing-openvino-docker-linux.md b/docs/install_guides/installing-openvino-docker-linux.md index f09d3f0fdbe..a5421507fbe 100644 --- a/docs/install_guides/installing-openvino-docker-linux.md +++ b/docs/install_guides/installing-openvino-docker-linux.md @@ -26,7 +26,7 @@ This guide provides steps on creating a Docker image with Intel® Distribution o To launch a Linux image on WSL2 when trying to run inferences on a GPU, make sure that the following requirements are met: - Only Windows 10 with 21H2 update or above installed and Windows 11 are supported. - - Intel GPU driver on Windows host with version 30.0.100.9684 or above need be installed. Please see [this article](https://www.intel.com/content/www/us/en/artificial-intelligence/harness-the-power-of-intel-igpu-on-your-machine.html#articleparagraph_983312434) for more details. + - Intel GPU driver on Windows host with version 30.0.100.9684 or above need be installed. Please see :ref:`this article ` for more details. - From 2022.1 release, the Docker images contain preinstalled recommended version of OpenCL Runtime with WSL2 support. 
@endsphinxdirective diff --git a/docs/install_guides/installing-openvino-linux.md b/docs/install_guides/installing-openvino-linux.md index 105158855a6..77c4462c93c 100644 --- a/docs/install_guides/installing-openvino-linux.md +++ b/docs/install_guides/installing-openvino-linux.md @@ -17,7 +17,7 @@ Optimized for these processors: - * 6th to 12th generation Intel® Core™ processors and Intel® Xeon® processors + * 6th to 12th generation Intel® Core™ processors and Intel® Xeon® processors * 3rd generation Intel® Xeon® Scalable processor (formerly code named Cooper Lake) * Intel® Xeon® Scalable processor (formerly Skylake and Cascade Lake) * Intel Atom® processor with support for Intel® Streaming SIMD Extensions 4.1 (Intel® SSE4.1) @@ -28,9 +28,9 @@ .. tab:: Processor Notes - Processor graphics are not included in all processors. + Processor graphics are not included in all processors. See `Product Specifications`_ for information about your processor. - + .. _Product Specifications: https://ark.intel.com/ @endsphinxdirective @@ -74,11 +74,11 @@ This guide provides step-by-step instructions on how to install the Intel® Dist
You should see the following dialog box open up: @sphinxdirective - + .. image:: _static/images/openvino-install.png :width: 400px :align: center - + @endsphinxdirective Otherwise, you can add parameters `-a` for additional arguments and `--cli` to run installation in command line (CLI): @@ -86,7 +86,7 @@ This guide provides step-by-step instructions on how to install the Intel® Dist ./l_openvino_toolkit_p_.sh -a --cli ``` > **NOTE**: To get additional information on all parameters that can be used, use the help option: `--help`. Among others, you can find there `-s` option which offers silent mode, which together with `--eula approve` allows you to run whole installation with default values without any user inference. - + 6. Follow the instructions on your screen. During the installation you will be asked to accept the license agreement. Your acceptance is required to continue. Check the installation process on the image below:
![](../img/openvino-install-linux-run-boostrapper-script.gif) @@ -114,7 +114,7 @@ This script enables you to install Linux platform development tools and componen ```sh sudo -E ./install_openvino_dependencies.sh ``` - + Once the dependencies are installed, continue to the next section to set your environment variables. ## Step 3: Configure the Environment @@ -123,7 +123,7 @@ You must update several environment variables before you can compile and run Ope ```sh source /setupvars.sh -``` +``` If you have more than one OpenVINO™ version on your machine, you can easily switch its version by sourcing `setupvars.sh` of your choice. @@ -139,7 +139,7 @@ The environment variables are set. Next, you can download some additional tools. .. dropdown:: OpenCV - OpenCV is necessary to run demos from Open Model Zoo (OMZ). Some OpenVINO samples and demos also use OpenCV as a dependency. The Intel® Distribution of OpenVINO™ provides a script to install OpenCV: ``/extras/scripts/download_opencv.sh``. + OpenCV is necessary to run demos from Open Model Zoo (OMZ). Some OpenVINO samples can also extend their capabilities when compiled with OpenCV as a dependency. The Intel® Distribution of OpenVINO™ provides a script to install OpenCV: ``/extras/scripts/download_opencv.sh``. .. note:: Make sure you have 2 prerequisites installed: ``curl`` and ``tar``. @@ -153,8 +153,8 @@ The environment variables are set. Next, you can download some additional tools. @sphinxdirective .. tab:: GNA - Only if you want to enable the toolkit components to use Intel® Gaussian & Neural Accelerator (GNA) on your system, follow the steps in :ref:`GNA Setup Guide `. - + To enable the toolkit components to use Intel® Gaussian & Neural Accelerator (GNA) on your system, follow the steps in :ref:`GNA Setup Guide `. + .. tab:: GPU To enable the toolkit components to use processor graphics (GPU) on your system, follow the steps in :ref:`GPU Setup Guide `. @@ -167,7 +167,7 @@ The environment variables are set. 
Next, you can download some additional tools. .. tab:: VPU To install and configure your Intel® Vision Accelerator Design with Intel® Movidius™ VPUs, see the :ref:`VPU Configuration Guide `. - After configuration is done, you are ready to run the verification scripts with the HDDL Plugin for your Intel® Vision Accelerator Design with Intel® Movidius™ VPUs. + After configuration is done, you are ready to run the verification scripts with the HDDL Plugin for your Intel® Vision Accelerator Design with Intel® Movidius™ VPUs. .. warning:: While working with either HDDL or NCS, choose one of them as they cannot run simultaneously on the same machine. @@ -188,7 +188,7 @@ Developing in C++: * [Hello Classification C++ Sample](@ref openvino_inference_engine_samples_hello_classification_README) * [Hello Reshape SSD C++ Sample](@ref openvino_inference_engine_samples_hello_reshape_ssd_README) -## Uninstall the Intel® Distribution of OpenVINO™ Toolkit +## Uninstalling the Intel® Distribution of OpenVINO™ Toolkit To uninstall the toolkit, follow the steps on the [Uninstalling page](uninstalling-openvino.md). @@ -197,15 +197,15 @@ To uninstall the toolkit, follow the steps on the [Uninstalling page](uninstalli .. dropdown:: Troubleshooting PRC developers might encounter pip errors during Intel® Distribution of OpenVINO™ installation. To resolve the issues, try one of the following options: - - * Add the download source using the ``-i`` parameter with the Python ``pip`` command. For example: + + * Add the download source using the ``-i`` parameter with the Python ``pip`` command. For example: .. code-block:: sh pip install openvino-dev -i https://mirrors.aliyun.com/pypi/simple/ Use the ``--trusted-host`` parameter if the URL above is ``http`` instead of ``https``. - + * If you run into incompatibility issues between components after installing new Intel® Distribution of OpenVINO™ version, try running ``requirements.txt`` with the following command: .. 
code-block:: sh @@ -217,21 +217,21 @@ To uninstall the toolkit, follow the steps on the [Uninstalling page](uninstalli @sphinxdirective .. dropdown:: Additional Resources + + * Converting models for use with OpenVINO™: :ref:`Model Optimizer Developer Guide ` + * Writing your own OpenVINO™ applications: :ref:`OpenVINO™ Runtime User Guide ` + * Sample applications: :ref:`OpenVINO™ Toolkit Samples Overview ` + * Pre-trained deep learning models: :ref:`Overview of OpenVINO™ Toolkit Pre-Trained Models ` + * IoT libraries and code samples in the GitHUB repository: `Intel® IoT Developer Kit`_ - * Convert models for use with OpenVINO™: :ref:`Model Optimizer Developer Guide ` - * Write your own OpenVINO™ applications: :ref:`OpenVINO™ Runtime User Guide ` - * Information on sample applications: :ref:`OpenVINO™ Toolkit Samples Overview ` - * Information on a supplied set of models: :ref:`Overview of OpenVINO™ Toolkit Pre-Trained Models ` - * IoT libraries and code samples in the GitHUB repository: `Intel® IoT Developer Kit`_ - - To learn more about converting models from specific frameworks, go to: - - * :ref:`Convert Your Caffe Model ` - * :ref:`Convert Your TensorFlow Model ` - * :ref:`Convert Your MXNet Model ` - * :ref:`Convert Your Kaldi Model ` - * :ref:`Convert Your ONNX Model ` - + .. _Intel® IoT Developer Kit: https://github.com/intel-iot-devkit @endsphinxdirective diff --git a/docs/install_guides/installing-openvino-macos.md b/docs/install_guides/installing-openvino-macos.md index 3f765e6f58f..383e56524b3 100644 --- a/docs/install_guides/installing-openvino-macos.md +++ b/docs/install_guides/installing-openvino-macos.md @@ -96,7 +96,7 @@ The environment variables are set. Continue to the next section if you want to d .. dropdown:: OpenCV - OpenCV is necessary to run demos from Open Model Zoo (OMZ). Some OpenVINO samples and demos also use OpenCV as a dependency. 
The Intel® Distribution of OpenVINO™ provides a script to install OpenCV: ``/extras/scripts/download_opencv.sh``. + OpenCV is necessary to run demos from Open Model Zoo (OMZ). Some OpenVINO samples can also extend their capabilities when compiled with OpenCV as a dependency. The Intel® Distribution of OpenVINO™ provides a script to install OpenCV: ``/extras/scripts/download_opencv.sh``. .. note:: Make sure you have 2 prerequisites installed: ``curl`` and ``tar``. @@ -142,20 +142,20 @@ To uninstall the toolkit, follow the steps on the [Uninstalling page](uninstalli .. dropdown:: Additional Resources - * Convert models for use with OpenVINO™: :ref:`Model Optimizer Developer Guide ` - * Write your own OpenVINO™ applications: :ref:`OpenVINO™ Runtime User Guide ` - * Information on sample applications: :ref:`OpenVINO™ Toolkit Samples Overview ` - * Information on a supplied set of models: :ref:`Overview of OpenVINO™ Toolkit Pre-Trained Models ` + * Converting models for use with OpenVINO™: :ref:`Model Optimizer Developer Guide ` + * Writing your own OpenVINO™ applications: :ref:`OpenVINO™ Runtime User Guide ` + * Sample applications: :ref:`OpenVINO™ Toolkit Samples Overview ` + * Pre-trained deep learning models: :ref:`Overview of OpenVINO™ Toolkit Pre-Trained Models ` * IoT libraries and code samples in the GitHUB repository: `Intel® IoT Developer Kit`_ - - To learn more about converting models from specific frameworks, go to: - - * :ref:`Convert Your Caffe Model ` - * :ref:`Convert Your TensorFlow Model ` - * :ref:`Convert Your MXNet Model ` - * :ref:`Convert Your Kaldi Model ` - * :ref:`Convert Your ONNX Model ` - + + .. 
_Intel® IoT Developer Kit: https://github.com/intel-iot-devkit @endsphinxdirective diff --git a/docs/install_guides/installing-openvino-pip.md b/docs/install_guides/installing-openvino-pip.md index 7b1b80201c1..ec1634d9a56 100644 --- a/docs/install_guides/installing-openvino-pip.md +++ b/docs/install_guides/installing-openvino-pip.md @@ -1,40 +1,97 @@ # Install Intel® Distribution of OpenVINO™ Toolkit from PyPI Repository {#openvino_docs_install_guides_installing_openvino_pip} -You can install Intel® Distribution of OpenVINO™ toolkit through the PyPI repository, including both OpenVINO™ Runtime and OpenVINO™ Development Tools. +You can install both OpenVINO™ Runtime and OpenVINO Development Tools through the PyPI repository. This page provides the main steps for installing OpenVINO Runtime. + +> **NOTE**: From the 2022.1 release, the OpenVINO™ Development Tools can only be installed via PyPI. See [Install OpenVINO Development Tools](installing-model-dev-tools.md) for detailed steps. ## Installing OpenVINO Runtime -The OpenVINO Runtime contains a set of libraries for an easy inference integration into your applications and supports heterogeneous execution across Intel® CPU and Intel® GPU hardware. To install OpenVINO Runtime, use the following command: +For system requirements and troubleshooting, see . + +### Step 1. Set Up Python Virtual Environment + +To avoid dependency conflicts, use a virtual environment. Skip this step only if you do want to install all dependencies globally. + +Use the following command to create a virtual environment: + +@sphinxdirective + +.. tab:: Linux and macOS + + .. code-block:: sh + + python3 -m venv openvino_env + +.. tab:: Windows + + .. code-block:: sh + + python -m venv openvino_env + + +@endsphinxdirective + +### Step 2. Activate Virtual Environment + +@sphinxdirective + +.. tab:: On Linux and macOS + + .. code-block:: sh + + source openvino_env/bin/activate + +.. tab:: On Windows + + .. 
code-block:: sh + + openvino_env\Scripts\activate + + +@endsphinxdirective + +### Step 3. Set Up and Update PIP to the Highest Version + +Use the following command: +```sh +python -m pip install --upgrade pip +``` + +### Step 4. Install the Package + +Use the following command: ``` pip install openvino ``` -For system requirements and more detailed steps, see . +### Step 5. Verify that the Package Is Installed +Run the command below: +```sh +python -c "from openvino.runtime import Core" +``` + +If installation was successful, you will not see any error messages (no console output). ## Installing OpenVINO Development Tools -OpenVINO Development Tools include Model Optimizer, Benchmark Tool, Accuracy Checker, Post-Training Optimization Tool and Open Model Zoo tools including Model Downloader. While installing OpenVINO Development Tools, OpenVINO Runtime will also be installed as a dependency, so you don't need to install OpenVINO Runtime separately. +OpenVINO Development Tools include Model Optimizer, Benchmark Tool, Accuracy Checker, Post-Training Optimization Tool and Open Model Zoo tools including Model Downloader. If you want to install OpenVINO Development Tools, OpenVINO Runtime will also be installed as a dependency, so you don't need to install OpenVINO Runtime separately. -Use the following command to install OpenVINO Development Tools: -``` -pip install openvino-dev[EXTRAS] -``` -where the EXTRAS parameter specifies one or more deep learning frameworks via these values: `caffe`, `kaldi`, `mxnet`, `onnx`, `pytorch`, `tensorflow`, `tensorflow2`. Make sure that you install the corresponding frameworks for your models. +See [Install OpenVINO™ Development Tools](installing-model-dev-tools.md) for detailed steps. 
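As a quick illustration of the `openvino-dev[EXTRAS]` pattern used throughout these guides, the pip requirement spec can be composed like this (the shell variables are my own naming; the extras values and the `==2022.1` pin are the ones shown in the examples in these guides):

```shell
# Compose a pip requirement spec for OpenVINO Development Tools.
# EXTRAS: comma-separated framework values (caffe, kaldi, mxnet, onnx,
# pytorch, tensorflow, tensorflow2); VERSION: optional release pin used
# to match an installer-based OpenVINO Runtime.
EXTRAS="tensorflow2,onnx"
VERSION="2022.1"
SPEC="openvino-dev[${EXTRAS}]==${VERSION}"
echo "$SPEC"    # -> openvino-dev[tensorflow2,onnx]==2022.1
# pip install "$SPEC"   # run inside your activated virtual environment
```

Note that quoting the spec matters in shells such as `zsh`, where unquoted square brackets are treated as glob characters.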
-For example, to install and configure the components for working with TensorFlow 2.x, MXNet and Caffe, use the following command: -``` -pip install openvino-dev[tensorflow2,mxnet,caffe] -``` -> **NOTE**: For TensorFlow, use the `tensorflow2` value as much as possible. The `tensorflow` value is provided only for compatibility reasons. - -For system requirements and more detailed steps, see . +## What's Next? + +Now you may continue with the following tasks: + +* To convert models for use with OpenVINO, see [Model Optimizer Developer Guide](../MO_DG/Deep_Learning_Model_Optimizer_DevGuide.md). +* See pre-trained deep learning models in our [Open Model Zoo](../model_zoo.md). +* Try out OpenVINO via [OpenVINO Notebooks](https://docs.openvino.ai/latest/notebooks/notebooks.html). +* To write your own OpenVINO™ applications, see [OpenVINO Runtime User Guide](../OV_Runtime_UG/openvino_intro.md). +* See sample applications in [OpenVINO™ Toolkit Samples Overview](../OV_Runtime_UG/Samples_Overview.md). ## Additional Resources -- [Intel® Distribution of OpenVINO™ toolkit](https://software.intel.com/en-us/openvino-toolkit) -- [Model Optimizer Developer Guide](../MO_DG/Deep_Learning_Model_Optimizer_DevGuide.md) -- [OpenVINO™ Runtime User Guide](../OV_Runtime_UG/openvino_intro.md) -- [OpenVINO Samples Overview](../OV_Runtime_UG/Samples_Overview.md) +- Intel® Distribution of OpenVINO™ toolkit home page: +- For IoT Libraries & Code Samples, see [Intel® IoT Developer Kit](https://github.com/intel-iot-devkit). 
diff --git a/docs/install_guides/installing-openvino-windows.md b/docs/install_guides/installing-openvino-windows.md index 0b0ccb43427..0aa6244de66 100644 --- a/docs/install_guides/installing-openvino-windows.md +++ b/docs/install_guides/installing-openvino-windows.md @@ -13,7 +13,7 @@ Optimized for these processors: - * 6th to 12th generation Intel® Core™ processors and Intel® Xeon® processors + * 6th to 12th generation Intel® Core™ processors and Intel® Xeon® processors * 3rd generation Intel® Xeon® Scalable processor (formerly code named Cooper Lake) * Intel® Xeon® Scalable processor (formerly Skylake and Cascade Lake) * Intel Atom® processor with support for Intel® Streaming SIMD Extensions 4.1 (Intel® SSE4.1) @@ -21,12 +21,12 @@ * Intel® Iris® Xe MAX Graphics * Intel® Neural Compute Stick 2 * Intel® Vision Accelerator Design with Intel® Movidius™ VPUs - + .. tab:: Processor Notes - Processor graphics are not included in all processors. + Processor graphics are not included in all processors. See `Product Specifications`_ for information about your processor. - + .. _Product Specifications: https://ark.intel.com/ .. tab:: Software @@ -34,13 +34,13 @@ * `Microsoft Visual Studio 2019 with MSBuild `_ * `CMake 3.14 or higher, 64-bit `_ * `Python 3.6 - 3.9, 64-bit `_ - + .. note:: You can choose to download Community version. Use `Microsoft Visual Studio installation guide `_ to walk you through the installation. During installation in the **Workloads** tab, choose **Desktop development with C++**. .. note:: You can either use `cmake.msi` which is the installation wizard or `cmake.zip` where you have to go into the `bin` folder and then manually add the path to environmental variables. - + .. important:: As part of this installation, make sure you click the option **Add Python 3.x to PATH** to `add Python `_ to your `PATH` environment variable. @@ -53,24 +53,24 @@ This guide provides step-by-step instructions on how to install the Intel® Dist 1. 
Install the Intel® Distribution of OpenVINO™ Toolkit 2. Configure the Environment 3. Download additional components (Optional) -4. Configure Inference on non-CPU Devices (Optional) +4. Configure Inference on non-CPU Devices (Optional) 5. What's next? ## Step 1: Install the Intel® Distribution of OpenVINO™ toolkit Core Components 1. Download the Intel® Distribution of OpenVINO™ toolkit package file from [Intel® Distribution of OpenVINO™ toolkit for Windows](https://software.intel.com/en-us/openvino-toolkit/choose-download). Select the Intel® Distribution of OpenVINO™ toolkit for Windows package from the dropdown menu. - + 2. Go to the `Downloads` folder and double-click `w_openvino_toolkit_p_.exe`. In the opened window, you can select the folder where installer files will be placed. The directory will be referred to as elsewhere in the documentation. Once the files are extracted, you should see the following dialog box open up: @sphinxdirective - + .. image:: _static/images/openvino-install.png :width: 400px :align: center - + @endsphinxdirective - + 3. Follow the instructions on your screen. During the installation you will be asked to accept the license agreement. Your acceptance is required to continue. Check out the installation process in the image below:
![](../img/openvino-install-win-run-boostrapper-script.gif) Click on the image to see the details. @@ -107,30 +107,30 @@ The environment variables are set. Next, you can download some additional tools. .. dropdown:: OpenCV - OpenCV is necessary to run demos from Open Model Zoo (OMZ). Some OpenVINO samples and demos also use OpenCV as a dependency. The Intel® Distribution of OpenVINO™ provides a script to install OpenCV: ``/extras/scripts/download_opencv.sh``. + OpenCV is necessary to run demos from Open Model Zoo (OMZ). Some OpenVINO samples can also extend their capabilities when compiled with OpenCV as a dependency. The Intel® Distribution of OpenVINO™ provides a script to install OpenCV: ``/extras/scripts/download_opencv.ps1``. .. note:: No prerequisites are needed. - + There are three ways to run the script: - + * GUI: right-click the script and select ``Run with PowerShell``. - + * Command prompt (CMD) console: - + .. code-block:: sh - + powershell \extras\scripts\download_opencv.ps1 - - + + * PowerShell console: - + .. code-block:: sh + + .\\scripts\download_opencv.ps1 + - .\\scripts\download_opencv.ps1 - - - If the Intel® Distribution of OpenVINO™ is installed to the system location (e.g. ``Program Files (x86)``) then privilege elevation dialog will be shown. The script can be run from CMD/PowerShell Administrator console to avoid this dialog in case of system-wide installation. + If the Intel® Distribution of OpenVINO™ is installed to the system location (e.g. ``Program Files (x86)``), then a privilege elevation dialog will be shown. The script can be run from a CMD/PowerShell Administrator console to avoid this dialog in case of system-wide installation. The script is interactive by default, so during execution it will wait for the user to press ``Enter``. If you want to avoid this, use the ``-batch`` option, e.g. ``powershell \extras\scripts\download_opencv.ps1 -batch``. After the execution of the script, you will find OpenCV extracted to ``/extras/opencv``.
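To illustrate the non-interactive invocation described above, here is a minimal sketch of scripting the ``-batch`` call; the install root used below is a hypothetical placeholder, so substitute your actual installation directory:

```shell
# Hypothetical install root -- substitute your actual installation directory.
INSTALL_DIR="C:/Program Files (x86)/Intel/openvino_2022"

# The -batch flag suppresses the interactive "press Enter" prompt,
# letting the download run unattended (e.g. from a provisioning script).
CMD="powershell \"$INSTALL_DIR/extras/scripts/download_opencv.ps1\" -batch"
echo "$CMD"
```

For a system-wide installation, run the resulting command from an Administrator console to avoid the privilege elevation dialog.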
@endsphinxdirective @@ -140,8 +140,8 @@ The environment variables are set. Next, you can download some additional tools. @sphinxdirective .. tab:: GNA - Only if you want to enable the toolkit components to use Intel® Gaussian & Neural Accelerator (GNA) on your system, follow the steps in :ref:`GNA Setup Guide `. - + To enable the toolkit components to use Intel® Gaussian & Neural Accelerator (GNA) on your system, follow the steps in :ref:`GNA Setup Guide `. + .. tab:: GPU To enable the toolkit components to use processor graphics (GPU) on your system, follow the steps in :ref:`GPU Setup Guide `. @@ -149,6 +149,11 @@ The environment variables are set. Next, you can download some additional tools. .. tab:: VPU To install and configure your Intel® Vision Accelerator Design with Intel® Movidius™ VPUs, see the :ref:`VPU Configuration Guide `. + +.. tab:: NCS 2 + + No additional configurations are needed. + @endsphinxdirective @@ -165,29 +170,29 @@ Developing in C++: * [Image Classification Async C++ Sample](@ref openvino_inference_engine_samples_classification_sample_async_README) * [Hello Classification C++ Sample](@ref openvino_inference_engine_samples_hello_classification_README) * [Hello Reshape SSD C++ Sample](@ref openvino_inference_engine_samples_hello_reshape_ssd_README) - -## Uninstall the Intel® Distribution of OpenVINO™ Toolkit + +## Uninstalling the Intel® Distribution of OpenVINO™ Toolkit To uninstall the toolkit, follow the steps on the [Uninstalling page](uninstalling-openvino.md). @sphinxdirective .. 
dropdown:: Additional Resources - - * Convert models for use with OpenVINO™: :ref:`Model Optimizer Developer Guide ` - * Write your own OpenVINO™ applications: :ref:`OpenVINO™ Runtime User Guide ` - * Information on sample applications: :ref:`OpenVINO™ Toolkit Samples Overview ` - * Information on a supplied set of models: :ref:`Overview of OpenVINO™ Toolkit Pre-Trained Models ` - * IoT libraries and code samples in the GitHUB repository: `Intel® IoT Developer Kit`_ - - To learn more about converting models from specific frameworks, go to: - - * :ref:`Convert Your Caffe Model ` - * :ref:`Convert Your TensorFlow Model ` - * :ref:`Convert Your MXNet Model ` - * :ref:`Convert Your Kaldi Model ` - * :ref:`Convert Your ONNX Model ` - + + * Converting models for use with OpenVINO™: :ref:`Model Optimizer Developer Guide ` + * Writing your own OpenVINO™ applications: :ref:`OpenVINO™ Runtime User Guide ` + * Sample applications: :ref:`OpenVINO™ Toolkit Samples Overview ` + * Pre-trained deep learning models: :ref:`Overview of OpenVINO™ Toolkit Pre-Trained Models ` + * IoT libraries and code samples in the GitHub repository: `Intel® IoT Developer Kit`_ + + .. _Intel® IoT Developer Kit: https://github.com/intel-iot-devkit @endsphinxdirective diff --git a/docs/install_guides/installing-openvino-yum.md b/docs/install_guides/installing-openvino-yum.md index 35f7211b3ae..4e0204a0616 100644 --- a/docs/install_guides/installing-openvino-yum.md +++ b/docs/install_guides/installing-openvino-yum.md @@ -96,7 +96,7 @@ sudo yum autoremove openvino-.. ### Step 3 (Optional): Install OpenCV from YUM -OpenCV is necessary to run C++ demos from Open Model Zoo. Some C++ samples and demos also use OpenCV as a dependency. OpenVINO provides a package to install OpenCV from YUM: +OpenCV is necessary to run C++ demos from Open Model Zoo. Some OpenVINO samples can also extend their capabilities when compiled with OpenCV as a dependency.
OpenVINO provides a package to install OpenCV from YUM: #### To Install the Latest Version of OpenCV @@ -116,18 +116,47 @@ sudo yum install openvino-opencv-.. After you have installed OpenVINO Runtime, if you decided to [install OpenVINO Model Development Tools](installing-model-dev-tools.md), make sure that you install external software dependencies first. -Refer to Install External Software Dependencies for detailed steps. +Refer to Install External Software Dependencies for detailed steps. -## Configurations for Non-CPU Devices +### Step 5 (Optional): Configure Inference on Non-CPU Devices -If you are using Intel® Processor Graphics, Intel® Vision Accelerator Design with Intel® Movidius™ VPUs or Intel® Neural Compute Stick 2, please follow the configuration steps in [Configurations for GPU](configurations-for-intel-gpu.md), [Configurations for VPU](installing-openvino-config-ivad-vpu.md) or [Configurations for NCS2](configurations-for-ncs2.md) accordingly. +@sphinxdirective +.. tab:: GNA + + To enable the toolkit components to use Intel® Gaussian & Neural Accelerator (GNA) on your system, follow the steps in :ref:`GNA Setup Guide `. + +.. tab:: GPU + + To enable the toolkit components to use processor graphics (GPU) on your system, follow the steps in :ref:`GPU Setup Guide `. + +.. tab:: NCS 2 + + To perform inference on Intel® Neural Compute Stick 2 powered by the Intel® Movidius™ Myriad™ X VPU, follow the steps in :ref:`NCS2 Setup Guide `. + + +.. tab:: VPU + + To install and configure your Intel® Vision Accelerator Design with Intel® Movidius™ VPUs, see the :ref:`VPU Configuration Guide `. + After configuration is done, you are ready to run the verification scripts with the HDDL Plugin for your Intel® Vision Accelerator Design with Intel® Movidius™ VPUs. + + .. warning:: + While working with either HDDL or NCS, choose only one of them, as they cannot run simultaneously on the same machine. + +@endsphinxdirective + + +## What's Next?
+ +Now you may continue with the following tasks: + +* To convert models for use with OpenVINO, see [Model Optimizer Developer Guide](../MO_DG/Deep_Learning_Model_Optimizer_DevGuide.md). +* See pre-trained deep learning models in our [Open Model Zoo](../model_zoo.md). +* Try out OpenVINO via [OpenVINO Notebooks](https://docs.openvino.ai/latest/notebooks/notebooks.html). +* To write your own OpenVINO™ applications, see [OpenVINO Runtime User Guide](../OV_Runtime_UG/openvino_intro.md). +* See sample applications in [OpenVINO™ Toolkit Samples Overview](../OV_Runtime_UG/Samples_Overview.md). ## Additional Resources -- Intel® Distribution of OpenVINO™ toolkit home page: . -- OpenVINO™ toolkit online documentation: . -- [Model Optimizer Developer Guide](../MO_DG/Deep_Learning_Model_Optimizer_DevGuide.md). -- [OpenVINO Runtime User Guide](../OV_Runtime_UG/OpenVINO_Runtime_User_Guide). -- For more information on Sample Applications, see the [OpenVINO Samples Overview](../OV_Runtime_UG/Samples_Overview.md). -- For IoT Libraries & Code Samples see the [Intel® IoT Developer Kit](https://github.com/intel-iot-devkit). +- Intel® Distribution of OpenVINO™ toolkit home page: +- For IoT Libraries & Code Samples, see [Intel® IoT Developer Kit](https://github.com/intel-iot-devkit). diff --git a/docs/install_guides/troubleshooting.md b/docs/install_guides/troubleshooting.md index a651ef101fb..caf9225b60c 100644 --- a/docs/install_guides/troubleshooting.md +++ b/docs/install_guides/troubleshooting.md @@ -1,4 +1,4 @@ -# Troubleshooting {#openvino_docs_get_started_guide_troubleshooting} +# Troubleshooting Issues with OpenVINO™ Installation & Configuration {#openvino_docs_get_started_guide_troubleshooting}
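As a first troubleshooting step for installation and configuration issues, it helps to confirm that the environment script was actually sourced in the current shell. A minimal sketch, assuming a POSIX shell and relying on the ``INTEL_OPENVINO_DIR`` variable that ``setupvars.sh`` exports (verify the variable name against your release):

```shell
# INTEL_OPENVINO_DIR is exported by setupvars.sh; an empty value usually
# means the script was not sourced in this shell session.
if [ -z "${INTEL_OPENVINO_DIR:-}" ]; then
  STATUS="not configured - source setupvars.sh in this shell first"
else
  STATUS="configured at $INTEL_OPENVINO_DIR"
fi
echo "OpenVINO environment: $STATUS"
```

Because environment variables set by ``setupvars`` do not persist across shell sessions, re-running a check like this after opening a new terminal often explains sudden "library not found" errors.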