This reverts commit b390f384b6.

@@ -4,7 +4,7 @@
The IEI Mustang-V100-MX8 is an OEM version of the Intel® Vision Accelerator Design with Intel® Movidius™ VPUs.

This guide assumes you have installed the [Mustang-V100-MX8](https://download.ieiworld.com/) and the Intel® Distribution of OpenVINO™ toolkit.
Instructions in this guide for configuring your accelerator include:

1. Installing the required IEI BSL reset software
@@ -1,5 +1,6 @@

# Configurations for Intel® Gaussian & Neural Accelerator (GNA) with Intel® Distribution of OpenVINO™ toolkit {#openvino_docs_install_guides_configurations_for_intel_gna}

This page introduces additional configurations for Intel® Gaussian & Neural Accelerator (GNA) with the Intel® Distribution of OpenVINO™ toolkit on Linux and Windows.

> **NOTE**: On platforms where Intel® GNA is not enabled in the BIOS, the driver cannot be installed, so the GNA plugin uses the software emulation mode only.
@@ -1,4 +1,4 @@

# Configurations for Intel® Processor Graphics (GPU) with Intel® Distribution of OpenVINO™ toolkit {#openvino_docs_install_guides_configurations_for_intel_gpu}

@sphinxdirective

@@ -8,6 +8,7 @@

@endsphinxdirective

This page introduces additional configurations for Intel® Processor Graphics (GPU) with the Intel® Distribution of OpenVINO™ toolkit on Linux and Windows.

## Linux
@@ -23,10 +24,10 @@ If you have installed OpenVINO Runtime via the installer, APT, YUM, or archive f

```sh
sudo -E ./install_NEO_OCL_driver.sh
```

> **NOTE**: To use the **Intel® Iris® Xe MAX Graphics**, see the [Intel® Iris® Xe MAX Graphics with Linux*](https://dgpu-docs.intel.com/devices/iris-xe-max-graphics/index.html) page for driver installation instructions.
The script compares the driver version on the system to the current version. If the driver version on the system is higher than or equal to the current version, the script does not install a new driver. If the version of the driver is lower than the current version, the script uninstalls the lower version and installs the current version with your permission:

![](../img/NEO_check_agreement.png)
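The version check the script performs can be sketched in plain shell. This is a hypothetical standalone illustration using `sort -V`, not the script's actual code; the version strings are examples:

```sh
# Hypothetical sketch of the check described above: decide whether a new
# driver needs to be installed by comparing version strings with sort -V.
installed="19.41.14441"   # example: version currently on the system
required="20.35.17767"    # example: version the script ships
newest=$(printf '%s\n%s\n' "$installed" "$required" | sort -V | tail -n 1)
if [ "$newest" = "$installed" ]; then
  echo "driver is up to date: nothing to install"
else
  echo "driver is older: uninstall $installed, install $required"
fi
```

With an installed version equal to or newer than the required one, the first branch is taken, matching the "does not install a new driver" behavior described above.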
Higher hardware versions require a higher driver version, namely 20.35 instead of 19.41. If the script fails to uninstall the driver, uninstall it manually. During the script execution, you may see the following command line output:
```sh

@@ -37,7 +38,7 @@ If you have installed OpenVINO Runtime via the installer, APT, YUM, or archive f
3. **Optional:** Install header files to allow compilation of new code. You can find the header files at [Khronos OpenCL™ API Headers](https://github.com/KhronosGroup/OpenCL-Headers.git).

You've completed all required configuration steps to perform inference on processor graphics.
Proceed to the <a href="openvino_docs_install_guides_installing_openvino_linux.html#get-started">Start Using the Toolkit</a> section to learn the basic OpenVINO™ toolkit workflow and run code samples and demo applications.
@sphinxdirective

@@ -51,7 +52,7 @@ Proceed to the <a href="openvino_docs_install_guides_installing_openvino_linux.h

This section will help you check whether you require driver installation. Install the indicated version or higher.

If your applications offload computation to **Intel® Integrated Graphics**, you must have the Intel Graphics Driver for Windows installed on your hardware.
[Download and install the recommended version](https://downloadcenter.intel.com/download/30079/Intel-Graphics-Windows-10-DCH-Drivers).

To check if you have this driver installed:
@@ -62,20 +63,8 @@ To check if you have this driver installed:

3. Right-click the adapter name and select **Properties**.

4. Click the **Driver** tab to see the driver version.

![](../img/DeviceDriverVersion.PNG)

You are done updating your device driver and are ready to use your GPU. Proceed to the <a href="openvino_docs_install_guides_installing_openvino_windows.html#get-started">Start Using the Toolkit</a> section to learn the basic OpenVINO™ toolkit workflow and run code samples and demo applications.
@@ -9,7 +9,7 @@

:hidden:

IEI Mustang-V100-MX8-R10 Card <openvino_docs_install_guides_movidius_setup_guide>

@endsphinxdirective
@@ -21,7 +21,7 @@ For troubleshooting issues, please see the [Troubleshooting Guide](troubleshooti

For Intel® Vision Accelerator Design with Intel® Movidius™ VPUs, the following additional installation steps are required.

> **NOTE**: If you installed the Intel® Distribution of OpenVINO™ toolkit to a non-default install directory, replace `/opt/intel` with the directory in which you installed the software.
1. Set the environment variables:
```sh
source /opt/intel/openvino_2022/setupvars.sh
```

@@ -33,18 +33,18 @@ source /opt/intel/openvino_2022/setupvars.sh

```sh
${HDDL_INSTALL_DIR}/install_IVAD_VPU_dependencies.sh
```
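Once `setupvars.sh` has been sourced, a quick sanity check can confirm the environment is ready before running the dependency script. `HDDL_INSTALL_DIR` is the variable the command above relies on; treat the exact check as an illustrative sketch:

```sh
# Illustrative sketch: verify that sourcing setupvars.sh exported the
# variable the later commands rely on, and warn early if it did not.
if [ -n "${HDDL_INSTALL_DIR:-}" ]; then
  echo "environment configured: HDDL_INSTALL_DIR=${HDDL_INSTALL_DIR}"
else
  echo "HDDL_INSTALL_DIR is not set: source setupvars.sh first"
fi
```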
> **NOTE**: If the Linux kernel is updated after the installation, you must install the drivers again:
```sh
cd ${HDDL_INSTALL_DIR}/drivers
sudo ./setup.sh install
```
Now the dependencies are installed and you are ready to use the Intel® Vision Accelerator Design with Intel® Movidius™ VPUs with the Intel® Distribution of OpenVINO™ toolkit.
### Optional Steps

For advanced configuration steps for your **IEI Mustang-V100-MX8-R10** accelerator, see [Configurations for IEI Mustang-V100-MX8-R10 card](configurations-for-iei-card.md). The **IEI Mustang-V100-MX8-R11** accelerator doesn't require any additional steps.

@sphinxdirective
@@ -59,23 +59,11 @@ To enable inference on Intel® Vision Accelerator Design with Intel® Movidius

1. Download and install the <a href="https://www.microsoft.com/en-us/download/details.aspx?id=48145">Visual C++ Redistributable for Visual Studio 2017</a>.
2. Check with a support engineer whether your Intel® Vision Accelerator Design with Intel® Movidius™ VPUs card requires an SMBUS connection to the PCIe slot (most unlikely). Install the SMBUS driver only if confirmed (by default, it is not required):
   1. Go to the `<INSTALL_DIR>\runtime\3rdparty\hddl\drivers\SMBusDriver` directory, where `<INSTALL_DIR>` is the directory in which the Intel® Distribution of OpenVINO™ toolkit is installed.
   2. Right-click the `hddlsmbus.inf` file and choose **Install** from the pop-up menu.
You are done installing your device driver and are ready to use your Intel® Vision Accelerator Design with Intel® Movidius™ VPUs.

For advanced configuration steps for your IEI Mustang-V100-MX8 accelerator, see [Configurations for IEI Mustang-V100-MX8-R10 card](configurations-for-iei-card.md).
After configuration is done, you are ready to go to the <a href="openvino_docs_install_guides_installing_openvino_windows.html#get-started">Start Using the Toolkit</a> section to learn the basic OpenVINO™ toolkit workflow and run code samples and demo applications.
@@ -6,10 +6,9 @@

@endsphinxdirective

## Linux

Once you have the Intel® Distribution of OpenVINO™ toolkit installed, follow these steps to be able to work with the NCS2:
1. Go to the `install_dependencies` directory:
```sh

@@ -21,8 +20,8 @@ Once you have OpenVINO™ Runtime installed, follow these steps to be able to wo

```
3. You may need to reboot your machine for this to take effect.
You've completed all required configuration steps to perform inference on the Intel® Neural Compute Stick 2.
Proceed to the [Get Started Guide](@ref get_started) section to learn the basic OpenVINO™ toolkit workflow and run code samples and demo applications.

@sphinxdirective
@@ -47,7 +46,7 @@ Proceed to the [Get Started Guide](@ref get_started) section to learn the basic

```
4. Plug in your Intel® Neural Compute Stick 2.

5. (Optional) If you want to compile and run the Image Classification sample to verify the OpenVINO™ toolkit installation, follow the next steps.

   a. Navigate to a directory that you have write access to and create a samples build directory. This example uses a directory named `build`:
```sh
@@ -66,7 +65,7 @@ Proceed to the [Get Started Guide](@ref get_started) section to learn the basic

```sh
cd open_model_zoo/tools/model_tools
python3 -m pip install --upgrade pip
python3 -m pip install -r requirements.in
python3 downloader.py --name squeezenet1.1
```
d. Run the sample specifying the model, a path to the input image, and the VPU required to run with the Raspbian OS:
```sh
@@ -92,17 +91,4 @@ brew install libusb

```

You've completed all required configuration steps to perform inference on your Intel® Neural Compute Stick 2.
Proceed to the <a href="openvino_docs_install_guides_installing_openvino_macos.html#get-started">Start Using the Toolkit</a> section to learn the basic OpenVINO™ toolkit workflow and run code samples and demo applications.
@@ -16,9 +16,10 @@

@endsphinxdirective

After you have installed the Intel® Distribution of OpenVINO™ toolkit, you may also need to do some additional configuration for your device to work with OpenVINO. See the following pages:

* [Configurations for GPU](configurations-for-intel-gpu.md)
* [Configurations for NCS2](configurations-for-ncs2.md)
* [Configurations for VPU](configurations-for-ivad-vpu.md)
* [Configurations for GNA](configurations-for-intel-gna.md)
@@ -1,7 +1,186 @@

# Install Intel® Distribution of OpenVINO™ Toolkit for Linux Using APT Repository {#openvino_docs_install_guides_installing_openvino_apt}

This guide provides detailed steps for installing OpenVINO™ Runtime through the APT repository and guidelines for installing OpenVINO Development Tools.

> **NOTE**: From the 2022.1 release, OpenVINO™ Development Tools can be installed via PyPI only. See [Install OpenVINO Development Tools](#installing-openvino-development-tools) for more information.

The other installation methods are temporarily unavailable.

> **IMPORTANT**: By downloading and using this container and the included software, you agree to the terms and conditions of the [software license agreements](https://software.intel.com/content/dam/develop/external/us/en/documents/intel-openvino-license-agreements.pdf). Please review the content inside the `<INSTALL_DIR>/licensing` folder for more details.
## System Requirements

The complete list of supported hardware is available in the [Release Notes](https://software.intel.com/content/www/us/en/develop/articles/openvino-relnotes.html).

**Operating Systems**

- Ubuntu 18.04 long-term support (LTS), 64-bit
- Ubuntu 20.04 long-term support (LTS), 64-bit

**Software**

- [CMake 3.13 or higher, 64-bit](https://cmake.org/download/)
- GCC 7.5.0 (for Ubuntu 18.04) or GCC 9.3.0 (for Ubuntu 20.04)
- [Python 3.6 - 3.9, 64-bit](https://www.python.org/downloads/windows/)
## Installing OpenVINO Runtime

### Step 1: Set Up the OpenVINO Toolkit APT Repository

1. Install the GPG key for the repository:

   a. Download the [GPG-PUB-KEY-INTEL-SW-PRODUCTS.PUB](https://apt.repos.intel.com/intel-gpg-keys/GPG-PUB-KEY-INTEL-SW-PRODUCTS.PUB) key. You can also use the following command:
   ```sh
   wget https://apt.repos.intel.com/intel-gpg-keys/GPG-PUB-KEY-INTEL-SW-PRODUCTS.PUB
   ```
   b. Add this key to the system keyring:
   ```sh
   sudo apt-key add GPG-PUB-KEY-INTEL-SW-PRODUCTS.PUB
   ```
   > **NOTE**: You might need to install GnuPG first: `sudo apt-get install gnupg`
2. Add the repository via the following command:
@sphinxdirective

.. tab:: Ubuntu 18

   .. code-block:: sh

      echo "deb https://apt.repos.intel.com/openvino/2022 bionic main" | sudo tee /etc/apt/sources.list.d/intel-openvino-2022.list

.. tab:: Ubuntu 20

   .. code-block:: sh

      echo "deb https://apt.repos.intel.com/openvino/2022 focal main" | sudo tee /etc/apt/sources.list.d/intel-openvino-2022.list

@endsphinxdirective
3. Update the list of packages via the update command:
```sh
sudo apt update
```

4. Verify that the APT repository is properly set up. Run the apt-cache command to see a list of all available OpenVINO packages and components:
```sh
apt-cache search openvino
```
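Since `apt-cache search openvino` can return many entries, piping the output through `grep` narrows the list. The sketch below runs the filter over hypothetical sample lines so it works even without the repository configured:

```sh
# Illustrative filter: in practice you would pipe `apt-cache search openvino`
# into grep; the printf lines stand in for real repository output here.
printf '%s\n' \
  'openvino - Intel(R) Distribution of OpenVINO toolkit' \
  'openvino-opencv - OpenCV packaged for OpenVINO' \
  'libfoo - unrelated package' |
grep '^openvino'
```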
### Step 2: Install OpenVINO Runtime Using the APT Package Manager

OpenVINO will be installed in: `/opt/intel/openvino_<VERSION>.<UPDATE>.<PATCH>`

A symlink will be created: `/opt/intel/openvino_<VERSION>`
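The versioned-directory-plus-symlink layout described above can be simulated in a scratch directory (purely illustrative; it does not touch `/opt/intel`, and the version number is an example):

```sh
# Illustrative sketch of the install layout: a versioned directory and an
# unversioned symlink pointing at it, built in a temporary directory.
root=$(mktemp -d)
mkdir -p "$root/openvino_2022.1.0"
ln -s "$root/openvino_2022.1.0" "$root/openvino_2022"
readlink "$root/openvino_2022"   # resolves to the versioned directory
rm -r "$root"
```

The symlink is what lets scripts reference the install location without hard-coding the patch version.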
#### To Install the Latest Version

Run the following command:
```sh
sudo apt install openvino
```

#### To Install a Specific Version
1. Get a list of OpenVINO packages available for installation:
```sh
sudo apt-cache search openvino
```
2. Install a specific version of an OpenVINO package:
```sh
sudo apt install openvino-<VERSION>.<UPDATE>.<PATCH>
```
For example:
```sh
sudo apt install openvino-2022.1.0
```
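The placeholders in the package name map onto a version/update/patch triplet; a small sketch of how the name in the example above is assembled (the values are examples):

```sh
# Illustrative: compose the APT package name from the version triplet
# used in the example above.
VERSION=2022; UPDATE=1; PATCH=0
pkg="openvino-${VERSION}.${UPDATE}.${PATCH}"
echo "$pkg"   # openvino-2022.1.0
```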
#### To Check for Installed Packages and Versions

Run the following command:
```sh
apt list --installed | grep openvino
```

#### To Uninstall the Latest Version

Run the following command:
```sh
sudo apt autoremove openvino
```

#### To Uninstall a Specific Version

Run the following command:
```sh
sudo apt autoremove openvino-<VERSION>.<UPDATE>.<PATCH>
```
### Step 3 (Optional): Install OpenCV from APT

OpenCV is necessary to run C++ demos from Open Model Zoo. Some OpenVINO samples can also extend their capabilities when compiled with OpenCV as a dependency. OpenVINO provides a package to install OpenCV from APT:

#### To Install the Latest Version of OpenCV

Run the following command:
```sh
sudo apt install openvino-opencv
```

#### To Install a Specific Version of OpenCV

Run the following command:
```sh
sudo apt install openvino-opencv-<VERSION>.<UPDATE>.<PATCH>
```

### Step 4 (Optional): Configure Inference on Non-CPU Devices

@sphinxdirective
.. tab:: GNA

   To enable the toolkit components to use Intel® Gaussian & Neural Accelerator (GNA) on your system, follow the steps in :ref:`GNA Setup Guide <gna guide>`.

.. tab:: GPU

   To enable the toolkit components to use processor graphics (GPU) on your system, follow the steps in :ref:`GPU Setup Guide <gpu guide>`.

.. tab:: NCS 2

   To perform inference on Intel® Neural Compute Stick 2 powered by the Intel® Movidius™ Myriad™ X VPU, follow the steps in :ref:`NCS2 Setup Guide <ncs guide>`.

.. tab:: VPU

   To install and configure your Intel® Vision Accelerator Design with Intel® Movidius™ VPUs, see the :ref:`VPU Configuration Guide <vpu guide>`.
   After configuration is done, you are ready to run the verification scripts with the HDDL Plugin for your Intel® Vision Accelerator Design with Intel® Movidius™ VPUs.

.. warning::
   While working with either HDDL or NCS, choose one of them as they cannot run simultaneously on the same machine.

@endsphinxdirective
## Installing OpenVINO Development Tools

> **NOTE**: From the 2022.1 release, the OpenVINO™ Development Tools can be installed via PyPI only.

To install OpenVINO Development Tools, follow these steps:
1. [Install OpenVINO Runtime](#installing-openvino-runtime) if you haven't done so yet.
2. <a href="openvino_docs_install_guides_installing_openvino_linux.html#install-external-dependencies">Install External Software Dependencies</a>.
3. See the **For C++ Developers** section in [Install OpenVINO Development Tools](installing-model-dev-tools.md) for detailed steps.
## What's Next?

Now you may continue with the following tasks:

* To convert models for use with OpenVINO, see [Model Optimizer Developer Guide](../MO_DG/Deep_Learning_Model_Optimizer_DevGuide.md).
* See pre-trained deep learning models in our [Open Model Zoo](../model_zoo.md).
* Try out OpenVINO via [OpenVINO Notebooks](https://docs.openvino.ai/latest/notebooks/notebooks.html).
* To write your own OpenVINO™ applications, see [OpenVINO Runtime User Guide](../OV_Runtime_UG/openvino_intro.md).
* See sample applications in [OpenVINO™ Toolkit Samples Overview](../OV_Runtime_UG/Samples_Overview.md).

## Additional Resources

- Intel® Distribution of OpenVINO™ toolkit home page: <https://software.intel.com/en-us/openvino-toolkit>.
- For IoT Libraries & Code Samples, see the [Intel® IoT Developer Kit](https://github.com/intel-iot-devkit).
@@ -1,5 +1,74 @@

# Install Intel® Distribution of OpenVINO™ toolkit from Anaconda Cloud {#openvino_docs_install_guides_installing_openvino_conda}

This guide provides installation steps for the Intel® Distribution of OpenVINO™ toolkit for Linux distributed through Anaconda Cloud.

> **NOTE**: From the 2022.1 release, the OpenVINO™ Development Tools can only be installed via PyPI. If you want to develop or optimize your models with OpenVINO, see [Install OpenVINO Development Tools](installing-model-dev-tools.md) for detailed steps.

The other installation methods are temporarily unavailable.

## System Requirements
**Software**

- [Anaconda distribution](https://www.anaconda.com/products/individual/)

**Operating Systems**

| Supported Operating System                   | [Python Version (64-bit)](https://www.python.org/) |
| :------------------------------------------- | :------------------------------------------------- |
| Ubuntu 18.04 long-term support (LTS), 64-bit | 3.6, 3.7, 3.8, 3.9                                 |
| Ubuntu 20.04 long-term support (LTS), 64-bit | 3.6, 3.7, 3.8, 3.9                                 |
| Red Hat Enterprise Linux 8, 64-bit           | 3.6, 3.7, 3.8, 3.9                                 |
| macOS 10.15                                  | 3.6, 3.7, 3.8, 3.9                                 |
| Windows 10, 64-bit                           | 3.6, 3.7, 3.8, 3.9                                 |
## Install OpenVINO Runtime Using the Anaconda Package Manager

1. Set up the Anaconda environment (using Python 3.7 as an example):
```sh
conda create --name py37 python=3.7
conda activate py37
```
2. Update the Anaconda environment to the latest version:
```sh
conda update --all
```
3. Install the Intel® Distribution of OpenVINO™ toolkit:
   - Ubuntu* 20.04
   ```sh
   conda install openvino-ie4py-ubuntu20 -c intel
   ```
   - Ubuntu* 18.04
   ```sh
   conda install openvino-ie4py-ubuntu18 -c intel
   ```
   - Red Hat Enterprise Linux 8, 64-bit
   ```sh
   conda install openvino-ie4py-rhel8 -c intel
   ```
   - Windows 10 and macOS
   ```sh
   conda install openvino-ie4py -c intel
   ```
4. Verify the package is installed:
```sh
python -c "from openvino.runtime import Core"
```
If the installation was successful, you will not see any error messages (no console output).
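Because a successful import prints nothing, scripts typically rely on the interpreter's exit status instead. A hedged sketch of that pattern, using a standard-library module as a stand-in for `openvino.runtime` so it runs anywhere:

```sh
# Illustrative: branch on the exit status of a Python import check.
# `json` stands in for `openvino.runtime` so the sketch runs without OpenVINO;
# swap in the real module name to check your installation.
if python3 -c "import json" >/dev/null 2>&1; then
  echo "import succeeded"
else
  echo "import failed"
fi
```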
Now you can start developing your application.

## What's Next?
Now you may continue with the following tasks:

* To convert models for use with OpenVINO, see [Model Optimizer Developer Guide](../MO_DG/Deep_Learning_Model_Optimizer_DevGuide.md).
* See pre-trained deep learning models in our [Open Model Zoo](../model_zoo.md).
* Try out OpenVINO via [OpenVINO Notebooks](https://docs.openvino.ai/latest/notebooks/notebooks.html).
* To write your own OpenVINO™ applications, see [OpenVINO Runtime User Guide](../OV_Runtime_UG/openvino_intro.md).
* See sample applications in [OpenVINO™ Toolkit Samples Overview](../OV_Runtime_UG/Samples_Overview.md).

## Additional Resources

- Intel® Distribution of OpenVINO™ toolkit home page: <https://software.intel.com/en-us/openvino-toolkit>.
- For IoT Libraries & Code Samples, see the [Intel® IoT Developer Kit](https://github.com/intel-iot-devkit).
- Intel® Distribution of OpenVINO™ toolkit Anaconda home page: [https://anaconda.org/intel/openvino-ie4py](https://anaconda.org/intel/openvino-ie4py)
@@ -1,4 +1,4 @@

# Install Intel® Distribution of OpenVINO™ Toolkit on Linux {#openvino_docs_install_guides_installing_openvino_linux_header}

@sphinxdirective

@@ -15,7 +15,7 @@

@endsphinxdirective

If you want to install the Intel® Distribution of OpenVINO™ toolkit on your Linux machine, there are a few ways to accomplish this. We have prepared the following options for you:

* [Install OpenVINO Runtime Using the Installer](installing-openvino-linux.md)
* [Install OpenVINO from PyPI](installing-openvino-pip.md)

@@ -23,3 +23,4 @@ If you want to install OpenVINO™ Runtime on your Linux machine, there are a fe

* [Install OpenVINO Runtime from YUM](installing-openvino-yum.md)
* [Install OpenVINO Runtime from Anaconda Cloud](installing-openvino-conda.md)
* [Install OpenVINO with Docker](installing-openvino-docker-linux.md)
@@ -1,4 +1,4 @@

# Install Intel® Distribution of OpenVINO™ Toolkit on Linux Using the Installer {#openvino_docs_install_guides_installing_openvino_linux}

> **NOTE**: Since the OpenVINO™ 2022.1 release, the following development tools are not part of the installer: Model Optimizer, Post-Training Optimization Tool, Model Downloader and other Open Model Zoo tools, Accuracy Checker, and Annotation Converter. These tools are now only available on [pypi.org](https://pypi.org/project/openvino-dev/).
@@ -19,7 +19,7 @@ You can also check the [Release Notes](https://software.intel.com/en-us/articles

Optimized for these processors:

* 6th to 12th generation Intel® Core™ processors and Intel® Xeon® processors
* 3rd generation Intel® Xeon® Scalable processor (formerly code named Cooper Lake)
* Intel® Xeon® Scalable processor (formerly Skylake and Cascade Lake)
* Intel Atom® processor with support for Intel® Streaming SIMD Extensions 4.1 (Intel® SSE4.1)
@@ -30,9 +30,9 @@ You can also check the [Release Notes](https://software.intel.com/en-us/articles

.. tab:: Processor Notes

   Processor graphics are not included in all processors.
   See `Product Specifications`_ for information about your processor.

   .. _Product Specifications: https://ark.intel.com/

.. tab:: Software
@@ -82,11 +82,11 @@ This guide provides step-by-step instructions on how to install the Intel® Dist

<br>You should see the following dialog box open up:

@sphinxdirective

.. image:: _static/images/openvino-install.png
   :width: 400px
   :align: center

@endsphinxdirective
Otherwise, you can add the parameter `-a` for additional arguments and `--cli` to run the installation in command-line (CLI) mode:

@@ -94,7 +94,7 @@ This guide provides step-by-step instructions on how to install the Intel® Dist

```sh
./l_openvino_toolkit_p_<version>.sh -a --cli
```

> **NOTE**: To get additional information on all parameters that can be used, use the help option: `--help`. Among others, you can find there the `-s` option, which offers silent mode; together with `--eula approve` it allows you to run the whole installation with default values, without any user interaction.

6. Follow the instructions on your screen. During the installation you will be asked to accept the license agreement. Your acceptance is required to continue. Check the installation process in the image below:<br>

![](../img/openvino-install-linux-run-boostrapper-script.gif)
@@ -121,7 +121,7 @@ This script enables you to install Linux platform development tools and componen
|
||||
```sh
|
||||
sudo -E ./install_openvino_dependencies.sh
|
||||
```
|
||||
|
||||
|
||||
Once the dependencies are installed, continue to the next section to set your environment variables.
|
||||
|
||||
## <a name="set-the-environment-variables"></a>Step 3: Configure the Environment
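A minimal sketch of this step, assuming the default 2022-style installation layout (`<INSTALL_DIR>` is a placeholder for your actual installation directory):

```sh
# Load the OpenVINO environment variables into the current shell session,
# e.g. source /opt/intel/openvino_2022/setupvars.sh
source <INSTALL_DIR>/setupvars.sh
```

You can append this line to `~/.bashrc` so the variables are set in every new shell.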
@@ -170,7 +170,7 @@ The environment variables are set. Next, you can download some additional tools.

.. tab:: VPU

   To install and configure your Intel® Vision Accelerator Design with Intel® Movidius™ VPUs, see the :ref:`VPU Configuration Guide <vpu guide>`.
   After configuration is done, you are ready to run the verification scripts with the HDDL Plugin for your Intel® Vision Accelerator Design with Intel® Movidius™ VPUs.

   .. warning::
      While working with either HDDL or NCS, choose one of them as they cannot run simultaneously on the same machine.

@@ -178,7 +178,7 @@ The environment variables are set. Next, you can download some additional tools.

.. tab:: GNA

   To enable the toolkit components to use Intel® Gaussian & Neural Accelerator (GNA) on your system, follow the steps in :ref:`GNA Setup Guide <gna guide>`.

@endsphinxdirective

## <a name="get-started"></a>Step 6: What's Next?

@@ -202,13 +202,13 @@ To uninstall the toolkit, follow the steps on the [Uninstalling page](uninstalli

## Additional Resources

@sphinxdirective

* :ref:`Troubleshooting Guide for OpenVINO Installation & Configuration <troubleshooting guide for install>`
* Converting models for use with OpenVINO™: :ref:`Model Optimizer User Guide <deep learning model optimizer>`
* Writing your own OpenVINO™ applications: :ref:`OpenVINO™ Runtime User Guide <deep learning openvino runtime>`
* Sample applications: :ref:`OpenVINO™ Toolkit Samples Overview <code samples>`
* Pre-trained deep learning models: :ref:`Overview of OpenVINO™ Toolkit Pre-Trained Models <model zoo>`
* IoT libraries and code samples in the GitHub repository: `Intel® IoT Developer Kit`_

.. _Intel® IoT Developer Kit: https://github.com/intel-iot-devkit

@@ -1,4 +1,4 @@

# Install OpenVINO™ Runtime for macOS {#openvino_docs_install_guides_installing_openvino_macos_header}
# Install Intel® Distribution of OpenVINO™ Toolkit on macOS {#openvino_docs_install_guides_installing_openvino_macos_header}

@sphinxdirective

@@ -12,7 +12,7 @@

@endsphinxdirective

If you want to install OpenVINO™ Runtime on macOS, there are a few ways to accomplish this. We prepared following options for you:
If you want to install Intel® Distribution of OpenVINO™ toolkit on macOS, there are a few ways to accomplish this. We have prepared the following options for you:

* [Install OpenVINO Runtime Using the Installer](installing-openvino-macos.md)
* [Install OpenVINO from PyPI](installing-openvino-pip.md)

@@ -1,4 +1,4 @@

# Install OpenVINO™ Runtime for macOS from Installer {#openvino_docs_install_guides_installing_openvino_macos}
# Install Intel® Distribution of OpenVINO™ toolkit on macOS Using the Installer {#openvino_docs_install_guides_installing_openvino_macos}

> **NOTE**: Since the OpenVINO™ 2022.1 release, the following development tools: Model Optimizer, Post-Training Optimization Tool, Model Downloader and other Open Model Zoo tools, Accuracy Checker, and Annotation Converter are not part of the installer. These tools are now only available on [pypi.org](https://pypi.org/project/openvino-dev/).

@@ -17,17 +17,17 @@ You can also check the [Release Notes](https://software.intel.com/en-us/articles

   Optimized for these processors:

   * 6th to 12th generation Intel® Core™ processors and Intel® Xeon® processors
   * 3rd generation Intel® Xeon® Scalable processor (formerly code named Cooper Lake)
   * Intel® Xeon® Scalable processor (formerly Skylake and Cascade Lake)
   * Intel® Neural Compute Stick 2

   .. note::
      The current version of the Intel® Distribution of OpenVINO™ toolkit for macOS supports inference on Intel CPUs and Intel® Neural Compute Stick 2 devices only.

.. tab:: Software Requirements

   * `CMake 3.13 or higher <https://cmake.org/download/>`_ (choose "macOS 10.13 or later"). Add `/Applications/CMake.app/Contents/bin` to path (for default install).
   * `Python 3.6 - 3.9 <https://www.python.org/downloads/mac-osx/>`_ (choose 3.6 - 3.9). Install and add to path.
   * Apple Xcode Command Line Tools. In the terminal, run `xcode-select --install` from any directory
   * (Optional) Apple Xcode IDE (not required for OpenVINO™, but useful for development)

@@ -90,7 +90,7 @@ The environment variables are set. Continue to the next section if you want to d

## <a name="model-optimizer"></a>Step 3 (Optional): Download Additional Components

> **NOTE**: Since the OpenVINO™ 2022.1 release, the following development tools: Model Optimizer, Post-Training Optimization Tool, Model Downloader and other Open Model Zoo tools, Accuracy Checker, and Annotation Converter are not part of the installer. The OpenVINO™ Development Tools can only be installed via PyPI now. See [Install OpenVINO™ Development Tools](installing-model-dev-tools.md) for detailed steps.

@sphinxdirective

@@ -105,7 +105,7 @@ The environment variables are set. Continue to the next section if you want to d

@endsphinxdirective

## <a name="configure-ncs2"></a>Step 4 (Optional): Configure the Intel® Neural Compute Stick 2

@sphinxdirective

@@ -141,12 +141,12 @@ To uninstall the toolkit, follow the steps on the [Uninstalling page](uninstalli

@sphinxdirective

.. dropdown:: Additional Resources

   * Converting models for use with OpenVINO™: :ref:`Model Optimizer Developer Guide <deep learning model optimizer>`
   * Writing your own OpenVINO™ applications: :ref:`OpenVINO™ Runtime User Guide <deep learning openvino runtime>`
   * Sample applications: :ref:`OpenVINO™ Toolkit Samples Overview <code samples>`
   * Pre-trained deep learning models: :ref:`Overview of OpenVINO™ Toolkit Pre-Trained Models <model zoo>`
   * IoT libraries and code samples in the GitHub repository: `Intel® IoT Developer Kit`_

<!---
To learn more about converting models from specific frameworks, go to:

@@ -1,4 +1,4 @@

# Install OpenVINO™ Runtime on Windows {#openvino_docs_install_guides_installing_openvino_windows_header}
# Install Intel® Distribution of OpenVINO™ Toolkit on Windows {#openvino_docs_install_guides_installing_openvino_windows_header}

@sphinxdirective

@@ -13,7 +13,7 @@

@endsphinxdirective

If you want to install OpenVINO™ Runtime on Windows, you have the following options:
If you want to install Intel® Distribution of OpenVINO™ toolkit on Windows, you have the following options:

* [Install OpenVINO Runtime Using the installer](installing-openvino-windows.md)
* [Install OpenVINO from PyPI](installing-openvino-pip.md)

@@ -1,4 +1,4 @@

# Install OpenVINO™ Runtime on Windows from Installer {#openvino_docs_install_guides_installing_openvino_windows}
# Install Intel® Distribution of OpenVINO™ Toolkit on Windows 10 Using the Installer {#openvino_docs_install_guides_installing_openvino_windows}

> **NOTE**: Since the OpenVINO™ 2022.1 release, the following development tools: Model Optimizer, Post-Training Optimization Tool, Model Downloader and other Open Model Zoo tools, Accuracy Checker, and Annotation Converter are not part of the installer. These tools are now only available on [pypi.org](https://pypi.org/project/openvino-dev/).

@@ -15,7 +15,7 @@ You can also check the [Release Notes](https://software.intel.com/en-us/articles

   Optimized for these processors:

   * 6th to 12th generation Intel® Core™ processors and Intel® Xeon® processors
   * 3rd generation Intel® Xeon® Scalable processor (formerly code named Cooper Lake)
   * Intel® Xeon® Scalable processor (formerly Skylake and Cascade Lake)
   * Intel Atom® processor with support for Intel® Streaming SIMD Extensions 4.1 (Intel® SSE4.1)

@@ -23,12 +23,12 @@ You can also check the [Release Notes](https://software.intel.com/en-us/articles

   * Intel® Iris® Xe MAX Graphics
   * Intel® Neural Compute Stick 2
   * Intel® Vision Accelerator Design with Intel® Movidius™ VPUs

.. tab:: Processor Notes

   Processor graphics are not included in all processors.
   See `Product Specifications`_ for information about your processor.

   .. _Product Specifications: https://ark.intel.com/

.. tab:: Software

@@ -38,13 +38,13 @@ You can also check the [Release Notes](https://software.intel.com/en-us/articles

   * `CMake 3.14 or higher, 64-bit <https://cmake.org/download/>`_ (optional, only required for building sample applications)
   * For Python developers: `Python 3.6 - 3.9, 64-bit <https://www.python.org/downloads/windows/>`_
   * Note that OpenVINO is gradually dropping support for Python 3.6. Python 3.7 - 3.9 are recommended.

   .. note::
      You can choose to download the Community version. Use the `Microsoft Visual Studio installation guide <https://docs.microsoft.com/en-us/visualstudio/install/install-visual-studio?view=vs-2019>`_ to walk you through the installation. During installation in the **Workloads** tab, choose **Desktop development with C++**.

   .. note::
      You can either use `cmake<version>.msi`, which is the installation wizard, or `cmake<version>.zip`, where you have to go into the `bin` folder and then manually add the path to the environment variables.

   .. important::
      As part of this installation, make sure you click the option **Add Python 3.x to PATH** to `add Python <https://docs.python.org/3/using/windows.html#installation-steps>`_ to your `PATH` environment variable.

@@ -64,17 +64,17 @@ This guide provides step-by-step instructions on how to install the Intel® Dist

1. Download the Intel® Distribution of OpenVINO™ toolkit package file from [Intel® Distribution of OpenVINO™ toolkit for Windows](https://software.intel.com/en-us/openvino-toolkit/choose-download).
   Select the Intel® Distribution of OpenVINO™ toolkit for Windows package from the dropdown menu.

2. Go to the `Downloads` folder and double-click `w_openvino_toolkit_p_<version>.exe`. In the opened window, you can select the folder where the installer files will be placed. The directory will be referred to as `<INSTALL_DIR>` elsewhere in the documentation. Once the files are extracted, you should see the following dialog box open up:

@sphinxdirective

.. image:: _static/images/openvino-install.png
   :width: 400px
   :align: center

@endsphinxdirective

3. Follow the instructions on your screen. During the installation you will be asked to accept the license agreement. Your acceptance is required to continue. Check out the installation process in the image below:<br>

   

   Click on the image to see the details.

@@ -113,26 +113,26 @@ The environment variables are set. Next, you can download some additional tools.

.. note::
   No prerequisites are needed.

There are three ways to run the script:

* GUI: right-click the script and select ``Run with PowerShell``.

* Command prompt (CMD) console:

  .. code-block:: sh

     powershell <INSTALL_DIR>\extras\scripts\download_opencv.ps1

* PowerShell console:

  .. code-block:: sh

     .\<INSTALL_DIR>\scripts\download_opencv.ps1

If the Intel® Distribution of OpenVINO™ is installed to a system location (e.g. ``Program Files (x86)``), a privilege elevation dialog will be shown. To avoid this dialog in the case of a system-wide installation, run the script from a CMD/PowerShell Administrator console.
The script is interactive by default, so during the execution it will wait for the user to press ``Enter``. If you want to avoid this, use the ``-batch`` option, e.g. ``powershell <openvino>\extras\scripts\download_opencv.ps1 -batch``. After the execution of the script, you will find OpenCV extracted to ``<INSTALL_DIR>/extras/opencv``.

@endsphinxdirective

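Combining the two points above, a fully non-interactive run looks like this (a sketch; run it from an elevated console, with `<INSTALL_DIR>` as a placeholder for your installation directory):

```sh
# Run from an elevated CMD/PowerShell console to avoid the privilege
# elevation dialog; -batch skips the final "press Enter" prompt.
powershell <INSTALL_DIR>\extras\scripts\download_opencv.ps1 -batch
```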
@@ -147,15 +147,15 @@ The environment variables are set. Next, you can download some additional tools.

.. tab:: VPU

   To install and configure your Intel® Vision Accelerator Design with Intel® Movidius™ VPUs, see the :ref:`VPU Configuration Guide <vpu guide windows>`.

.. tab:: NCS 2

   No additional configurations are needed.

.. tab:: GNA

   To enable the toolkit components to use Intel® Gaussian & Neural Accelerator (GNA) on your system, follow the steps in :ref:`GNA Setup Guide <gna guide windows>`.

@endsphinxdirective

## <a name="get-started"></a>Step 5: What's next?

@@ -171,7 +171,7 @@ To start with C++ samples, see [Build Sample Applications on Windows](../OV_Runt

* [Hello Classification C++ Sample](@ref openvino_inference_engine_samples_hello_classification_README)
* [Hello Reshape SSD C++ Sample](@ref openvino_inference_engine_samples_hello_reshape_ssd_README)
* [Image Classification Async C++ Sample](@ref openvino_inference_engine_samples_classification_sample_async_README)

## <a name="uninstall"></a>Uninstalling the Intel® Distribution of OpenVINO™ Toolkit

To uninstall the toolkit, follow the steps on the [Uninstalling page](uninstalling-openvino.md).

@@ -179,15 +179,15 @@ To uninstall the toolkit, follow the steps on the [Uninstalling page](uninstalli

@sphinxdirective

.. dropdown:: Additional Resources

   * Converting models for use with OpenVINO™: :ref:`Model Optimizer Developer Guide <deep learning model optimizer>`
   * Writing your own OpenVINO™ applications: :ref:`OpenVINO™ Runtime User Guide <deep learning openvino runtime>`
   * Sample applications: :ref:`OpenVINO™ Toolkit Samples Overview <code samples>`
   * Pre-trained deep learning models: :ref:`Overview of OpenVINO™ Toolkit Pre-Trained Models <model zoo>`
   * IoT libraries and code samples in the GitHub repository: `Intel® IoT Developer Kit`_

<!---
To learn more about converting models from specific frameworks, go to:

* :ref:`Convert Your Caffe Model <convert model caffe>`
* :ref:`Convert Your TensorFlow Model <convert model tf>`
* :ref:`Convert Your MXNet Model <convert model mxnet>`

@@ -1,4 +1,4 @@

# Install OpenVINO™ Runtime on Linux Using YUM Repository {#openvino_docs_install_guides_installing_openvino_yum}
# Install Intel® Distribution of OpenVINO™ Toolkit for Linux Using YUM Repository {#openvino_docs_install_guides_installing_openvino_yum}

This guide provides installation steps for Intel® Distribution of OpenVINO™ toolkit for Linux distributed through the YUM repository.

@@ -130,7 +130,7 @@ sudo yum install openvino-opencv-<VERSION>.<UPDATE>.<PATCH>

### Step 4 (Optional): Install Software Dependencies

After you have installed OpenVINO Runtime, if you decide to [install OpenVINO Model Development Tools](installing-model-dev-tools.md), make sure that you install the external software dependencies first.

Refer to <a href="openvino_docs_install_guides_installing_openvino_linux.html#install-external-dependencies">Install External Software Dependencies</a> for detailed steps.
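A minimal sketch of this flow, using the `openvino-dev` PyPI package referenced earlier in this guide:

```sh
# 1. Install the external software dependencies first (see the linked steps).
# 2. Then install OpenVINO Development Tools (Model Optimizer, POT,
#    Open Model Zoo tools, etc.) from PyPI.
python3 -m pip install openvino-dev
```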
@@ -141,7 +141,7 @@ Refer to <a href="openvino_docs_install_guides_installing_openvino_linux.html#in

.. tab:: GNA

   To enable the toolkit components to use Intel® Gaussian & Neural Accelerator (GNA) on your system, follow the steps in :ref:`GNA Setup Guide <gna guide>`.

.. tab:: GPU

   To enable the toolkit components to use processor graphics (GPU) on your system, follow the steps in :ref:`GPU Setup Guide <gpu guide>`.

@@ -154,7 +154,7 @@ Refer to <a href="openvino_docs_install_guides_installing_openvino_linux.html#in

.. tab:: VPU

   To install and configure your Intel® Vision Accelerator Design with Intel® Movidius™ VPUs, see the :ref:`VPU Configuration Guide <vpu guide>`.
   After configuration is done, you are ready to run the verification scripts with the HDDL Plugin for your Intel® Vision Accelerator Design with Intel® Movidius™ VPUs.

   .. warning::
      While working with either HDDL or NCS, choose one of them as they cannot run simultaneously on the same machine.
