[DOCS] including NPU documents (#19340)

This commit is contained in:
Karol Blaszczak
2023-08-22 17:17:37 +02:00
committed by GitHub
parent 9a76daf94b
commit 6eee51a6ef
5 changed files with 124 additions and 9 deletions


@@ -2,6 +2,12 @@
@sphinxdirective
.. meta::
:description: The list of types of devices and corresponding plugins which
are compatible with OpenVINO Runtime and support inference
of deep learning models.
.. toctree::
:maxdepth: 1
:hidden:
@@ -9,13 +15,9 @@
openvino_docs_OV_UG_query_api
openvino_docs_OV_UG_supported_plugins_CPU
openvino_docs_OV_UG_supported_plugins_GPU
openvino_docs_OV_UG_supported_plugins_NPU
openvino_docs_OV_UG_supported_plugins_GNA
OpenVINO™ Runtime can infer deep learning models using the following device types:


@@ -0,0 +1,28 @@
# NPU Device {#openvino_docs_OV_UG_supported_plugins_NPU}
@sphinxdirective
.. meta::
:description: The NPU plugin in the Intel® Distribution of OpenVINO™ toolkit
aims at high performance inference of neural
networks on the low-power NPU processing device.
NPU is a new generation of low-power processing unit dedicated to running neural network workloads.
The NPU plugin is a core part of the OpenVINO™ toolkit. For its in-depth description, see:
..
- `NPU plugin developer documentation < cmake_options_for_custom_compilation.md ??? >`__.
- `NPU plugin source files < ??? >`__.
@endsphinxdirective
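A minimal usage sketch for the plugin, assuming the `openvino` Python package is installed; the model path is a placeholder and `choose_npu_or_cpu` is a hypothetical helper, not part of the API:

```python
# Sketch only: compile a model for the NPU plugin, falling back to the
# always-available CPU plugin when no NPU is detected.
# "model.xml" is a placeholder path, not a file shipped with OpenVINO.

def choose_npu_or_cpu(available_devices):
    # Hypothetical helper: prefer NPU when the runtime reports it.
    return "NPU" if "NPU" in available_devices else "CPU"

def compile_for_npu(model_path="model.xml"):
    from openvino.runtime import Core  # requires the openvino package
    core = Core()
    device = choose_npu_or_cpu(core.available_devices)
    return core.compile_model(core.read_model(model_path), device)
```

On a machine with the NPU driver installed, `Core().available_devices` should include `NPU` and the sketch would target it; everywhere else it degrades to CPU.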


@@ -0,0 +1,71 @@
# Configurations for Intel® NPU with OpenVINO™ {#openvino_docs_install_guides_configurations_for_intel_npu}
@sphinxdirective
.. meta::
:description: Learn how to provide additional configuration for Intel®
NPU to work with the OpenVINO™ toolkit on your system.
Drivers and Dependencies
########################
The Intel® NPU device requires a proper driver to be installed on the system.
Linux
####################
Prerequisites
++++++++++++++++++++
Ensure that make, gcc, and Linux kernel headers are installed. Use the following command to install the required software:
.. code-block:: sh
sudo apt-get install gcc make linux-headers-generic
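Once the driver is in place, one hedged way to sanity-check that the kernel exposes an accelerator node is to look under ``/dev/accel`` (an assumed location based on the Linux accel subsystem; consult the driver documentation for your kernel):

```python
# Hedged sketch: list accelerator device nodes, if any. /dev/accel is an
# assumed location (Linux "accel" subsystem); an empty result may simply
# mean the driver exposes a different interface on your system.
from pathlib import Path

def accel_nodes(dev_root="/dev/accel"):
    root = Path(dev_root)
    if not root.is_dir():
        return []
    return sorted(p.name for p in root.glob("accel*"))
```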
Configuration steps
++++++++++++++++++++
Windows
####################
Intel® NPU driver for Windows is available through Windows Update.
What's Next?
####################
Now you are ready to try out OpenVINO™. You can use the following tutorials to write your applications using Python and C++.
* Developing in Python:
* `Start with TensorFlow models with OpenVINO™ <notebooks/101-tensorflow-to-openvino-with-output.html>`__
* `Start with ONNX and PyTorch models with OpenVINO™ <notebooks/102-pytorch-onnx-to-openvino-with-output.html>`__
* `Start with PaddlePaddle models with OpenVINO™ <notebooks/103-paddle-to-openvino-classification-with-output.html>`__
* Developing in C++:
* :doc:`Image Classification Async C++ Sample <openvino_inference_engine_samples_classification_sample_async_README>`
* :doc:`Hello Classification C++ Sample <openvino_inference_engine_samples_hello_classification_README>`
* :doc:`Hello Reshape SSD C++ Sample <openvino_inference_engine_samples_hello_reshape_ssd_README>`
@endsphinxdirective
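The Python notebooks and C++ samples above all follow the same read–compile–infer flow. A minimal sketch of that flow, assuming the `openvino` and `numpy` packages are installed; the model path is a placeholder and `top1` is a hypothetical helper, not part of the API:

```python
# Sketch of the flow the samples implement: read a model, compile it for
# a device, run one inference, report the top class index.
# "model.xml" is a placeholder; top1 is a hypothetical helper.

def top1(scores):
    # Index of the highest score in a flat list of class scores.
    return max(range(len(scores)), key=lambda i: scores[i])

def hello_classification(model_path="model.xml", device="CPU"):
    import numpy as np                 # requires numpy
    from openvino.runtime import Core  # requires openvino
    core = Core()
    compiled = core.compile_model(core.read_model(model_path), device)
    data = np.random.rand(*compiled.input(0).shape).astype(np.float32)
    scores = compiled([data])[compiled.output(0)].ravel()
    return top1(scores.tolist())
```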


@@ -13,6 +13,7 @@
:hidden:
For GPU <openvino_docs_install_guides_configurations_for_intel_gpu>
For NPU <openvino_docs_install_guides_configurations_for_intel_npu>
For GNA <openvino_docs_install_guides_configurations_for_intel_gna>


@@ -9,12 +9,17 @@
The OpenVINO runtime can infer various models of different input and output formats. Here, you can find configurations
supported by OpenVINO devices, which are CPU, GPU, NPU, and GNA (Gaussian Neural Accelerator coprocessor).
Currently, processors of the 11th generation and later (up to the 13th generation at the moment) provide a further performance boost, especially with INT8 models.
.. note::
With OpenVINO™ 2023.0 release, support has been cancelled for:
- Intel® Neural Compute Stick 2 powered by the Intel® Movidius™ Myriad™ X
- Intel® Vision Accelerator Design with Intel® Movidius™
To keep using the MYRIAD and HDDL plugins with your hardware, revert to the OpenVINO 2022.3 LTS release.
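Which of these device plugins a given installation actually exposes can be queried from the runtime. A minimal sketch, assuming the ``openvino`` Python package is installed; ``pick_device`` is a hypothetical helper, not part of the API:

```python
# Hypothetical helper, not part of the OpenVINO API: choose the first
# preferred device among those the runtime reports. Names may carry an
# index suffix, e.g. "GPU.1" for a second GPU.
def pick_device(available, preferred=("NPU", "GPU", "CPU")):
    for want in preferred:
        if any(d == want or d.startswith(want + ".") for d in available):
            return want
    return "CPU"  # the CPU plugin is always available

def report_devices():
    from openvino.runtime import Core  # requires the openvino package
    core = Core()
    return core.available_devices, pick_device(core.available_devices)
```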
+---------------------------------------------------------------------+------------------------------------------------------------------------------------------------------+
@@ -31,7 +36,7 @@ Currently, processors of the 11th generation and later (up to the 13th generatio
|| :doc:`GPU <openvino_docs_OV_UG_supported_plugins_GPU>` | Intel® Processor Graphics including Intel® HD Graphics and Intel® Iris® Graphics, |
|| | Intel® Arc™ A-Series Graphics, Intel® Data Center GPU Flex Series, Intel® Data Center GPU Max Series |
+---------------------------------------------------------------------+------------------------------------------------------------------------------------------------------+
|| :doc:`GNA <openvino_docs_OV_UG_supported_plugins_GNA>` | Intel® Speech Enabling Developer Kit, Amazon Alexa* Premium Far-Field Developer Kit, Intel® |
|| (available in the Intel® Distribution of OpenVINO™ toolkit) | Pentium® Silver J5005 Processor, Intel® Pentium® Silver N5000 Processor, Intel® |
|| | Celeron® J4005 Processor, Intel® Celeron® J4105 Processor, Intel® Celeron® |
|| | Processor N4100, Intel® Celeron® Processor N4000, Intel® Core™ i3-8121U Processor, |
@@ -41,7 +46,15 @@ Currently, processors of the 11th generation and later (up to the 13th generatio
|| | Intel® Core™ i3-1005G1 Processor, Intel® Core™ i3-1000G1 Processor, |
|| | Intel® Core™ i3-1000G4 Processor |
+---------------------------------------------------------------------+------------------------------------------------------------------------------------------------------+
|| :doc:`NPU <openvino_docs_OV_UG_supported_plugins_NPU>` | |
+---------------------------------------------------------------------+------------------------------------------------------------------------------------------------------+
Besides inference using a specific device, OpenVINO offers three inference modes for automated inference management. These are: