# Device Plugin Support
@sphinxdirective
.. toctree::
   :maxdepth: 1
   :hidden:

   openvino_docs_IE_DG_InferenceEngine_QueryAPI
   openvino_docs_IE_DG_supported_plugins_CPU
   openvino_docs_IE_DG_supported_plugins_GPU
   openvino_docs_IE_DG_supported_plugins_VPU
   openvino_docs_IE_DG_supported_plugins_GNA
@endsphinxdirective
The Inference Engine uses a plugin architecture. An Inference Engine plugin is a software component that contains a complete implementation for inference on a particular Intel® hardware device: CPU, GPU, VPU, GNA, and so on. Each plugin implements the unified API and, in addition, provides hardware-specific APIs.
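Because every plugin implements the same unified API, the calling code does not change when the target device does; only the device name passed at load time differs. A minimal sketch of this idea, assuming a core object with the `read_network`/`load_network` methods exposed by the Inference Engine Python `IECore` class (the `load_on_device` helper name is hypothetical, for illustration only):

```python
def load_on_device(core, model_xml, device_name):
    """Read a model and compile it for the given device via the unified API.

    `core` is assumed to expose `read_network` and `load_network`, as the
    Inference Engine Python IECore class does. Any plugin-backed device
    name ("CPU", "GPU", "MYRIAD", "GNA", ...) can be passed unchanged --
    the plugin architecture hides the hardware-specific details.
    """
    network = core.read_network(model=model_xml)
    return core.load_network(network=network, device_name=device_name)
```

Switching inference from CPU to GPU is then a one-argument change, e.g. `load_on_device(core, "model.xml", "GPU")` instead of `"CPU"`.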
The Inference Engine provides capabilities to infer deep learning models on the following device types with corresponding plugins:
| Plugin | Device types |
|---|---|
| GPU plugin | Intel® Processor Graphics, including Intel® HD Graphics and Intel® Iris® Graphics |
| CPU plugin | Intel® Xeon® with Intel® Advanced Vector Extensions 2 (Intel® AVX2), Intel® Advanced Vector Extensions 512 (Intel® AVX-512), and AVX512_BF16, Intel® Core™ Processors with Intel® AVX2, Intel® Atom® Processors with Intel® Streaming SIMD Extensions (Intel® SSE) |
| VPU plugins (available in the Intel® Distribution of OpenVINO™ toolkit) | Intel® Neural Compute Stick 2 powered by the Intel® Movidius™ Myriad™ X, Intel® Vision Accelerator Design with Intel® Movidius™ VPUs |
| GNA plugin (available in the Intel® Distribution of OpenVINO™ toolkit) | Intel® Speech Enabling Developer Kit, Amazon Alexa* Premium Far-Field Developer Kit, Intel® Pentium® Silver J5005 Processor, Intel® Pentium® Silver N5000 Processor, Intel® Celeron® J4005 Processor, Intel® Celeron® J4105 Processor, Intel® Celeron® Processor N4100, Intel® Celeron® Processor N4000, Intel® Core™ i3-8121U Processor, Intel® Core™ i7-1065G7 Processor, Intel® Core™ i7-1060G7 Processor, Intel® Core™ i5-1035G4 Processor, Intel® Core™ i5-1035G7 Processor, Intel® Core™ i5-1035G1 Processor, Intel® Core™ i5-1030G7 Processor, Intel® Core™ i5-1030G4 Processor, Intel® Core™ i3-1005G1 Processor, Intel® Core™ i3-1000G1 Processor, Intel® Core™ i3-1000G4 Processor |
| Multi-Device plugin | Enables simultaneous inference of the same network on several Intel® devices in parallel |
| Auto-Device plugin | Enables automatic selection of an Intel® device for inference |
| Heterogeneous plugin | Enables automatic splitting of inference between several Intel® devices (for example, if a device does not support certain layers) |
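The Multi-Device and Heterogeneous plugins are addressed through composite device names such as `"MULTI:GPU,CPU"` or `"HETERO:GPU,CPU"`, where the prefix selects the meta-plugin and the comma-separated list names the underlying devices (for HETERO, in priority order). A small sketch of how such a name can be composed (the `device_string` helper is hypothetical; only the resulting name format comes from the Inference Engine documentation):

```python
def device_string(mode, devices):
    """Compose an Inference Engine device name.

    mode    -- "" for a single plain device, or a meta-plugin prefix
               such as "MULTI" or "HETERO".
    devices -- list of underlying device names, e.g. ["GPU", "CPU"];
               for HETERO the order expresses fallback priority.
    """
    if not mode:
        if len(devices) != 1:
            raise ValueError("a plain device name addresses exactly one device")
        return devices[0]
    return mode + ":" + ",".join(devices)
```

For example, `device_string("HETERO", ["GPU", "CPU"])` yields `"HETERO:GPU,CPU"`, which runs layers on the GPU where supported and falls back to the CPU otherwise.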
Devices similar to the ones used for benchmarking can be accessed through Intel® DevCloud for the Edge, a remote development environment with access to Intel® hardware and the latest versions of the Intel® Distribution of OpenVINO™ toolkit.