* [IE]: Allows plugins to disable the Gather -> GatherIE conversion

  The Gather layer takes the axis as a 3rd input rather than an attribute, and may take the indices input as a 0D scalar.

  Signed-off-by: Gladilov, Gleb <gleb.gladilov@intel.com>

* [IE][VPU]: Disables the Gather -> GatherIE conversion

  The Gather -> GatherIE conversion may decompose a Gather operation into Unsqueeze + Gather + Squeeze when the indices input is a 0D scalar. For a dynamic Gather, such a decomposition breaks the dynamic path, so the Myriad plugin has to support the Gather operation natively, without the legacy conversion.

* [IE][VPU]: Enables native Gather support

  In contrast with GatherIE, the Gather layer takes the axis as a 3rd input rather than an attribute, and may take the indices input as a 0D scalar. The 0D -> 1D conversion happens automatically at the beginning of the frontend. The axis as a 3rd input is supported for a single-value integral scalar only.

* [IE][VPU][Tests]: Enables new-infrastructure single-layer Gather tests
  * Removes the corresponding tests from the old infrastructure
  * Enables test cases with a 0D indices input
  * Extracts a base test fixture from the shared tests fixture. Unfortunately, Google Test supports the Combine generator only for tuples of up to 10 elements, and the shared tests fixture already uses all 10 tuple elements for test parameters, while the Myriad plugin additionally needs to specify a configuration option. Since the configuration option cannot be a test parameter, a separate class is required; a base class is used to avoid code duplication.

* [IE][VPU]: Updates firmware

  Enables native Gather support on the device side
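The Unsqueeze + Gather + Squeeze decomposition mentioned above can be sketched in NumPy terms. This is only an illustration of the tensor semantics (the actual conversion operates on nGraph operations, not NumPy arrays; shapes and values here are arbitrary):

```python
import numpy as np

data = np.arange(12).reshape(3, 4)
axis = 0
scalar_index = np.array(2)  # 0D indices input

# A Gather with 0D indices drops the gathered axis from the output shape.
direct = np.take(data, scalar_index, axis=axis)   # shape (4,)

# Equivalent decomposition: Unsqueeze indices to 1D, Gather, then Squeeze.
unsqueezed = np.expand_dims(scalar_index, 0)      # shape (1,)
gathered = np.take(data, unsqueezed, axis=axis)   # shape (1, 4)
squeezed = np.squeeze(gathered, axis=axis)        # shape (4,)

assert np.array_equal(direct, squeezed)
```

The two paths produce identical results for static shapes; the problem described in the commit message is that for a *dynamic* Gather the extra Unsqueeze/Squeeze pair breaks the dynamic path, which is why the Myriad plugin supports Gather natively instead.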
# OpenVINO™ Toolkit - Deep Learning Deployment Toolkit repository
This toolkit allows developers to deploy pre-trained deep learning models through a high-level C++ Inference Engine API integrated with application logic.
This open source version includes several components, namely the Model Optimizer, nGraph, and the Inference Engine, as well as CPU, GPU, MYRIAD, multi-device, and heterogeneous plugins to accelerate deep learning inference on Intel® CPUs and Intel® Processor Graphics. It supports pre-trained models from the Open Model Zoo, along with 100+ open source and public models in popular formats such as Caffe*, TensorFlow*, MXNet*, and ONNX*.
Repository components:
## License

The Deep Learning Deployment Toolkit is licensed under the Apache License, Version 2.0. By contributing to the project, you agree to the license and copyright terms therein and release your contribution under these terms.
## Resources
- Docs: https://docs.openvinotoolkit.org/
- Wiki: https://github.com/openvinotoolkit/openvino/wiki
- Issue tracking: https://github.com/openvinotoolkit/openvino/issues
- Additional OpenVINO modules: https://github.com/openvinotoolkit/openvino_contrib
- HomePage
- OpenVINO™ Release Notes
## Support

Please report questions, issues, and suggestions using:

- The `openvino` tag on Stack Overflow*
- GitHub* Issues
- Forum
* Other names and brands may be claimed as the property of others.