OpenVINO™ is an open-source toolkit for optimizing and deploying AI inference
Gladilov, Gleb 8213505e24
[IE][VPU]: Enables native gather support (#3502)
* [IE]: Allows plugins to disable Gather -> GatherIE conversion

The Gather layer takes axis as a 3rd input, not an attribute, and may
take indices as a 0-D scalar input.
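The rank difference matters: with 1-D indices the gathered dimension survives, while 0-D scalar indices remove it. A minimal sketch of that behavior, using NumPy's `np.take` as a stand-in for the Gather semantics (an illustration, not OpenVINO API):

```python
import numpy as np

data = np.arange(12).reshape(3, 4)  # shape (3, 4)
axis = 1                            # axis arrives as an input, not an attribute

# 1-D indices keep the gathered dimension:
print(np.take(data, np.array([2]), axis=axis).shape)  # (3, 1)

# 0-D (scalar) indices remove it entirely:
print(np.take(data, np.array(2), axis=axis).shape)    # (3,)
```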

Signed-off-by: Gladilov, Gleb <gleb.gladilov@intel.com>

* [IE][VPU]: Disables Gather -> GatherIE conversion

The Gather -> GatherIE conversion may decompose the Gather
operation into Unsqueeze + Gather + Squeeze when the
indices input is a 0-D scalar.

For dynamic Gather, such a decomposition breaks the
dynamic path, so the Myriad plugin has to support the
Gather operation natively, without the legacy conversion.
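The legacy decomposition can be sketched with NumPy, where `np.take`, `np.expand_dims`, and `np.squeeze` stand in for Gather, Unsqueeze, and Squeeze (an illustration of the equivalence, not plugin code):

```python
import numpy as np

data = np.arange(12).reshape(3, 4)
scalar_idx = np.array(2)  # 0-D indices input

# Native Gather with 0-D indices:
native = np.take(data, scalar_idx, axis=1)

# Legacy decomposition: Unsqueeze indices to 1-D, Gather, then Squeeze:
unsqueezed = np.expand_dims(scalar_idx, 0)                   # shape (1,)
decomposed = np.squeeze(np.take(data, unsqueezed, axis=1), axis=1)

print(np.array_equal(native, decomposed))  # True
```

Both paths produce the same result on static shapes; the point of the change is that the extra Unsqueeze/Squeeze pair is what breaks the dynamic path.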

Signed-off-by: Gladilov, Gleb <gleb.gladilov@intel.com>

* [IE][VPU]: Enables native Gather support

The Gather layer, in contrast with GatherIE, takes
axis as a 3rd input, not an attribute, and may take
the indices input as a 0-D scalar.

The 0-D -> 1-D conversion happens automatically at
the beginning of the frontend.

Axis as a 3rd input is supported only for a
single-value integral scalar.
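What "axis as a 3rd input, restricted to a single-value integral scalar" means can be sketched in NumPy (the `axis_input` name is hypothetical; this is an illustration, not plugin code):

```python
import numpy as np

data = np.arange(24).reshape(2, 3, 4)
indices = np.array([0, 2])
axis_input = np.array(1)  # axis delivered as a 3rd input tensor

# Only an axis input that folds to one integer value is supported,
# so it can be read out at compile time:
axis = int(axis_input)
print(np.take(data, indices, axis=axis).shape)  # (2, 2, 4)
```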

Signed-off-by: Gladilov, Gleb <gleb.gladilov@intel.com>

* [IE][VPU][Tests]: Enable new infra single layer Gather tests

* Removes corresponding tests from old infrastructure
* Enables test cases with 0D indices input
* Extracts a base test fixture from the shared tests fixture.
  Unfortunately, Google Test's Combine generator supports
  tuples of up to 10 elements only, and the shared tests
  fixture already uses all 10 tuple elements for test
  parameters. At the same time, the Myriad plugin needs to
  specify a configuration option. Since a configuration
  option cannot be a test parameter, a separate class is
  required; a base class is used to avoid code duplication.

Signed-off-by: Gladilov, Gleb <gleb.gladilov@intel.com>

* [IE][VPU]: Updates firmware

Enables native Gather support on device side
2020-12-10 13:23:36 +03:00
| Path | Last commit message | Date |
| --- | --- | --- |
| .ci | Template device testing (#3521) | 2020-12-09 17:13:32 +03:00 |
| .github | Remove Java bindings (#3216) | 2020-11-19 13:59:20 +03:00 |
| cmake | Template device testing (#3521) | 2020-12-09 17:13:32 +03:00 |
| docs | cmake build all docs (#3539) | 2020-12-10 12:11:30 +03:00 |
| inference-engine | [IE][VPU]: Enables native gather support (#3502) | 2020-12-10 13:23:36 +03:00 |
| licensing | added third party programs files (#2751) | 2020-10-23 18:03:01 +03:00 |
| model-optimizer | Re-implement onnx old-style extractors with extractor extensions (#3459) | 2020-12-10 09:24:24 +03:00 |
| ngraph | Removed reference implementations from tests (#3541) | 2020-12-10 12:00:48 +03:00 |
| openvino | CPU plugin selective build (#3360) | 2020-12-07 17:49:08 +03:00 |
| scripts | setupvars should export TBB_DIR (44241) (#3434) | 2020-12-02 14:18:59 +03:00 |
| tests | Build time_tests with OpenVINO install (#3484) | 2020-12-07 12:56:37 +03:00 |
| tools | benchmark_tool: replace logger.warn with logger.warning (#3291) | 2020-11-24 06:19:29 +03:00 |
| .gitattributes | Doc Migration (master) (#1377) | 2020-07-20 17:36:08 +03:00 |
| .gitignore | publish master branch snapshot, revision 8d31237e2c3f673cbb0f0ba110fc10f5cce1d2bb | 2020-05-22 02:23:12 +03:00 |
| .gitmodules | add submodules for mkl-dnn, gflags and gtest | 2020-05-21 23:00:55 +03:00 |
| CMakeLists.txt | Template device testing (#3521) | 2020-12-09 17:13:32 +03:00 |
| CODEOWNERS | Added code owners for scripts folder (#2130) | 2020-09-08 17:23:27 +03:00 |
| install_build_dependencies.sh | [install_dependencies.sh] install latest cmake if current version is lower 3.13 (#2695) | 2020-10-16 21:03:46 +03:00 |
| Jenkinsfile | [Jenkinsfile] Disable failFast & enable propagateStatus (#3503) | 2020-12-10 12:05:03 +03:00 |
| LICENSE | Publishing R3 | 2018-10-16 13:45:03 +03:00 |
| README.md | Removed documents which are ported to OpenVINO WiKi (#3106) | 2020-11-17 11:46:05 +03:00 |
| SECURITY.md | Added SECURITY.md back (#3177) | 2020-11-17 16:44:44 +03:00 |

OpenVINO™ Toolkit - Deep Learning Deployment Toolkit repository


This toolkit allows developers to deploy pre-trained deep learning models through a high-level C++ Inference Engine API integrated with application logic.

This open source version includes several components: Model Optimizer, nGraph, and Inference Engine, as well as CPU, GPU, MYRIAD, multi-device, and heterogeneous plugins to accelerate deep learning inference on Intel® CPUs and Intel® Processor Graphics. It supports pre-trained models from the Open Model Zoo, along with 100+ open source and public models in popular formats such as Caffe*, TensorFlow*, MXNet*, and ONNX*.

Repository components:

License

Deep Learning Deployment Toolkit is licensed under Apache License Version 2.0. By contributing to the project, you agree to the license and copyright terms therein and release your contribution under these terms.

Resources:

Support

Please report questions, issues, and suggestions using:


* Other names and brands may be claimed as the property of others.