OpenVINO™ is an open-source toolkit for optimizing and deploying AI inference

OpenVINO™ Toolkit - Deep Learning Deployment Toolkit repository


This toolkit allows developers to deploy pre-trained deep learning models through a high-level C++ Inference Engine API integrated with application logic.
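The deployment flow through the Inference Engine C++ API can be sketched as follows. This is a minimal sketch, not a complete application: the model file name `model.xml` is a placeholder, and real code would also fill input blobs and read output blobs between creating the request and consuming results.

```cpp
#include <inference_engine.hpp>

int main() {
    // Central entry point to the Inference Engine.
    InferenceEngine::Core core;

    // Read a model in Intermediate Representation (IR) format,
    // produced by the Model Optimizer ("model.xml" is a placeholder;
    // the matching "model.bin" weights file is found automatically).
    auto network = core.ReadNetwork("model.xml");

    // Compile the network for a target device: "CPU", "GPU",
    // or a heterogeneous configuration such as "HETERO:GPU,CPU".
    auto executable = core.LoadNetwork(network, "CPU");

    // Create an inference request; a real application would set
    // input data here, then read the output blobs after Infer().
    auto request = executable.CreateInferRequest();
    request.Infer();
    return 0;
}
```

The same flow is available through the Python API; the C++ API is shown here because it is the one integrated with application logic in production deployments.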

This open-source version includes two components, the Model Optimizer and the Inference Engine, as well as CPU, GPU, and heterogeneous plugins to accelerate deep learning inference on Intel® CPUs and Intel® Processor Graphics. It supports pre-trained models from the Open Model Zoo, along with 100+ open-source and public models in popular formats such as Caffe*, TensorFlow*, MXNet*, and ONNX*.
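Models in the formats above are first converted to the Inference Engine's Intermediate Representation (IR) with the Model Optimizer. A typical invocation looks like the following; the model file names and output directory are placeholders:

```shell
# Convert a TensorFlow frozen graph to IR (.xml topology + .bin weights):
python3 mo.py --input_model model.pb --output_dir ir/

# ONNX models are converted the same way:
python3 mo.py --input_model model.onnx --output_dir ir/
```

The resulting `.xml`/`.bin` pair is what the Inference Engine loads at runtime.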

Repository components:

* Inference Engine
* Model Optimizer
* nGraph

License

The Deep Learning Deployment Toolkit is licensed under the Apache License, Version 2.0. By contributing to the project, you agree to the license and copyright terms therein and release your contribution under these terms.

Documentation

How to Contribute

See CONTRIBUTING for details. Thank you!

Support

Please report questions, issues, and suggestions using:


* Other names and brands may be claimed as the property of others.