OpenVINO™ Toolkit - Deep Learning Deployment Toolkit repository

This toolkit allows developers to deploy pre-trained deep learning models through a high-level C++ Inference Engine API integrated with application logic.
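
Below is a minimal sketch of that flow using the Inference Engine C++ API. The model path and output blob name are placeholders, input filling and error handling are omitted, and linking against the inference_engine library from an installed OpenVINO package is assumed.

```cpp
#include <inference_engine.hpp>

int main() {
    InferenceEngine::Core core;

    // Read a network in OpenVINO IR format (an .xml file with a paired .bin
    // produced by the Model Optimizer); "model.xml" is a placeholder path.
    InferenceEngine::CNNNetwork network = core.ReadNetwork("model.xml");

    // Compile the network for a target device; "CPU" is used here.
    InferenceEngine::ExecutableNetwork executable = core.LoadNetwork(network, "CPU");

    // Create an inference request and run synchronous inference.
    InferenceEngine::InferRequest request = executable.CreateInferRequest();
    request.Infer();

    // Retrieve an output blob by name; "output" is a placeholder tensor name.
    InferenceEngine::Blob::Ptr result = request.GetBlob("output");
    (void)result;
    return 0;
}
```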

This open-source version includes two components, the Model Optimizer and the Inference Engine, as well as CPU, GPU, and heterogeneous plugins to accelerate deep learning inference on Intel® CPUs and Intel® Processor Graphics. It supports pre-trained models from the Open Model Zoo, along with more than 100 open-source and public models in popular formats such as Caffe*, TensorFlow*, MXNet*, and ONNX*.
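
As a hedged illustration of the plugin model described above, the same IR network can be compiled for different devices by changing the device string passed to LoadNetwork. The helper function and model path below are hypothetical; CPU, GPU, and HETERO:GPU,CPU are the standard plugin names.

```cpp
#include <string>
#include <inference_engine.hpp>

// Hypothetical helper: read a model already converted to IR and compile it
// for the plugin named by "device". The model path is a placeholder.
static InferenceEngine::ExecutableNetwork compileFor(InferenceEngine::Core& core,
                                                     const std::string& modelXml,
                                                     const std::string& device) {
    InferenceEngine::CNNNetwork network = core.ReadNetwork(modelXml);
    return core.LoadNetwork(network, device);
}

int main() {
    InferenceEngine::Core core;

    // CPU plugin (MKL-DNN based).
    auto onCpu = compileFor(core, "model.xml", "CPU");

    // GPU plugin (clDNN, Intel® Processor Graphics).
    auto onGpu = compileFor(core, "model.xml", "GPU");

    // Heterogeneous plugin: prefer GPU, fall back to CPU for layers
    // the GPU plugin does not support.
    auto onHetero = compileFor(core, "model.xml", "HETERO:GPU,CPU");
    return 0;
}
```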

Repository components:

* Inference Engine
* nGraph
* Model Optimizer

License

The Deep Learning Deployment Toolkit is licensed under the Apache License, Version 2.0. By contributing to the project, you agree to the license and copyright terms therein and release your contribution under these terms.

Documentation

The docs folder in this repository holds the documentation sources; see build-instruction.md for build instructions and get-started-linux.md to get started on Linux.

How to Contribute

See CONTRIBUTING.md to contribute to the code and CONTRIBUTING_DOCS.md to contribute to the documentation. Thank you!

Support

Please report questions, issues, and suggestions using:

* GitHub Issues
* The openvino tag on StackOverflow*

* Other names and brands may be claimed as the property of others.