OpenVINO™ is an open-source toolkit for optimizing and deploying AI inference

OpenVINO™ Toolkit - Deep Learning Deployment Toolkit repository


This toolkit allows developers to deploy pre-trained deep learning models through a high-level C++ Inference Engine API integrated with application logic.
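As a rough illustration, the basic workflow with the Inference Engine C++ API (as of the 2020 releases) looks like the sketch below; the model file names, the input/output handling, and the choice of the CPU device are placeholder assumptions, not part of this repository's documentation.

```cpp
#include <inference_engine.hpp>

int main() {
    // The Core object discovers and manages the installed device plugins.
    InferenceEngine::Core core;

    // Read a model in Inference Engine IR format (file names are placeholders).
    InferenceEngine::CNNNetwork network = core.ReadNetwork("model.xml", "model.bin");

    // Compile the network for a target device, here the CPU plugin.
    InferenceEngine::ExecutableNetwork executable = core.LoadNetwork(network, "CPU");

    // Create an inference request, fill the inputs, and run synchronously.
    InferenceEngine::InferRequest request = executable.CreateInferRequest();
    // ... fill input blobs via request.GetBlob(<input name>) ...
    request.Infer();
    // ... read results via request.GetBlob(<output name>) ...

    return 0;
}
```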

This open source version includes several components: the Model Optimizer, nGraph, and the Inference Engine, as well as CPU, GPU, MYRIAD, multi-device, and heterogeneous plugins to accelerate deep learning inference on Intel® CPUs and Intel® Processor Graphics. It supports pre-trained models from the Open Model Zoo, along with 100+ open source and public models in popular formats such as Caffe*, TensorFlow*, MXNet*, and ONNX*.
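The target hardware is selected by the device name passed to LoadNetwork. The snippet below is a sketch of how the plugins listed above are typically addressed, reusing `core` and `network` from the previous example; actual device availability depends on the installed plugins and hardware.

```cpp
// Single-device plugins.
auto on_cpu    = core.LoadNetwork(network, "CPU");
auto on_gpu    = core.LoadNetwork(network, "GPU");     // Intel® Processor Graphics
auto on_vpu    = core.LoadNetwork(network, "MYRIAD");  // Intel® VPU devices

// Multi-device plugin: schedule inference requests across several devices.
auto on_multi  = core.LoadNetwork(network, "MULTI:CPU,GPU");

// Heterogeneous plugin: run unsupported layers on a fallback device.
auto on_hetero = core.LoadNetwork(network, "HETERO:GPU,CPU");
```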

Repository components:

* Inference Engine
* nGraph
* Model Optimizer

License

Deep Learning Deployment Toolkit is licensed under Apache License Version 2.0. By contributing to the project, you agree to the license and copyright terms therein and release your contribution under these terms.

Resources:

Support

Please report questions, issues and suggestions using:

* GitHub* Issues


* Other names and brands may be claimed as the property of others.