* Model caching support - core part. Introduces model caching support:
  - Use `core.SetConfig({{CONFIG_KEY(CACHE_DIR), <dir>}});` to enable caching of models.
  - OpenVINO will try to create the caching folder if it doesn't exist, but it is recommended that the client create the caching folder with the necessary permissions before enabling the cache in the config.
  - For caching, plugins shall support import/export functionality.
  Plugin requirements:
  - Add METRIC_KEY(IMPORT_EXPORT_SUPPORT) to SUPPORTED_METRICS to support caching.
  - If a plugin has different device architectures with different caches (e.g., one cache for "GNA.0" and another for "GNA.10"), the plugin shall support the DEVICE_ARCHITECTURE metric and return different strings for different DEVICE_IDs.
  Added functional tests.
* Fix CentOS build issues
* A few updates according to code review
* Revert unnecessary changes to the Import/Export core implementation. These changes affect old behavior and may be undesired; for caching support there is no need to change anything in this area. If needed, removal of the 'Magic' usage can be done as a separate task in the future.
* More tests:
  1) Verify that data imported from a stream is the same as was exported.
  2) Verify that the cache is not loaded when the config in LoadNetwork is changed.
  3) Verify that if the CNN Network is changed between ReadNetwork and LoadNetwork, the cache is not loaded.
* Update of NetworkCompilationContext: put back the functionality of calculating the hash based on runtime information and weights; implemented OstreamHashWrapper to avoid serialization to a buffer.
* Correction of the CACHE_DIR key description
* Unit tests for compilation_context. Changes:
  1) Improved handling of OstreamHashAdapter.
  2) Improved runtime info serialization (not just PrimitivesPriority and affinity).
  3) Removed redundant weights hash calculation.
* Fix GCC 4.8 build issues
* Compilation context updates:
  1) Use the hash of the sum of serialized data to get the hash of a network; it is more efficient compared to the weights sum calculation.
  2) CalculateFileInfo: convert the path to an absolute one ("./test.blob" and "test.blob" shall give the same hash).
* Hash: added more rt_info attributes + tests (PrimitivesPriority, FusedNames, Dequantization)
* Moved the "get_absolute_path" macro to file_utils.h
* Make 'absoluteFilePath' a library API, not a macro
* One more unit test for fileName hashing
* Fix compilation error after merge with latest master
* Allow tests to be executed in parallel (stress mode)
* More minor updates for stress testing. Tests can now be executed with the '--repeat=100' option, where one test is executed in multiple processes simultaneously. Example: `./gtest-parallel <openvino_dir>/bin/intel64/Debug/ieFuncTests --gtest_filter=CachingTest* --repeat=10`
* Use the absolute model file path for calculating the blob name
* Added 'createDirectoryRecursive' API to plugin_api/file_utils
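Since the client is advised to create the caching folder with the necessary permissions before enabling the cache, a minimal C++17 sketch of that preparation step may help. `std::filesystem::create_directories` is used here as an illustrative stand-in for the `createDirectoryRecursive` helper mentioned above; the function name and the commented-out Inference Engine call are assumptions, not part of any OpenVINO API.

```cpp
#include <filesystem>
#include <system_error>

// Illustrative sketch (not OpenVINO API): create the cache folder up
// front, recursively, before enabling caching in the core config.
bool prepareCacheDir(const std::filesystem::path& dir) {
    std::error_code ec;
    // Recursive mkdir; reports failure via 'ec' instead of throwing.
    // Stand-in for the 'createDirectoryRecursive' plugin_api helper.
    std::filesystem::create_directories(dir, ec);
    return std::filesystem::is_directory(dir);
}

// With the folder in place, caching would then be enabled via the
// core config, e.g.:
//   InferenceEngine::Core core;
//   core.SetConfig({{CONFIG_KEY(CACHE_DIR), "model_cache"}});
```

Creating the folder explicitly lets the client control ownership and permissions, rather than relying on OpenVINO's best-effort folder creation.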
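The OstreamHashWrapper idea above (hash the serialized stream directly rather than materializing it in a buffer) can be sketched in a few lines of standard C++. This is a hypothetical stand-in, not OpenVINO's actual implementation: a `std::streambuf` that folds every written byte into a running FNV-1a hash, so "serializing" a network into it produces a hash with no intermediate buffer.

```cpp
#include <cstdint>
#include <ostream>
#include <streambuf>
#include <string>

// Hypothetical stand-in for OstreamHashWrapper: a streambuf that hashes
// bytes as they are written instead of storing them.
class HashStreambuf : public std::streambuf {
public:
    std::uint64_t hash() const { return hash_; }

protected:
    // With no put area installed, every character written to the stream
    // arrives here one byte at a time.
    int_type overflow(int_type ch) override {
        if (ch != traits_type::eof()) {
            hash_ ^= static_cast<unsigned char>(ch);  // FNV-1a: xor byte in
            hash_ *= 1099511628211ULL;                // then multiply by prime
        }
        return ch;
    }

private:
    std::uint64_t hash_ = 14695981039346656037ULL;  // FNV-1a offset basis
};

// Hash arbitrary serialized data by streaming it through the wrapper;
// in the real code the network serializer would write into 'os'.
std::uint64_t hashOf(const std::string& serialized) {
    HashStreambuf buf;
    std::ostream os(&buf);
    os << serialized;
    return buf.hash();
}
```

The design point is that the serializer sees an ordinary `std::ostream`, so the same serialization code can target a file, a memory buffer, or a hash without modification.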
OpenVINO™ Toolkit
This toolkit allows developers to deploy pre-trained deep learning models through a high-level C++ Inference Engine API integrated with application logic.
This open source version includes several components, namely Model Optimizer, nGraph and Inference Engine, as well as CPU, GPU, MYRIAD, multi-device and heterogeneous plugins to accelerate deep learning inferencing on Intel® CPUs and Intel® Processor Graphics. It supports pre-trained models from the Open Model Zoo, along with 100+ open source and public models in popular formats such as Caffe*, TensorFlow*, MXNet* and ONNX*.
Repository components:
License
Deep Learning Deployment Toolkit is licensed under the Apache License, Version 2.0. By contributing to the project, you agree to the license and copyright terms therein and release your contribution under these terms.
Resources:
- Docs: https://docs.openvinotoolkit.org/
- Wiki: https://github.com/openvinotoolkit/openvino/wiki
- Issue tracking: https://github.com/openvinotoolkit/openvino/issues
- Storage: https://storage.openvinotoolkit.org/
- Additional OpenVINO™ modules: https://github.com/openvinotoolkit/openvino_contrib
- Intel® Distribution of OpenVINO™ toolkit Product Page
- Intel® Distribution of OpenVINO™ toolkit Release Notes
Support
Please report questions, issues and suggestions using:
- The openvino tag on StackOverflow*
- GitHub* Issues
- Forum
* Other names and brands may be claimed as the property of others.