OpenVINO™ Toolkit - Deep Learning Deployment Toolkit repository
This toolkit allows developers to deploy pre-trained deep learning models through a high-level C++ Inference Engine API integrated with application logic.
This open source version includes two components, the Model Optimizer and the Inference Engine, as well as CPU, GPU, and heterogeneous plugins to accelerate deep learning inference on Intel® CPUs and Intel® Processor Graphics. It supports pre-trained models from the Open Model Zoo, along with more than 100 open source and public models in popular formats such as Caffe*, TensorFlow*, MXNet*, and ONNX*.
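The typical workflow with the high-level C++ Inference Engine API can be sketched as follows. This is a minimal illustration, assuming a model already converted to Intermediate Representation (`model.xml`/`model.bin`) by the Model Optimizer; the file name and device string are placeholders:

```cpp
#include <inference_engine.hpp>

int main() {
    // Core object manages devices and plugins (CPU, GPU, HETERO, ...)
    InferenceEngine::Core core;

    // Read an IR produced by the Model Optimizer (paths are placeholders)
    InferenceEngine::CNNNetwork network = core.ReadNetwork("model.xml");

    // Compile the network for a target device
    InferenceEngine::ExecutableNetwork executable =
        core.LoadNetwork(network, "CPU");

    // Create a request, fill input blobs, then run inference
    InferenceEngine::InferRequest request = executable.CreateInferRequest();
    request.Infer();

    return 0;
}
```

Input and output blobs are accessed through `request.GetBlob(name)` using the tensor names reported by the network; see the Inference Engine Developer Guide for the complete flow.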
Repository components:
- Inference Engine
- Model Optimizer
License
Deep Learning Deployment Toolkit is licensed under Apache License Version 2.0. By contributing to the project, you agree to the license and copyright terms therein and release your contribution under these terms.
Documentation
- OpenVINO™ Release Notes
- OpenVINO™ Inference Engine Build Instructions
- Get Started with Deep Learning Deployment Toolkit on Linux*
- Introduction to Deep Learning Deployment Toolkit
- Inference Engine Developer Guide
- Model Optimizer Developer Guide
How to Contribute
See CONTRIBUTING to contribute to the code, and CONTRIBUTING_DOCS to contribute to the documentation. Thank you!
Support
Please report questions, issues, and suggestions using:
- The `openvino` tag on StackOverflow*
- GitHub* Issues
- Forum
* Other names and brands may be claimed as the property of others.