# Documentation

@sphinxdirective

.. toctree::
   :maxdepth: 1
   :caption: Converting and Preparing Models
   :hidden:

   openvino_docs_MO_DG_Deep_Learning_Model_Optimizer_DevGuide
   omz_tools_downloader

.. toctree::
   :maxdepth: 1
   :caption: Deploying Inference
   :hidden:

   openvino_docs_OV_Runtime_User_Guide
   openvino_2_0_transition_guide
   openvino_deployment_guide
   openvino_inference_engine_tools_compile_tool_README

.. toctree::
   :maxdepth: 1
   :caption: Tuning for Performance
   :hidden:

   openvino_docs_optimization_guide_dldt_optimization_guide
   openvino_docs_MO_DG_Getting_Performance_Numbers
   openvino_docs_model_optimization_guide
   openvino_docs_deployment_optimization_guide_dldt_optimization_guide
   openvino_docs_tuning_utilities
   openvino_docs_performance_benchmarks

.. toctree::
   :maxdepth: 1
   :caption: Graphical Web Interface for OpenVINO™ toolkit
   :hidden:

   workbench_docs_Workbench_DG_Introduction
   workbench_docs_Workbench_DG_Install
   workbench_docs_Workbench_DG_Work_with_Models_and_Sample_Datasets
   Tutorials <workbench_docs_Workbench_DG_Tutorials>
   User Guide <workbench_docs_Workbench_DG_User_Guide>
   workbench_docs_Workbench_DG_Troubleshooting

.. toctree::
   :maxdepth: 1
   :hidden:
   :caption: Media Processing and Computer Vision Libraries

   Intel® Deep Learning Streamer <openvino_docs_dlstreamer>
   openvino_docs_gapi_gapi_intro
   OpenCV* Developer Guide <https://docs.opencv.org/master/>
   OpenCL™ Developer Guide <https://software.intel.com/en-us/openclsdk-devguide>

.. toctree::
   :maxdepth: 1
   :caption: Add-Ons
   :hidden:

   ovms_what_is_openvino_model_server
   ote_documentation
   ovsa_get_started

.. toctree::
   :maxdepth: 1
   :caption: OpenVINO Extensibility
   :hidden:

   openvino_docs_Extensibility_UG_Intro
   openvino_docs_transformations
   OpenVINO Plugin Developer Guide <openvino_docs_ie_plugin_dg_overview>

.. toctree::
   :maxdepth: 1
   :hidden:
   :caption: Use OpenVINO™ Toolkit Securely

   openvino_docs_security_guide_introduction
   openvino_docs_security_guide_workbench
   openvino_docs_IE_DG_protecting_model_guide
   ovsa_get_started

@endsphinxdirective

This section provides reference documents that guide you through developing your own deep learning applications with the OpenVINO™ toolkit. These documents will be most helpful if you have first gone through the Get Started guide.

## Converting and Preparing Models

With the [Model Downloader](@ref omz_tools_downloader) and Model Optimizer guides, you will learn to download pre-trained models and convert them for use with the OpenVINO™ toolkit. You can provide your own model or choose a public or Intel model from a broad selection provided in the Open Model Zoo.
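For orientation, here is a minimal sketch of that flow, assuming the openvino-dev package is installed (it provides the `omz_downloader`, `omz_converter`, and `mo` command-line tools); the model name and file paths below are only illustrative:

```python
import subprocess

# Download a public model from the Open Model Zoo (model name is illustrative).
subprocess.run(["omz_downloader", "--name", "resnet-18-pytorch"], check=True)

# Convert the downloaded Open Model Zoo model to OpenVINO IR.
subprocess.run(["omz_converter", "--name", "resnet-18-pytorch"], check=True)

# For a model of your own, run Model Optimizer directly on the source file.
subprocess.run(["mo", "--input_model", "my_model.onnx"], check=True)
```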

## Deploying Inference

The OpenVINO™ Runtime User Guide explains the process of creating your own application that runs inference with the OpenVINO™ toolkit. The API Reference defines the OpenVINO Runtime API for Python, C++, and C. The OpenVINO Runtime API is what you use to create an OpenVINO™ inference application, work with enhanced operation sets, and access other features. After writing your application, follow the Deployment with OpenVINO guide to deploy it to target devices.
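As a quick illustration, below is a minimal sketch of that flow using the Python API; the model path, device name, and input shape are placeholders, and the details are covered in the OpenVINO™ Runtime User Guide:

```python
import numpy as np
from openvino.runtime import Core

core = Core()

# Read a model that was converted to OpenVINO IR (path is a placeholder).
model = core.read_model("model.xml")

# Compile the model for a target device, e.g. "CPU", "GPU", or "AUTO".
compiled_model = core.compile_model(model, "CPU")

# Placeholder input: replace the shape with your model's actual input shape.
input_data = np.zeros((1, 3, 224, 224), dtype=np.float32)

# Run synchronous inference and read the first output.
results = compiled_model([input_data])
output = results[compiled_model.output(0)]
print(output.shape)
```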

## Tuning for Performance

The toolkit provides a Performance Optimization Guide and utilities for squeezing the best performance out of your application, including [Accuracy Checker](@ref omz_tools_accuracy_checker), [Post-Training Optimization Tool](@ref pot_README), and other tools for measuring accuracy, benchmarking performance, and tuning your application.
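For example, a first benchmarking run might look like the following sketch, assuming the openvino-dev package provides the `benchmark_app` tool and that `model.xml` stands in for your converted model; the linked guides describe the full set of options:

```python
import subprocess

# Benchmark a converted model on CPU; the device and path are placeholders.
subprocess.run(["benchmark_app", "-m", "model.xml", "-d", "CPU"], check=True)
```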

## Graphical Web Interface for OpenVINO™ Toolkit

You can choose to use the [OpenVINO™ Deep Learning Workbench](@ref workbench_docs_Workbench_DG_Introduction), a web-based tool that guides you through the process of converting, measuring, optimizing, and deploying models. This tool also serves as a low-effort introduction to the toolkit and provides a variety of useful interactive charts for understanding performance.

## Media Processing and Computer Vision Libraries

The OpenVINO™ toolkit also works with the following media processing frameworks and libraries:

* [Intel® Deep Learning Streamer (Intel® DL Streamer)](@ref openvino_docs_dlstreamer) — A streaming media analytics framework based on GStreamer, for creating complex media analytics pipelines optimized for Intel hardware platforms. Go to the Intel® DL Streamer documentation website to learn more.
* Intel® oneAPI Video Processing Library (oneVPL) — A programming interface for video decoding, encoding, and processing to build portable media pipelines on CPUs, GPUs, and other accelerators.

You can also add computer vision capabilities to your application using optimized versions of OpenCV.