[DOCS] Updating links for master

Maciej Smyk 2023-11-13 12:23:23 +01:00 committed by GitHub
parent e79e5122f8
commit 5a04359200
No known key found for this signature in database
GPG Key ID: 4AEE18F83AFDEB23
46 changed files with 131 additions and 131 deletions


@@ -67,24 +67,24 @@ The OpenVINO™ Runtime can infer models on different hardware devices. This sec
<tbody>
<tr>
<td rowspan=2>CPU</td>
<td> <a href="https://docs.openvino.ai/2023.1/openvino_docs_OV_UG_supported_plugins_CPU.html#doxid-openvino-docs-o-v-u-g-supported-plugins-c-p-u">Intel CPU</a></td>
<td> <a href="https://docs.openvino.ai/2023.2/openvino_docs_OV_UG_supported_plugins_CPU.html#doxid-openvino-docs-o-v-u-g-supported-plugins-c-p-u">Intel CPU</a></td>
<td><b><i><a href="./src/plugins/intel_cpu">openvino_intel_cpu_plugin</a></i></b></td>
<td>Intel Xeon with Intel® Advanced Vector Extensions 2 (Intel® AVX2), Intel® Advanced Vector Extensions 512 (Intel® AVX-512), and AVX512_BF16, Intel Core Processors with Intel AVX2, Intel Atom Processors with Intel® Streaming SIMD Extensions (Intel® SSE), Intel® Advanced Matrix Extensions (Intel® AMX)</td>
</tr>
<tr>
<td> <a href="https://docs.openvino.ai/2023.1/openvino_docs_OV_UG_supported_plugins_CPU.html#doxid-openvino-docs-o-v-u-g-supported-plugins-c-p-u">ARM CPU</a></td>
<td> <a href="https://docs.openvino.ai/2023.2/openvino_docs_OV_UG_supported_plugins_CPU.html#doxid-openvino-docs-o-v-u-g-supported-plugins-c-p-u">ARM CPU</a></td>
<td><b><i><a href="./src/plugins/intel_cpu">openvino_arm_cpu_plugin</a></i></b></td>
<td>Raspberry Pi™ 4 Model B, Apple® Mac mini with Apple silicon</td>
</tr>
<tr>
<td>GPU</td>
<td><a href="https://docs.openvino.ai/2023.1/openvino_docs_OV_UG_supported_plugins_GPU.html#doxid-openvino-docs-o-v-u-g-supported-plugins-g-p-u">Intel GPU</a></td>
<td><a href="https://docs.openvino.ai/2023.2/openvino_docs_OV_UG_supported_plugins_GPU.html#doxid-openvino-docs-o-v-u-g-supported-plugins-g-p-u">Intel GPU</a></td>
<td><b><i><a href="./src/plugins/intel_gpu">openvino_intel_gpu_plugin</a></i></b></td>
<td>Intel Processor Graphics, including Intel HD Graphics and Intel Iris Graphics</td>
</tr>
<tr>
<td>GNA</td>
<td><a href="https://docs.openvino.ai/2023.1/openvino_docs_OV_UG_supported_plugins_GNA.html#doxid-openvino-docs-o-v-u-g-supported-plugins-g-n-a">Intel GNA</a></td>
<td><a href="https://docs.openvino.ai/2023.2/openvino_docs_OV_UG_supported_plugins_GNA.html#doxid-openvino-docs-o-v-u-g-supported-plugins-g-n-a">Intel GNA</a></td>
<td><b><i><a href="./src/plugins/intel_gna">openvino_intel_gna_plugin</a></i></b></td>
<td>Intel Speech Enabling Developer Kit, Amazon Alexa* Premium Far-Field Developer Kit, Intel Pentium Silver J5005 Processor, Intel Pentium Silver N5000 Processor, Intel Celeron J4005 Processor, Intel Celeron J4105 Processor, Intel Celeron Processor N4100, Intel Celeron Processor N4000, Intel Core i3-8121U Processor, Intel Core i7-1065G7 Processor, Intel Core i7-1060G7 Processor, Intel Core i5-1035G4 Processor, Intel Core i5-1035G7 Processor, Intel Core i5-1035G1 Processor, Intel Core i5-1030G7 Processor, Intel Core i5-1030G4 Processor, Intel Core i3-1005G1 Processor, Intel Core i3-1000G1 Processor, Intel Core i3-1000G4 Processor</td>
</tr>
@@ -102,22 +102,22 @@ OpenVINO™ Toolkit also contains several plugins which simplify loading models
</thead>
<tbody>
<tr>
<td><a href="https://docs.openvino.ai/2023.1/openvino_docs_OV_UG_supported_plugins_AUTO.html">Auto</a></td>
<td><a href="https://docs.openvino.ai/2023.2/openvino_docs_OV_UG_supported_plugins_AUTO.html">Auto</a></td>
<td><b><i><a href="./src/plugins/auto">openvino_auto_plugin</a></i></b></td>
<td>Auto plugin enables selecting Intel device for inference automatically</td>
</tr>
<tr>
<td><a href="https://docs.openvino.ai/2023.1/openvino_docs_OV_UG_Automatic_Batching.html">Auto Batch</a></td>
<td><a href="https://docs.openvino.ai/2023.2/openvino_docs_OV_UG_Automatic_Batching.html">Auto Batch</a></td>
<td><b><i><a href="./src/plugins/auto_batch">openvino_auto_batch_plugin</a></i></b></td>
<td>Auto batch plugin performs on-the-fly automatic batching (i.e. grouping inference requests together) to improve device utilization, with no programming effort from the user</td>
</tr>
<tr>
<td><a href="https://docs.openvino.ai/2023.1/openvino_docs_OV_UG_Hetero_execution.html#doxid-openvino-docs-o-v-u-g-hetero-execution">Hetero</a></td>
<td><a href="https://docs.openvino.ai/2023.2/openvino_docs_OV_UG_Hetero_execution.html#doxid-openvino-docs-o-v-u-g-hetero-execution">Hetero</a></td>
<td><b><i><a href="./src/plugins/hetero">openvino_hetero_plugin</a></i></b></td>
<td>Heterogeneous execution enables automatic inference splitting between several devices</td>
</tr>
<tr>
<td><a href="https://docs.openvino.ai/2023.1/openvino_docs_OV_UG_Running_on_multiple_devices.html#doxid-openvino-docs-o-v-u-g-running-on-multiple-devices">Multi</a></td>
<td><a href="https://docs.openvino.ai/2023.2/openvino_docs_OV_UG_Running_on_multiple_devices.html#doxid-openvino-docs-o-v-u-g-running-on-multiple-devices">Multi</a></td>
<td><b><i><a href="./src/plugins/auto">openvino_auto_plugin</a></i></b></td>
<td>Multi plugin enables simultaneous inference of the same model on several devices in parallel</td>
</tr>
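The plugins in this table are selected through device strings such as `AUTO`, `MULTI:CPU,GPU`, or `HETERO:GPU,CPU`, where the part after the colon is an ordered device list. As a minimal illustrative sketch (this helper is not part of OpenVINO; it only mimics how such a string decomposes):

```python
def parse_device_string(device: str):
    """Split an OpenVINO-style device string such as 'HETERO:GPU,CPU'
    into the plugin name and its ordered device-priority list."""
    plugin, sep, rest = device.partition(":")
    priorities = [d.strip() for d in rest.split(",") if d.strip()] if sep else []
    return plugin, priorities

print(parse_device_string("HETERO:GPU,CPU"))  # -> ('HETERO', ['GPU', 'CPU'])
print(parse_device_string("CPU"))             # -> ('CPU', [])
```

In real code the full string is passed unchanged as the `device_name` argument of `Core.compile_model`; the plugin itself performs this decomposition.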
@@ -164,9 +164,9 @@ The list of OpenVINO tutorials:
## System requirements
The system requirements vary depending on platform and are available on dedicated pages:
- [Linux](https://docs.openvino.ai/2023.1/openvino_docs_install_guides_installing_openvino_linux_header.html)
- [Windows](https://docs.openvino.ai/2023.1/openvino_docs_install_guides_installing_openvino_windows_header.html)
- [macOS](https://docs.openvino.ai/2023.1/openvino_docs_install_guides_installing_openvino_macos_header.html)
- [Linux](https://docs.openvino.ai/2023.2/openvino_docs_install_guides_installing_openvino_linux_header.html)
- [Windows](https://docs.openvino.ai/2023.2/openvino_docs_install_guides_installing_openvino_windows_header.html)
- [macOS](https://docs.openvino.ai/2023.2/openvino_docs_install_guides_installing_openvino_macos_header.html)
## How to build
@@ -206,6 +206,6 @@ Report questions, issues and suggestions, using:
\* Other names and brands may be claimed as the property of others.
[Open Model Zoo]:https://github.com/openvinotoolkit/open_model_zoo
[OpenVINO™ Runtime]:https://docs.openvino.ai/2023.1/openvino_docs_OV_UG_OV_Runtime_User_Guide.html
[OpenVINO Model Converter (OVC)]:https://docs.openvino.ai/2023.1/openvino_docs_model_processing_introduction.html#convert-a-model-in-cli-ovc
[OpenVINO™ Runtime]:https://docs.openvino.ai/2023.2/openvino_docs_OV_UG_OV_Runtime_User_Guide.html
[OpenVINO Model Converter (OVC)]:https://docs.openvino.ai/2023.2/openvino_docs_model_processing_introduction.html#convert-a-model-in-cli-ovc
[Samples]:https://github.com/openvinotoolkit/openvino/tree/master/samples


@@ -13,7 +13,7 @@
openvino_docs_performance_benchmarks_faq
OpenVINO Accuracy <openvino_docs_performance_int8_vs_fp32>
Performance Data Spreadsheet (download xlsx) <https://docs.openvino.ai/2023.1/_static/benchmarks_files/OV-2023.0-Performance-Data.xlsx>
Performance Data Spreadsheet (download xlsx) <https://docs.openvino.ai/2023.2/_static/benchmarks_files/OV-2023.0-Performance-Data.xlsx>
openvino_docs_MO_DG_Getting_Performance_Numbers


@@ -148,7 +148,7 @@ Please file a github Issue on these with the label “pre-release” so we can g
* PyTorch FE:
* Added support for 6 new operations. To learn how to convert PyTorch models, follow
this `Link <https://docs.openvino.ai/2023.1/openvino_docs_MO_DG_prepare_model_convert_model_Convert_Model_From_PyTorch.html#experimental-converting-a-pytorch-model-with-pytorch-frontend>`__
this `Link <https://docs.openvino.ai/2023.2/openvino_docs_MO_DG_prepare_model_convert_model_Convert_Model_From_PyTorch.html#experimental-converting-a-pytorch-model-with-pytorch-frontend>`__
* aten::concat
* aten::masked_scatter


@@ -94,7 +94,7 @@ Detailed Guides
API References
##############
* `OpenVINO Plugin API <https://docs.openvino.ai/2023.1/groupov_dev_api.html>`__
* `OpenVINO Transformation API <https://docs.openvino.ai/2023.1/groupie_transformation_api.html>`__
* `OpenVINO Plugin API <https://docs.openvino.ai/2023.2/groupov_dev_api.html>`__
* `OpenVINO Transformation API <https://docs.openvino.ai/2023.2/groupie_transformation_api.html>`__
@endsphinxdirective


@@ -15,7 +15,7 @@
The guides below provide extra API references needed for OpenVINO plugin development:
* `OpenVINO Plugin API <https://docs.openvino.ai/2023.1/groupov_dev_api.html>`__
* `OpenVINO Transformation API <https://docs.openvino.ai/2023.1/groupie_transformation_api.html>`__
* `OpenVINO Plugin API <https://docs.openvino.ai/2023.2/groupov_dev_api.html>`__
* `OpenVINO Transformation API <https://docs.openvino.ai/2023.2/groupie_transformation_api.html>`__
@endsphinxdirective


@@ -10,7 +10,7 @@
This tutorial explains how to convert a RetinaNet model to the Intermediate Representation (IR).
`Public RetinaNet model <https://github.com/fizyr/keras-retinanet>`__ does not contain pretrained TensorFlow weights.
To convert this model to the TensorFlow format, follow the `Reproduce Keras to TensorFlow Conversion tutorial <https://docs.openvino.ai/2023.1/omz_models_model_retinanet_tf.html>`__.
To convert this model to the TensorFlow format, follow the `Reproduce Keras to TensorFlow Conversion tutorial <https://docs.openvino.ai/2023.2/omz_models_model_retinanet_tf.html>`__.
After converting the model to TensorFlow format, run the following command:


@@ -134,16 +134,16 @@ Now that you've installed OpenVINO Runtime, you're ready to run your own machine
.. image:: https://user-images.githubusercontent.com/15709723/127752390-f6aa371f-31b5-4846-84b9-18dd4f662406.gif
:width: 400
Try the `Python Quick Start Example <https://docs.openvino.ai/2023.1/notebooks/201-vision-monodepth-with-output.html>`__ to estimate depth in a scene using an OpenVINO monodepth model in a Jupyter Notebook inside your web browser.
Try the `Python Quick Start Example <https://docs.openvino.ai/2023.2/notebooks/201-vision-monodepth-with-output.html>`__ to estimate depth in a scene using an OpenVINO monodepth model in a Jupyter Notebook inside your web browser.
Get started with Python
+++++++++++++++++++++++
Visit the :doc:`Tutorials <tutorials>` page for more Jupyter Notebooks to get you started with OpenVINO, such as:
* `OpenVINO Python API Tutorial <https://docs.openvino.ai/2023.1/notebooks/002-openvino-api-with-output.html>`__
* `Basic image classification program with Hello Image Classification <https://docs.openvino.ai/2023.1/notebooks/001-hello-world-with-output.html>`__
* `Convert a PyTorch model and use it for image background removal <https://docs.openvino.ai/2023.1/notebooks/205-vision-background-removal-with-output.html>`__
* `OpenVINO Python API Tutorial <https://docs.openvino.ai/2023.2/notebooks/002-openvino-api-with-output.html>`__
* `Basic image classification program with Hello Image Classification <https://docs.openvino.ai/2023.2/notebooks/001-hello-world-with-output.html>`__
* `Convert a PyTorch model and use it for image background removal <https://docs.openvino.ai/2023.2/notebooks/205-vision-background-removal-with-output.html>`__


@@ -140,7 +140,7 @@ See Also
- :doc:`Using OpenVINO™ Samples <openvino_docs_OV_UG_Samples_Overview>`
- :doc:`Model Downloader <omz_tools_downloader>`
- :doc:`Convert a Model <openvino_docs_MO_DG_Deep_Learning_Model_Optimizer_DevGuide>`
- `C API Reference <https://docs.openvino.ai/2023.1/api/api_reference.html>`__
- `C API Reference <https://docs.openvino.ai/2023.2/api/api_reference.html>`__
@endsphinxdirective


@@ -45,17 +45,17 @@ The sample works with Kaldi ARK or Numpy* uncompressed NPZ files, so it does not
+-------------------------------------------------------------------+------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+-----------------------------------------------------------------------+
| Feature | API | Description |
+===================================================================+================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================+=======================================================================+
| Import/Export Model | `openvino.runtime.Core.import_model <https://docs.openvino.ai/2023.1/api/ie_python_api/_autosummary/openvino.runtime.Core.html#openvino.runtime.Core.import_model>`__ , `openvino.runtime.CompiledModel.export_model <https://docs.openvino.ai/2023.1/api/ie_python_api/_autosummary/openvino.runtime.CompiledModel.html#openvino.runtime.CompiledModel.export_model>`__ | The GNA plugin supports loading and saving of the GNA-optimized model |
| Import/Export Model | `openvino.runtime.Core.import_model <https://docs.openvino.ai/2023.2/api/ie_python_api/_autosummary/openvino.runtime.Core.html#openvino.runtime.Core.import_model>`__ , `openvino.runtime.CompiledModel.export_model <https://docs.openvino.ai/2023.2/api/ie_python_api/_autosummary/openvino.runtime.CompiledModel.html#openvino.runtime.CompiledModel.export_model>`__ | The GNA plugin supports loading and saving of the GNA-optimized model |
+-------------------------------------------------------------------+------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+-----------------------------------------------------------------------+
| Model Operations | `openvino.runtime.Model.add_outputs <https://docs.openvino.ai/2023.1/api/ie_python_api/_autosummary/openvino.runtime.Model.html#openvino.runtime.Model.add_outputs>`__ , `openvino.runtime.set_batch <https://docs.openvino.ai/2023.1/api/ie_python_api/_autosummary/openvino.runtime.html#openvino.runtime.set_batch>`__ , `openvino.runtime.CompiledModel.inputs <https://docs.openvino.ai/2023.1/api/ie_python_api/_autosummary/openvino.runtime.CompiledModel.html#openvino.runtime.CompiledModel.inputs>`__ , `openvino.runtime.CompiledModel.outputs <https://docs.openvino.ai/2023.1/api/ie_python_api/_autosummary/openvino.runtime.CompiledModel.html#openvino.runtime.CompiledModel.outputs>`__ , `openvino.runtime.ConstOutput.any_name <https://docs.openvino.ai/2023.1/api/ie_python_api/_autosummary/openvino.runtime.ConstOutput.html#openvino.runtime.ConstOutput.any_name>`__ | Managing of model: configure batch_size, input and output tensors |
| Model Operations | `openvino.runtime.Model.add_outputs <https://docs.openvino.ai/2023.2/api/ie_python_api/_autosummary/openvino.runtime.Model.html#openvino.runtime.Model.add_outputs>`__ , `openvino.runtime.set_batch <https://docs.openvino.ai/2023.2/api/ie_python_api/_autosummary/openvino.runtime.html#openvino.runtime.set_batch>`__ , `openvino.runtime.CompiledModel.inputs <https://docs.openvino.ai/2023.2/api/ie_python_api/_autosummary/openvino.runtime.CompiledModel.html#openvino.runtime.CompiledModel.inputs>`__ , `openvino.runtime.CompiledModel.outputs <https://docs.openvino.ai/2023.2/api/ie_python_api/_autosummary/openvino.runtime.CompiledModel.html#openvino.runtime.CompiledModel.outputs>`__ , `openvino.runtime.ConstOutput.any_name <https://docs.openvino.ai/2023.2/api/ie_python_api/_autosummary/openvino.runtime.ConstOutput.html#openvino.runtime.ConstOutput.any_name>`__ | Managing of model: configure batch_size, input and output tensors |
+-------------------------------------------------------------------+------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+-----------------------------------------------------------------------+
| Synchronous Infer | `openvino.runtime.CompiledModel.create_infer_request <https://docs.openvino.ai/2023.1/api/ie_python_api/_autosummary/openvino.runtime.CompiledModel.html#openvino.runtime.CompiledModel.create_infer_request>`__ , `openvino.runtime.InferRequest.infer <https://docs.openvino.ai/2023.1/api/ie_python_api/_autosummary/openvino.runtime.InferRequest.html#openvino.runtime.InferRequest.infer>`__ | Do synchronous inference |
| Synchronous Infer | `openvino.runtime.CompiledModel.create_infer_request <https://docs.openvino.ai/2023.2/api/ie_python_api/_autosummary/openvino.runtime.CompiledModel.html#openvino.runtime.CompiledModel.create_infer_request>`__ , `openvino.runtime.InferRequest.infer <https://docs.openvino.ai/2023.2/api/ie_python_api/_autosummary/openvino.runtime.InferRequest.html#openvino.runtime.InferRequest.infer>`__ | Do synchronous inference |
+-------------------------------------------------------------------+------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+-----------------------------------------------------------------------+
| InferRequest Operations | `openvino.runtime.InferRequest.get_input_tensor <https://docs.openvino.ai/2023.1/api/ie_python_api/_autosummary/openvino.runtime.InferRequest.html#openvino.runtime.InferRequest.get_input_tensor>`__ , `openvino.runtime.InferRequest.model_outputs <https://docs.openvino.ai/2023.1/api/ie_python_api/_autosummary/openvino.runtime.InferRequest.html#openvino.runtime.InferRequest.model_outputs>`__ , `openvino.runtime.InferRequest.model_inputs <https://docs.openvino.ai/2023.1/api/ie_python_api/_autosummary/openvino.runtime.InferRequest.html#openvino.runtime.InferRequest.model_inputs>`__ , | Get info about model using infer request API |
| InferRequest Operations | `openvino.runtime.InferRequest.get_input_tensor <https://docs.openvino.ai/2023.2/api/ie_python_api/_autosummary/openvino.runtime.InferRequest.html#openvino.runtime.InferRequest.get_input_tensor>`__ , `openvino.runtime.InferRequest.model_outputs <https://docs.openvino.ai/2023.2/api/ie_python_api/_autosummary/openvino.runtime.InferRequest.html#openvino.runtime.InferRequest.model_outputs>`__ , `openvino.runtime.InferRequest.model_inputs <https://docs.openvino.ai/2023.2/api/ie_python_api/_autosummary/openvino.runtime.InferRequest.html#openvino.runtime.InferRequest.model_inputs>`__ , | Get info about model using infer request API |
+-------------------------------------------------------------------+------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+-----------------------------------------------------------------------+
| InferRequest Operations | `openvino.runtime.InferRequest.query_state <https://docs.openvino.ai/2023.1/api/ie_python_api/_autosummary/openvino.runtime.InferRequest.html#openvino.runtime.InferRequest.query_state>`__ , `openvino.runtime.VariableState.reset <https://docs.openvino.ai/2023.1/api/ie_python_api/_autosummary/openvino.inference_engine.VariableState.html#openvino.inference_engine.VariableState.reset>`__ | Gets and resets CompiledModel state control |
| InferRequest Operations | `openvino.runtime.InferRequest.query_state <https://docs.openvino.ai/2023.2/api/ie_python_api/_autosummary/openvino.runtime.InferRequest.html#openvino.runtime.InferRequest.query_state>`__ , `openvino.runtime.VariableState.reset <https://docs.openvino.ai/2023.2/api/ie_python_api/_autosummary/openvino.inference_engine.VariableState.html#openvino.inference_engine.VariableState.reset>`__ | Gets and resets CompiledModel state control |
+-------------------------------------------------------------------+------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+-----------------------------------------------------------------------+
| Profiling | `openvino.runtime.InferRequest.profiling_info <https://docs.openvino.ai/2023.1/api/ie_python_api/_autosummary/openvino.runtime.InferRequest.html#openvino.runtime.InferRequest.profiling_info>`__ , `openvino.runtime.ProfilingInfo.real_time <https://docs.openvino.ai/2023.1/api/ie_python_api/_autosummary/openvino.runtime.ProfilingInfo.html#openvino.runtime.ProfilingInfo.real_time>`__ | Get infer request profiling info |
| Profiling | `openvino.runtime.InferRequest.profiling_info <https://docs.openvino.ai/2023.2/api/ie_python_api/_autosummary/openvino.runtime.InferRequest.html#openvino.runtime.InferRequest.profiling_info>`__ , `openvino.runtime.ProfilingInfo.real_time <https://docs.openvino.ai/2023.2/api/ie_python_api/_autosummary/openvino.runtime.ProfilingInfo.html#openvino.runtime.ProfilingInfo.real_time>`__ | Get infer request profiling info |
+-------------------------------------------------------------------+------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+-----------------------------------------------------------------------+
Basic OpenVINO™ Runtime API is covered by :doc:`Hello Classification Python* Sample <openvino_inference_engine_ie_bridges_python_sample_hello_classification_README>`.
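The table above implies a fixed call order: compile the model, create an infer request, then run synchronous inference. The stub classes below only mimic that `openvino.runtime` call order for illustration; they are not OpenVINO code, and the method names are copied from the table as assumptions about the real API surface:

```python
# Illustrative stubs mirroring the call sequence in the table:
# Core.compile_model -> CompiledModel.create_infer_request -> InferRequest.infer

class FakeInferRequest:
    def infer(self, inputs):
        # A real request would run the model; here we just report input sizes.
        return {name: len(data) for name, data in inputs.items()}

class FakeCompiledModel:
    def create_infer_request(self):
        return FakeInferRequest()

class FakeCore:
    def compile_model(self, model, device_name):
        # A real Core would load the model onto the named device (e.g. "GNA").
        return FakeCompiledModel()

core = FakeCore()
compiled = core.compile_model("model.xml", "GNA")
request = compiled.create_infer_request()
result = request.infer({"input": [0.0, 1.0, 2.0]})
print(result)  # -> {'input': 3}
```

With the real API, the same sequence applies, with `import_model`/`export_model` optionally substituting for `compile_model` when reusing a previously GNA-optimized model.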


@@ -34,23 +34,23 @@ Models with only 1 input and output are supported.
+-----------------------------+-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| Feature | API | Description |
+=============================+===========================================================================================================================================================================================================================================+============================================================================================================================================================================================+
| Basic Infer Flow | `openvino.runtime.Core <https://docs.openvino.ai/2023.1/api/ie_python_api/_autosummary/openvino.runtime.Core.html>`__ , | |
| | `openvino.runtime.Core.read_model <https://docs.openvino.ai/2023.1/api/ie_python_api/_autosummary/openvino.runtime.Core.html#openvino.runtime.Core.read_model>`__ , | |
| | `openvino.runtime.Core.compile_model <https://docs.openvino.ai/2023.1/api/ie_python_api/_autosummary/openvino.runtime.Core.html#openvino.runtime.Core.compile_model>`__ | Common API to do inference |
| Basic Infer Flow | `openvino.runtime.Core <https://docs.openvino.ai/2023.2/api/ie_python_api/_autosummary/openvino.runtime.Core.html>`__ , | |
| | `openvino.runtime.Core.read_model <https://docs.openvino.ai/2023.2/api/ie_python_api/_autosummary/openvino.runtime.Core.html#openvino.runtime.Core.read_model>`__ , | |
| | `openvino.runtime.Core.compile_model <https://docs.openvino.ai/2023.2/api/ie_python_api/_autosummary/openvino.runtime.Core.html#openvino.runtime.Core.compile_model>`__ | Common API to do inference |
+-----------------------------+-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| Synchronous Infer | `openvino.runtime.CompiledModel.infer_new_request <https://docs.openvino.ai/2023.1/api/ie_python_api/_autosummary/openvino.runtime.CompiledModel.html#openvino.runtime.CompiledModel.infer_new_request>`__ | Do synchronous inference |
| Synchronous Infer | `openvino.runtime.CompiledModel.infer_new_request <https://docs.openvino.ai/2023.2/api/ie_python_api/_autosummary/openvino.runtime.CompiledModel.html#openvino.runtime.CompiledModel.infer_new_request>`__ | Do synchronous inference |
+-----------------------------+-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| Model Operations | `openvino.runtime.Model.inputs <https://docs.openvino.ai/2023.1/api/ie_python_api/_autosummary/openvino.runtime.Model.html#openvino.runtime.Model.inputs>`__ , | Managing of model |
| | `openvino.runtime.Model.outputs <https://docs.openvino.ai/2023.1/api/ie_python_api/_autosummary/openvino.runtime.Model.html#openvino.runtime.Model.outputs>`__ | |
| Model Operations | `openvino.runtime.Model.inputs <https://docs.openvino.ai/2023.2/api/ie_python_api/_autosummary/openvino.runtime.Model.html#openvino.runtime.Model.inputs>`__ , | Managing of model |
| | `openvino.runtime.Model.outputs <https://docs.openvino.ai/2023.2/api/ie_python_api/_autosummary/openvino.runtime.Model.html#openvino.runtime.Model.outputs>`__ | |
+-----------------------------+-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| Preprocessing | `openvino.preprocess.PrePostProcessor <https://docs.openvino.ai/2023.2/api/ie_python_api/_autosummary/openvino.preprocess.PrePostProcessor.html>`__ , | Set image of the original size as input for a model with other input size. Resize and layout conversions will be performed automatically by the corresponding plugin just before inference |
| | `openvino.preprocess.InputTensorInfo.set_element_type <https://docs.openvino.ai/2023.2/api/ie_python_api/_autosummary/openvino.preprocess.InputTensorInfo.html#openvino.preprocess.InputTensorInfo.set_element_type>`__ , | |
| | `openvino.preprocess.InputTensorInfo.set_layout <https://docs.openvino.ai/2023.2/api/ie_python_api/_autosummary/openvino.preprocess.InputTensorInfo.html#openvino.preprocess.InputTensorInfo.set_layout>`__ , | |
| | `openvino.preprocess.InputTensorInfo.set_spatial_static_shape <https://docs.openvino.ai/2023.2/api/ie_python_api/_autosummary/openvino.preprocess.InputTensorInfo.html#openvino.preprocess.InputTensorInfo.set_spatial_static_shape>`__ , | |
| | `openvino.preprocess.PreProcessSteps.resize <https://docs.openvino.ai/2023.2/api/ie_python_api/_autosummary/openvino.preprocess.PreProcessSteps.html#openvino.preprocess.PreProcessSteps.resize>`__ , | |
| | `openvino.preprocess.InputModelInfo.set_layout <https://docs.openvino.ai/2023.2/api/ie_python_api/_autosummary/openvino.preprocess.InputModelInfo.html#openvino.preprocess.InputModelInfo.set_layout>`__ , | |
| | `openvino.preprocess.OutputTensorInfo.set_element_type <https://docs.openvino.ai/2023.2/api/ie_python_api/_autosummary/openvino.preprocess.OutputTensorInfo.html#openvino.preprocess.OutputTensorInfo.set_element_type>`__ , | |
| | `openvino.preprocess.PrePostProcessor.build <https://docs.openvino.ai/2023.2/api/ie_python_api/_autosummary/openvino.preprocess.PrePostProcessor.html#openvino.preprocess.PrePostProcessor.build>`__ | |
+-----------------------------+-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
.. tab-item:: Sample Code


@ -29,11 +29,11 @@ This sample demonstrates how to show OpenVINO™ Runtime devices and prints thei
+---------------------------------------+--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+----------------------------------------+
| Feature | API | Description |
+=======================================+============================================================================================================================================================================================+========================================+
| Basic | `openvino.runtime.Core <https://docs.openvino.ai/2023.2/api/ie_python_api/_autosummary/openvino.runtime.Core.html>`__ | Common API |
+---------------------------------------+--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+----------------------------------------+
| Query Device | `openvino.runtime.Core.available_devices <https://docs.openvino.ai/2023.2/api/ie_python_api/_autosummary/openvino.runtime.Core.html#openvino.runtime.Core.available_devices>`__ , | Get device properties |
| | `openvino.runtime.Core.get_metric <https://docs.openvino.ai/2023.2/api/ie_python_api/_autosummary/openvino.inference_engine.IECore.html#openvino.inference_engine.IECore.get_metric>`__ , | |
| | `openvino.runtime.Core.get_config <https://docs.openvino.ai/2023.2/api/ie_python_api/_autosummary/openvino.inference_engine.IECore.html#openvino.inference_engine.IECore.get_config>`__ | |
+---------------------------------------+--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+----------------------------------------+
.. tab-item:: Sample Code


@ -37,10 +37,10 @@ Models with only 1 input and output are supported.
+------------------------------------+--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+--------------------------------------+
| Feature | API | Description |
+====================================+================================================================================================================================================================================+======================================+
| Model Operations | `openvino.runtime.Model.reshape <https://docs.openvino.ai/2023.2/api/ie_python_api/_autosummary/openvino.runtime.Model.html#openvino.runtime.Model.reshape>`__ , | Managing of model |
| | `openvino.runtime.Model.input <https://docs.openvino.ai/2023.2/api/ie_python_api/_autosummary/openvino.runtime.Model.html#openvino.runtime.Model.input>`__ , | |
| | `openvino.runtime.Output.get_any_name <https://docs.openvino.ai/2023.2/api/ie_python_api/_autosummary/openvino.runtime.Output.html#openvino.runtime.Output.get_any_name>`__ , | |
| | `openvino.runtime.PartialShape <https://docs.openvino.ai/2023.2/api/ie_python_api/_autosummary/openvino.runtime.PartialShape.html>`__ | |
+------------------------------------+--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+--------------------------------------+
Basic OpenVINO™ Runtime API is covered by :doc:`Hello Classification Python* Sample <openvino_inference_engine_ie_bridges_python_sample_hello_classification_README>`.


@ -34,11 +34,11 @@ Models with only 1 input and output are supported.
+--------------------+-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+---------------------------+
| Feature | API | Description |
+====================+===========================================================================================================================================================================================================+===========================+
| Asynchronous Infer | `openvino.runtime.AsyncInferQueue <https://docs.openvino.ai/2023.2/api/ie_python_api/_autosummary/openvino.runtime.AsyncInferQueue.html>`__ , | Do asynchronous inference |
| | `openvino.runtime.AsyncInferQueue.set_callback <https://docs.openvino.ai/2023.2/api/ie_python_api/_autosummary/openvino.runtime.AsyncInferQueue.html#openvino.runtime.AsyncInferQueue.set_callback>`__ , | |
| | `openvino.runtime.AsyncInferQueue.start_async <https://docs.openvino.ai/2023.2/api/ie_python_api/_autosummary/openvino.runtime.AsyncInferQueue.html#openvino.runtime.AsyncInferQueue.start_async>`__ , | |
| | `openvino.runtime.AsyncInferQueue.wait_all <https://docs.openvino.ai/2023.2/api/ie_python_api/_autosummary/openvino.runtime.AsyncInferQueue.html#openvino.runtime.AsyncInferQueue.wait_all>`__ , | |
| | `openvino.runtime.InferRequest.results <https://docs.openvino.ai/2023.2/api/ie_python_api/_autosummary/openvino.runtime.InferRequest.html#openvino.runtime.InferRequest.results>`__ | |
+--------------------+-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+---------------------------+
Basic OpenVINO™ Runtime API is covered by :doc:`Hello Classification Python Sample <openvino_inference_engine_ie_bridges_python_sample_hello_classification_README>`.


@ -33,19 +33,19 @@ This sample demonstrates how to run inference using a :doc:`model <openvino_docs
+------------------------------------------+--------------------------------------------------------------------------------------------------------------------------------------------------------------+------------------------------------------------------------------------------------+
| Feature | API | Description |
+==========================================+==============================================================================================================================================================+====================================================================================+
| Model Operations | `openvino.runtime.Model <https://docs.openvino.ai/2023.2/api/ie_python_api/_autosummary/openvino.runtime.Model.html>`__ , | Managing of model |
| | `openvino.runtime.set_batch <https://docs.openvino.ai/2023.2/api/ie_python_api/_autosummary/openvino.runtime.set_batch.html>`__ , | |
| | `openvino.runtime.Model.input <https://docs.openvino.ai/2023.2/api/ie_python_api/_autosummary/openvino.runtime.Model.html#openvino.runtime.Model.input>`__ | |
+------------------------------------------+--------------------------------------------------------------------------------------------------------------------------------------------------------------+------------------------------------------------------------------------------------+
| Opset operations | `openvino.runtime.op.Parameter <https://docs.openvino.ai/2023.2/api/ie_python_api/_autosummary/openvino.runtime.op.Parameter.html>`__ , | Description of a model topology using OpenVINO Python API |
| | `openvino.runtime.op.Constant <https://docs.openvino.ai/2023.2/api/ie_python_api/_autosummary/openvino.runtime.op.Constant.html>`__ , | |
| | `openvino.runtime.opset8.convolution <https://docs.openvino.ai/2023.2/api/ie_python_api/_autosummary/openvino.runtime.opset8.convolution.html>`__ , | |
| | `openvino.runtime.opset8.add <https://docs.openvino.ai/2023.2/api/ie_python_api/_autosummary/openvino.runtime.opset8.add.html>`__ , | |
| | `openvino.runtime.opset1.max_pool <https://docs.openvino.ai/2023.2/api/ie_python_api/_autosummary/openvino.runtime.opset1.max_pool.html>`__ , | |
| | `openvino.runtime.opset8.reshape <https://docs.openvino.ai/2023.2/api/ie_python_api/_autosummary/openvino.runtime.opset8.reshape.html>`__ , | |
| | `openvino.runtime.opset8.matmul <https://docs.openvino.ai/2023.2/api/ie_python_api/_autosummary/openvino.runtime.opset8.matmul.html>`__ , | |
| | `openvino.runtime.opset8.relu <https://docs.openvino.ai/2023.2/api/ie_python_api/_autosummary/openvino.runtime.opset8.relu.html>`__ , | |
| | `openvino.runtime.opset8.softmax <https://docs.openvino.ai/2023.2/api/ie_python_api/_autosummary/openvino.runtime.opset8.softmax.html>`__ | |
+------------------------------------------+--------------------------------------------------------------------------------------------------------------------------------------------------------------+------------------------------------------------------------------------------------+
Basic OpenVINO™ Runtime API is covered by :doc:`Hello Classification Python* Sample <openvino_inference_engine_ie_bridges_python_sample_hello_classification_README>`.


@ -60,7 +60,7 @@ Below are example-codes for the regular and async-based approaches to compare:
The technique can be generalized to any available parallel slack. For example, you can do inference and simultaneously encode the resulting or previous frames or run further inference, like emotion detection on top of the face detection results.
Refer to the `Object Detection C++ Demo <https://docs.openvino.ai/2023.2/omz_demos_object_detection_demo_cpp.html>`__ , `Object Detection Python Demo <https://docs.openvino.ai/2023.2/omz_demos_object_detection_demo_python.html>`__ (latency-oriented Async API showcase) and :doc:`Benchmark App Sample <openvino_inference_engine_samples_benchmark_app_README>` for complete examples of the Async API in action.
.. note::


@ -110,8 +110,8 @@ Additional Resources
* :doc:`Layout API overview <openvino_docs_OV_UG_Layout_Overview>`
* :doc:`Model Optimizer - Optimize Preprocessing Computation <openvino_docs_MO_DG_Additional_Optimization_Use_Cases>`
* :doc:`Model Caching Overview <openvino_docs_OV_UG_Model_caching_overview>`
* The `ov::preprocess::PrePostProcessor <https://docs.openvino.ai/2023.2/classov_1_1preprocess_1_1PrePostProcessor.html#doxid-classov-1-1preprocess-1-1-pre-post-processor>`__ C++ class documentation
* The `ov::pass::Serialize <https://docs.openvino.ai/2023.2/classov_1_1pass_1_1Serialize.html#doxid-classov-1-1pass-1-1-serialize.html>`__ - pass to serialize model to XML/BIN
* The `ov::set_batch <https://docs.openvino.ai/2023.2/namespaceov.html#doxid-namespaceov-1a3314e2ff91fcc9ffec05b1a77c37862b.html>`__ - update batch dimension for a given model
@endsphinxdirective


@ -437,9 +437,9 @@ To build your project using CMake with the default build tools currently availab
Additional Resources
####################
* See the :doc:`OpenVINO Samples <openvino_docs_OV_UG_Samples_Overview>` page or the `Open Model Zoo Demos <https://docs.openvino.ai/2023.2/omz_demos.html>`__ page for specific examples of how OpenVINO pipelines are implemented for applications like image classification, text prediction, and many others.
* :doc:`OpenVINO™ Runtime Preprocessing <openvino_docs_OV_UG_Preprocessing_Overview>`
* :doc:`Using Encrypted Models with OpenVINO <openvino_docs_OV_UG_protecting_model_guide>`
* `Open Model Zoo Demos <https://docs.openvino.ai/2023.2/omz_demos.html>`__
@endsphinxdirective


@ -62,7 +62,7 @@ Model input dimensions can be specified as dynamic using the model.reshape metho
Some models may already have dynamic shapes out of the box and do not require additional configuration. This can either be because it was generated with dynamic shapes from the source framework, or because it was converted with Model Conversion API to use dynamic shapes. For more information, see the Dynamic Dimensions “Out of the Box” section.
The examples below show how to set dynamic dimensions with a model that has a static ``[1, 3, 224, 224]`` input shape (such as `mobilenet-v2 <https://docs.openvino.ai/2023.2/omz_models_model_mobilenet_v2.html>`__). The first example shows how to change the first dimension (batch size) to be dynamic. In the second example, the third and fourth dimensions (height and width) are set as dynamic.
.. tab-set::
@ -175,7 +175,7 @@ The lower and/or upper bounds of a dynamic dimension can also be specified. They
.. tab-item:: C
:sync: c
The dimension bounds can be coded as arguments for `ov_dimension <https://docs.openvino.ai/2023.2/structov_dimension.html#doxid-structov-dimension>`__, as shown in these examples:
.. doxygensnippet:: docs/snippets/ov_dynamic_shapes.c
:language: cpp


@ -190,8 +190,8 @@ In this case OpenVINO CMake scripts take `TBBROOT` environment variable into acc
[pugixml]:https://pugixml.org/
[ONNX]:https://onnx.ai/
[protobuf]:https://github.com/protocolbuffers/protobuf
[deployment manager]:https://docs.openvino.ai/2023.2/openvino_docs_install_guides_deployment_manager_tool.html
[OpenVINO Runtime Introduction]:https://docs.openvino.ai/2023.2/openvino_docs_OV_UG_Integrate_OV_with_your_application.html
[PDPD]:https://github.com/PaddlePaddle/Paddle
[TensorFlow]:https://www.tensorflow.org/
[TensorFlow Lite]:https://www.tensorflow.org/lite


@ -2,7 +2,7 @@
OpenVINO components provide various debug capabilities. For more information, see:
* [OpenVINO Model Debug Capabilities](https://docs.openvino.ai/2023.2/openvino_docs_OV_UG_Model_Representation.html#model-debug-capabilities)
* [OpenVINO Pass Manager Debug Capabilities](#todo)
## See also


@ -25,13 +25,13 @@ OpenVINO 2023.0
<li class="splide__slide">An open-source toolkit for optimizing and deploying deep learning models.<br>Boost your AI deep-learning inference performance!</li>
<li class="splide__slide">Better OpenVINO integration with PyTorch!<br>Use PyTorch models directly, without converting them first.<br>
<a href="https://docs.openvino.ai/2023.2/openvino_docs_MO_DG_prepare_model_convert_model_Convert_Model_From_PyTorch.html">Learn more...</a>
</li>
<li class="splide__slide">OpenVINO via PyTorch 2.0 torch.compile()<br>Use OpenVINO directly in PyTorch-native applications!<br>
<a href="https://docs.openvino.ai/2023.2/pytorch_2_0_torch_compile.html">Learn more...</a>
</li>
<li class="splide__slide">Do you like Generative AI? You will love how it performs with OpenVINO!<br>
<a href="https://docs.openvino.ai/2023.2/tutorials.html">Check out our new notebooks...</a>
</li>
</ul>
</div>
</section>


@ -3,7 +3,7 @@
<!--- The note below is intended for master branch only for pre-release purpose. Remove it for official releases. --->
> **NOTE**: This version is pre-release software and has not undergone full release validation or qualification. No support is offered on pre-release software and APIs/behavior are subject to change. It should NOT be incorporated into any production software/solution and instead should be used only for early testing and integration while awaiting a final release version of this software.
> **NOTE**: OpenVINO™ Development Tools package has been deprecated and will be discontinued with 2024.0 release. To learn more, refer to the [OpenVINO Legacy Features and Components page](https://docs.openvino.ai/2023.2/openvino_legacy_features.html).
Intel® Distribution of OpenVINO™ toolkit is an open-source toolkit for optimizing and deploying AI inference. It can be used to develop applications and solutions based on deep learning tasks such as emulation of human vision, automatic speech recognition, natural language processing, and recommendation systems. It provides high-performance and rich deployment options, from edge to cloud.
@ -128,7 +128,7 @@ For example, to install and configure the components for working with TensorFlow
## Troubleshooting
For general troubleshooting steps and issues, see [Troubleshooting Guide for OpenVINO Installation](https://docs.openvino.ai/2023.2/openvino_docs_get_started_guide_troubleshooting.html). The following sections also provide explanations to several error messages.
### Errors with Installing via PIP for Users in China


@ -5,7 +5,7 @@
Intel® Distribution of OpenVINO™ toolkit is an open-source toolkit for optimizing and deploying AI inference. It can be used to develop applications and solutions based on deep learning tasks such as emulation of human vision, automatic speech recognition, natural language processing, and recommendation systems. It provides high-performance and rich deployment options, from edge to cloud.
If you have already finished developing your models and converting them to the OpenVINO model format, you can install OpenVINO Runtime to deploy your applications on various devices. The [OpenVINO™](https://docs.openvino.ai/2023.2/openvino_docs_OV_UG_OV_Runtime_User_Guide.html) Python package includes a set of libraries for an easy inference integration with your products.
## System Requirements
@ -75,13 +75,13 @@ If installation was successful, you will see the list of available devices.
| Component | Content | Description |
|------------------|---------------------------------------------------------------------------------|---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| [OpenVINO Runtime](https://docs.openvino.ai/2023.2/openvino_docs_OV_UG_OV_Runtime_User_Guide.html) | `openvino package` |**OpenVINO Runtime** is a set of C++ libraries with C and Python bindings providing a common API to deliver inference solutions on the platform of your choice. Use the OpenVINO Runtime API to read PyTorch\*, TensorFlow\*, TensorFlow Lite\*, ONNX\*, and PaddlePaddle\* models and execute them on preferred devices. OpenVINO Runtime uses a plugin architecture and includes the following plugins: [CPU](https://docs.openvino.ai/2023.2/openvino_docs_OV_UG_supported_plugins_CPU.html), [GPU](https://docs.openvino.ai/2023.2/openvino_docs_OV_UG_supported_plugins_GPU.html), [Auto Batch](https://docs.openvino.ai/2023.2/openvino_docs_OV_UG_Automatic_Batching.html), [Auto](https://docs.openvino.ai/2023.2/openvino_docs_OV_UG_supported_plugins_AUTO.html), [Hetero](https://docs.openvino.ai/2023.2/openvino_docs_OV_UG_Hetero_execution.html).
| [OpenVINO Model Converter (OVC)](https://docs.openvino.ai/2023.2/openvino_docs_model_processing_introduction.html#convert-a-model-in-cli-ovc) | `ovc` |**OpenVINO Model Converter** converts models that were trained in popular frameworks to a format usable by OpenVINO components. <br>Supported frameworks include ONNX\*, TensorFlow\*, TensorFlow Lite\*, and PaddlePaddle\*. |
| [Benchmark Tool](https://docs.openvino.ai/2023.2/openvino_inference_engine_tools_benchmark_tool_README.html)| `benchmark_app` | **Benchmark Application** allows you to estimate deep learning inference performance on supported devices for synchronous and asynchronous modes. |
## Troubleshooting
-For general troubleshooting steps and issues, see [Troubleshooting Guide for OpenVINO Installation](https://docs.openvino.ai/2023.1/openvino_docs_get_started_guide_troubleshooting.html). The following sections also provide explanations to several error messages.
+For general troubleshooting steps and issues, see [Troubleshooting Guide for OpenVINO Installation](https://docs.openvino.ai/2023.2/openvino_docs_get_started_guide_troubleshooting.html). The following sections also provide explanations to several error messages.
### Errors with Installing via PIP for Users in China


@@ -59,7 +59,7 @@ OpenVINO provides bindings for different languages. To get the full list of supp
## Core developer topics
* [OpenVINO architecture](./docs/architecture.md)
-* [Plugin Development](https://docs.openvino.ai/2023.1/openvino_docs_ie_plugin_dg_overview.html)
+* [Plugin Development](https://docs.openvino.ai/2023.2/openvino_docs_ie_plugin_dg_overview.html)
* [Thread safety](#todo)
* [Performance](#todo)


@@ -25,7 +25,7 @@ People from the [openvino-c-api-maintainers](https://github.com/orgs/openvinotoo
OpenVINO C API has the following structure:
* [docs](./docs) contains developer documentation for OpenVINO C APIs.
-* [include](./include) contains all provided C API headers. [Learn more](https://docs.openvino.ai/2023.1/api/api_reference.html).
+* [include](./include) contains all provided C API headers. [Learn more](https://docs.openvino.ai/2023.2/api/api_reference.html).
* [src](./src) contains the implementations of all C APIs.
* [tests](./tests) contains all tests for OpenVINO C APIs. [Learn more](./docs/how_to_write_unit_test.md).
@@ -33,7 +33,7 @@ OpenVINO C API has the following structure:
## Tutorials
-* [How to integrate OpenVINO C API with Your Application](https://docs.openvino.ai/2023.1/openvino_docs_OV_UG_Integrate_OV_with_your_application.html)
+* [How to integrate OpenVINO C API with Your Application](https://docs.openvino.ai/2023.2/openvino_docs_OV_UG_Integrate_OV_with_your_application.html)
* [How to wrap OpenVINO objects with C](./docs/how_to_wrap_openvino_objects_with_c.md)
* [How to wrap OpenVINO interfaces with C](./docs/how_to_wrap_openvino_interfaces_with_c.md)
* [Samples implemented by OpenVINO C API](../../../samples/c/)
@@ -47,5 +47,5 @@ See [CONTRIBUTING](../../../CONTRIBUTING.md) for details.
## See also
* [OpenVINO™ README](../../../README.md)
-* [OpenVINO Runtime C API User Guide](https://docs.openvino.ai/2023.1/openvino_docs_OV_UG_Integrate_OV_with_your_application.html)
-* [Migration of OpenVINO C API](https://docs.openvino.ai/2023.1/openvino_2_0_transition_guide.html)
+* [OpenVINO Runtime C API User Guide](https://docs.openvino.ai/2023.2/openvino_docs_OV_UG_Integrate_OV_with_your_application.html)
+* [Migration of OpenVINO C API](https://docs.openvino.ai/2023.2/openvino_2_0_transition_guide.html)


@@ -78,4 +78,4 @@ The tensor create needs to specify the shape info, so C shape need to be convert
## See also
* [OpenVINO™ README](../../../../README.md)
* [C API developer guide](../README.md)
-* [C API Reference](https://docs.openvino.ai/2023.1/api/api_reference.html)
+* [C API Reference](https://docs.openvino.ai/2023.2/api/api_reference.html)


@@ -73,4 +73,4 @@ https://github.com/openvinotoolkit/openvino/blob/d96c25844d6cfd5ad131539c8a09282
## See also
* [OpenVINO™ README](../../../../README.md)
* [C API developer guide](../README.md)
-* [C API Reference](https://docs.openvino.ai/2023.1/api/api_reference.html)
+* [C API Reference](https://docs.openvino.ai/2023.2/api/api_reference.html)


@@ -14,5 +14,5 @@ https://github.com/openvinotoolkit/openvino/blob/d96c25844d6cfd5ad131539c8a09282
## See also
* [OpenVINO™ README](../../../../README.md)
* [C API developer guide](../README.md)
-* [C API Reference](https://docs.openvino.ai/2023.1/api/api_reference.html)
+* [C API Reference](https://docs.openvino.ai/2023.2/api/api_reference.html)


@@ -42,8 +42,8 @@ If you want to contribute to OpenVINO Python API, here is the list of learning m
* [OpenVINO™ README](../../../README.md)
* [OpenVINO™ Core Components](../../README.md)
-* [OpenVINO™ Python API Reference](https://docs.openvino.ai/2023.0/api/ie_python_api/api.html)
-* [OpenVINO™ Python API Advanced Inference](https://docs.openvino.ai/2023.0/openvino_docs_OV_UG_Python_API_inference.html)
-* [OpenVINO™ Python API Exclusives](https://docs.openvino.ai/2023.0/openvino_docs_OV_UG_Python_API_exclusives.html)
+* [OpenVINO™ Python API Reference](https://docs.openvino.ai/2023.2/api/ie_python_api/api.html)
+* [OpenVINO™ Python API Advanced Inference](https://docs.openvino.ai/2023.2/openvino_docs_OV_UG_Python_API_inference.html)
+* [OpenVINO™ Python API Exclusives](https://docs.openvino.ai/2023.2/openvino_docs_OV_UG_Python_API_exclusives.html)
* [pybind11 repository](https://github.com/pybind/pybind11)
* [pybind11 documentation](https://pybind11.readthedocs.io/en/stable/)


@@ -2,7 +2,7 @@
OpenVINO Core is a part of the OpenVINO Runtime library.
The component is responsible for:
-* Model representation - component provides classes for manipulation with models inside the OpenVINO Runtime. For more information please read [Model representation in OpenVINO Runtime User Guide](https://docs.openvino.ai/2023.0/openvino_docs_OV_UG_Model_Representation.html)
+* Model representation - component provides classes for manipulation with models inside the OpenVINO Runtime. For more information please read [Model representation in OpenVINO Runtime User Guide](https://docs.openvino.ai/2023.2/openvino_docs_OV_UG_Model_Representation.html)
* Operation representation - contains all out-of-the-box supported OpenVINO operations and opsets. For more information, read [Operations enabling flow guide](./docs/operation_enabling_flow.md).
* Model modification - component provides base classes that allow you to develop transformation passes for model modification. For more information, read [Transformation enabling flow guide](#todo).
@@ -27,7 +27,7 @@ OpenVINO Core has the next structure:
## Tutorials
* [How to add new operations](./docs/operation_enabling_flow.md).
-* [How to add OpenVINO Extension](https://docs.openvino.ai/2023.0/openvino_docs_Extensibility_UG_Intro.html). This document is based on the [template_extension](./template_extension/new/).
+* [How to add OpenVINO Extension](https://docs.openvino.ai/2023.2/openvino_docs_Extensibility_UG_Intro.html). This document is based on the [template_extension](./template_extension/new/).
* [How to debug the component](./docs/debug_capabilities.md).
## See also


@@ -18,7 +18,7 @@ OpenVINO Core API contains two folders:
## Main structures for model representation
-* `ov::Model` is located in [openvino/core/model.hpp](../include/openvino/core/model.hpp) and provides API for model representation. For more details, read [OpenVINO Model Representation Guide](https://docs.openvino.ai/2023.0/openvino_docs_OV_UG_Model_Representation.html).
+* `ov::Model` is located in [openvino/core/model.hpp](../include/openvino/core/model.hpp) and provides API for model representation. For more details, read [OpenVINO Model Representation Guide](https://docs.openvino.ai/2023.2/openvino_docs_OV_UG_Model_Representation.html).
* `ov::Node` is a base class for all OpenVINO operations, the class is located in the [openvino/core/node.hpp](../include/openvino/core/node.hpp).
* `ov::Shape` and `ov::PartialShape` classes represent shapes in OpenVINO, these classes are located in the [openvino/core/shape.hpp](../include/openvino/core/shape.hpp) and [openvino/core/partial_shape.hpp](../include/openvino/core/partial_shape.hpp) respectively. For more information, read [OpenVINO Shapes representation](./shape_propagation.md#openvino-shapes-representation).
* `ov::element::Type` class represents element type for OpenVINO Tensors and Operations. The class is located in the [openvino/core/type/element_type.hpp](../include/openvino/core/type/element_type.hpp).


@@ -2,7 +2,7 @@
OpenVINO Core contains a set of different debug capabilities that make developer life easier by collecting information about object statuses during OpenVINO Runtime execution and reporting this information to the developer.
-* OpenVINO Model debug capabilities are described in the [OpenVINO Model User Guide](https://docs.openvino.ai/2023.0/openvino_docs_OV_UG_Model_Representation.html#model-debug-capabilities).
+* OpenVINO Model debug capabilities are described in the [OpenVINO Model User Guide](https://docs.openvino.ai/2023.2/openvino_docs_OV_UG_Model_Representation.html#model-debug-capabilities).
## See also
* [OpenVINO™ Core README](../README.md)


@@ -21,7 +21,7 @@ OpenVINO Paddle Frontend has the following structure:
## Debug capabilities
-Developers can use OpenVINO Model debug capabilities that are described in the [OpenVINO Model User Guide](https://docs.openvino.ai/2023.0/openvino_docs_OV_UG_Model_Representation.html#model-debug-capabilities).
+Developers can use OpenVINO Model debug capabilities that are described in the [OpenVINO Model User Guide](https://docs.openvino.ai/2023.2/openvino_docs_OV_UG_Model_Representation.html#model-debug-capabilities).
## Tutorials


@@ -115,7 +115,7 @@ In rare cases, converting PyTorch operations requires transformation. The main
difference between transformation and translation is that transformation works on the graph rather
than on the `NodeContext` of a single operation. This means that some functionality
provided by `NodeContext` is not accessible in transformation and usually
-requires working with `PtFramworkNode` directly. [General rules](https://docs.openvino.ai/2023.1/openvino_docs_transformations.html)
+requires working with `PtFramworkNode` directly. [General rules](https://docs.openvino.ai/2023.2/openvino_docs_transformations.html)
for writing transformations also apply to PT FE transformations.
### PyTorch Frontend Layer Tests


@@ -140,15 +140,15 @@ The main rules for loaders implementation:
In rare cases, TensorFlow operation conversion requires two transformations (`Loader` and `Internal Transformation`).
In the first step, `Loader` must convert a TF operation into [Internal Operation](../tensorflow_common/helper_ops) that is used temporarily by the conversion pipeline.
-The internal operation implementation must also contain the `validate_and_infer_types()` method, similar to [OpenVINO Core](https://docs.openvino.ai/2023.0/groupov_ops_cpp_api.html) operations.
+The internal operation implementation must also contain the `validate_and_infer_types()` method, similar to [OpenVINO Core](https://docs.openvino.ai/2023.2/groupov_ops_cpp_api.html) operations.
Here is an example of an implementation for the internal operation `SparseFillEmptyRows` used to convert Wide and Deep models.
https://github.com/openvinotoolkit/openvino/blob/7f3c95c161bc78ab2aefa6eab8b008142fb945bc/src/frontends/tensorflow/src/helper_ops/sparse_fill_empty_rows.hpp#L17-L55
In the second step, `Internal Transformation` based on `ov::pass::MatcherPass` must convert sub-graphs with internal operations into sub-graphs consisting only of the OpenVINO opset.
-For more information about `ov::pass::MatcherPass` based transformations and their development, read [Overview of Transformations API](https://docs.openvino.ai/2023.0/openvino_docs_transformations.html)
-and [OpenVINO Matcher Pass](https://docs.openvino.ai/2023.0/openvino_docs_Extensibility_UG_matcher_pass.html) documentation.
+For more information about `ov::pass::MatcherPass` based transformations and their development, read [Overview of Transformations API](https://docs.openvino.ai/2023.2/openvino_docs_transformations.html)
+and [OpenVINO Matcher Pass](https://docs.openvino.ai/2023.2/openvino_docs_Extensibility_UG_matcher_pass.html) documentation.
The internal transformation must be called in the `ov::frontend::tensorflow::FrontEnd::normalize()` method.
It is important to check the order in which internal transformations are applied, to avoid situations
where one internal operation breaks a graph pattern that another internal transformation expects to match.


@@ -9,12 +9,12 @@ OpenVINO Inference API contains two folders:
Public OpenVINO Inference API defines global header [openvino/openvino.hpp](../include/openvino/openvino.hpp) which includes all common OpenVINO headers.
All Inference components are placed inside the [openvino/runtime](../include/openvino/runtime) folder.
-To learn more about the Inference API usage, read [How to integrate OpenVINO with your application](https://docs.openvino.ai/2023.0/openvino_docs_OV_UG_Integrate_OV_with_your_application.html).
+To learn more about the Inference API usage, read [How to integrate OpenVINO with your application](https://docs.openvino.ai/2023.2/openvino_docs_OV_UG_Integrate_OV_with_your_application.html).
The diagram with dependencies is presented on the [OpenVINO Architecture page](../../docs/architecture.md#openvino-inference-pipeline).
## Components of OpenVINO Developer API
-OpenVINO Developer API is required for OpenVINO plugin development. This process is described in the [OpenVINO Plugin Development Guide](https://docs.openvino.ai/2023.0/openvino_docs_ie_plugin_dg_overview.html).
+OpenVINO Developer API is required for OpenVINO plugin development. This process is described in the [OpenVINO Plugin Development Guide](https://docs.openvino.ai/2023.2/openvino_docs_ie_plugin_dg_overview.html).
## See also
* [OpenVINO™ Core README](../README.md)


@@ -20,7 +20,7 @@ The AUTO plugin follows the OpenVINO™ plugin architecture and consists of seve
* [src](./src/) - folder contains sources of the AUTO plugin.
* [tests](./tests/) - tests for Auto Plugin components.
-Learn more in the [OpenVINO™ Plugin Developer Guide](https://docs.openvino.ai/2023.0/openvino_docs_ie_plugin_dg_overview.html).
+Learn more in the [OpenVINO™ Plugin Developer Guide](https://docs.openvino.ai/2023.2/openvino_docs_ie_plugin_dg_overview.html).
## Architecture
The diagram below shows an overview of the components responsible for the basic inference flow:


@@ -8,8 +8,8 @@ AUTO is a meta plugin in OpenVINO that doesn't bind to a specific type of hard
The logic behind the choice is as follows:
* Check what supported devices are available.
-* Check performance hint of input setting (For detailed information of performance hint, please read more on the [ov::hint::PerformanceMode](https://docs.openvino.ai/2023.0/openvino_docs_OV_UG_Performance_Hints.html)).
-* Check precisions of the input model (for detailed information on precisions read more on the [ov::device::capabilities](https://docs.openvino.ai/2023.0/namespaceov_1_1device_1_1capability.html)).
+* Check performance hint of input setting (For detailed information of performance hint, please read more on the [ov::hint::PerformanceMode](https://docs.openvino.ai/2023.2/openvino_docs_OV_UG_Performance_Hints.html)).
+* Check precisions of the input model (for detailed information on precisions read more on the [ov::device::capabilities](https://docs.openvino.ai/2023.2/namespaceov_1_1device_1_1capability.html)).
* Select the highest-priority device capable of supporting the given model for the LATENCY and THROUGHPUT hints, or select all devices capable of supporting the given model for the CUMULATIVE THROUGHPUT hint.
* If the model's precision is FP32 but there is no device capable of supporting it, offload the model to a device supporting FP16.
@@ -21,7 +21,7 @@ The AUTO plugin is also the default plugin for OpenVINO, if the user does not se
Compiling the model to accelerator-optimized kernels may take some time. When AUTO selects one accelerator, it can start inference with the system's CPU by default, as it provides very low latency and can start inference with no additional delays. While the CPU is performing inference, AUTO continues to load the model to the device best suited for the purpose and transfers the task to it when ready.
-![alt text](https://docs.openvino.ai/2023.0/_images/autoplugin_accelerate.svg "AUTO cuts first inference latency (FIL) by running inference on the CPU until the GPU is ready")
+![alt text](https://docs.openvino.ai/2023.2/_images/autoplugin_accelerate.svg "AUTO cuts first inference latency (FIL) by running inference on the CPU until the GPU is ready")
The user can disable this acceleration feature by excluding CPU from the priority list or disabling `ov::intel_auto::enable_startup_fallback`. Its default value is `true`.


@@ -1,7 +1,7 @@
# AUTO Plugin Integration
## Implement a New Plugin
-Refer to [OpenVINO Plugin Developer Guide](https://docs.openvino.ai/2023.1/openvino_docs_ie_plugin_dg_overview.html) for detailed information on how to implement a new plugin.
+Refer to [OpenVINO Plugin Developer Guide](https://docs.openvino.ai/2023.2/openvino_docs_ie_plugin_dg_overview.html) for detailed information on how to implement a new plugin.
Implementing the query model method `ov::IPlugin::query_model()` is recommended, as it is important for AUTO to make decisions quickly and save selection time.


@@ -1,5 +1,5 @@
# FakeQuantize in OpenVINO
-https://docs.openvino.ai/2023.0/openvino_docs_ops_quantization_FakeQuantize_1.html
+https://docs.openvino.ai/2023.2/openvino_docs_ops_quantization_FakeQuantize_1.html
definition:
```


@@ -3,7 +3,7 @@
The CPU plugin supports several graph optimization algorithms, such as fusing or removing layers.
Refer to the sections below for details.
-> **NOTE**: For layer descriptions, see the [IR Notation Reference](https://docs.openvino.ai/2023.0/openvino_docs_ops_opset.html).
+> **NOTE**: For layer descriptions, see the [IR Notation Reference](https://docs.openvino.ai/2023.2/openvino_docs_ops_opset.html).
## Fusing Convolution and Simple Layers


@@ -28,7 +28,7 @@ Some Intel® CPUs might not have integrated GPU, so if you want to run OpenVINO
## 2. Make sure that OpenCL® Runtime is installed
-OpenCL runtime is a part of the GPU driver on Windows, but on Linux it should be installed separately. For the installation tips, refer to [OpenVINO docs](https://docs.openvino.ai/2023.0/openvino_docs_install_guides_installing_openvino_linux_header.html) and [OpenCL Compute Runtime docs](https://github.com/intel/compute-runtime/tree/master/opencl/doc).
+OpenCL runtime is a part of the GPU driver on Windows, but on Linux it should be installed separately. For the installation tips, refer to [OpenVINO docs](https://docs.openvino.ai/2023.2/openvino_docs_install_guides_installing_openvino_linux_header.html) and [OpenCL Compute Runtime docs](https://github.com/intel/compute-runtime/tree/master/opencl/doc).
To get support for Intel® Iris® Xe MAX Graphics on Linux, follow the [driver installation guide](https://dgpu-docs.intel.com/devices/iris-xe-max-graphics/index.html).
## 3. Make sure that the user has all required permissions to work with the GPU device
@@ -59,7 +59,7 @@ For more details, see the [OpenCL on Linux](https://github.com/bashbaug/OpenCLPa
## 7. If you are using dGPU with XMX, ensure that HW_MATMUL feature is recognized
-OpenVINO contains *hello_query_device* sample application: [link](https://docs.openvino.ai/2023.0/openvino_inference_engine_ie_bridges_python_sample_hello_query_device_README.html)
+OpenVINO contains *hello_query_device* sample application: [link](https://docs.openvino.ai/2023.2/openvino_inference_engine_ie_bridges_python_sample_hello_query_device_README.html)
With this option, you can check whether the Intel XMX (Xe Matrix Extensions) feature is properly recognized. XMX is a hardware feature that accelerates matrix operations and is available on some discrete GPUs.


@@ -5,7 +5,7 @@ but at some point clDNN became a part of OpenVINO, so now it's a part of overall
via embedding of [oneDNN library](https://github.com/oneapi-src/oneDNN)
OpenVINO GPU plugin is responsible for:
-1. [IE Plugin API](https://docs.openvino.ai/2023.0/openvino_docs_ie_plugin_dg_overview.html) implementation.
+1. [IE Plugin API](https://docs.openvino.ai/2023.2/openvino_docs_ie_plugin_dg_overview.html) implementation.
2. Translation of a model from common IE semantic (`ov::Function`) into a plugin-specific one (`cldnn::topology`), which is then compiled into
GPU graph representation (`cldnn::network`).
3. Implementation of OpenVINO operation set for Intel® GPU.


@@ -47,5 +47,5 @@ After the creation the proxy plugin has next properties:
* [OpenVINO Core Components](../../README.md)
* [OpenVINO Plugins](../README.md)
* [Developer documentation](../../../docs/dev/index.md)
-* [OpenVINO Plugin Developer Guide](https://docs.openvino.ai/2023.1/openvino_docs_ie_plugin_dg_overview.html)
+* [OpenVINO Plugin Developer Guide](https://docs.openvino.ai/2023.2/openvino_docs_ie_plugin_dg_overview.html)


@@ -35,11 +35,11 @@ $ make -j8
## Tutorials
-* [OpenVINO Plugin Developer Guide](https://docs.openvino.ai/2023.0/openvino_docs_ie_plugin_dg_overview.html)
+* [OpenVINO Plugin Developer Guide](https://docs.openvino.ai/2023.2/openvino_docs_ie_plugin_dg_overview.html)
## See also
* [OpenVINO™ README](../../../README.md)
* [OpenVINO Core Components](../../README.md)
* [OpenVINO Plugins](../README.md)
* [Developer documentation](../../../docs/dev/index.md)
-* [OpenVINO Plugin Developer Guide](https://docs.openvino.ai/2023.0/openvino_docs_ie_plugin_dg_overview.html)
+* [OpenVINO Plugin Developer Guide](https://docs.openvino.ai/2023.2/openvino_docs_ie_plugin_dg_overview.html)


@@ -12,14 +12,14 @@ and run on CPU with the OpenVINO&trade;.
The figure below shows the optimization workflow:
![](docs/images/workflow_simple.svg)
-To get started with the POT tool, refer to the corresponding OpenVINO&trade; [documentation](https://docs.openvino.ai/2023.1/openvino_docs_model_optimization_guide.html).
+To get started with the POT tool, refer to the corresponding OpenVINO&trade; [documentation](https://docs.openvino.ai/2023.2/openvino_docs_model_optimization_guide.html).
## Installation
### From PyPI
-POT is distributed as a part of OpenVINO&trade; Development Tools package. For installation instructions, refer to this [document](https://docs.openvino.ai/2023.1/openvino_docs_install_guides_install_dev_tools.html).
+POT is distributed as a part of OpenVINO&trade; Development Tools package. For installation instructions, refer to this [document](https://docs.openvino.ai/2023.2/openvino_docs_install_guides_install_dev_tools.html).
### From GitHub
-As prerequisites, you should install [OpenVINO&trade; Runtime](https://docs.openvino.ai/2023.1/openvino_docs_install_guides_overview.html) and other dependencies such as [Model Optimizer](https://docs.openvino.ai/2023.1/openvino_docs_MO_DG_Deep_Learning_Model_Optimizer_DevGuide.html) and [Accuracy Checker](https://docs.openvino.ai/2023.1/omz_tools_accuracy_checker.html).
+As prerequisites, you should install [OpenVINO&trade; Runtime](https://docs.openvino.ai/2023.2/openvino_docs_install_guides_overview.html) and other dependencies such as [Model Optimizer](https://docs.openvino.ai/2023.2/openvino_docs_MO_DG_Deep_Learning_Model_Optimizer_DevGuide.html) and [Accuracy Checker](https://docs.openvino.ai/2023.2/omz_tools_accuracy_checker.html).
To install POT from source:
- Clone OpenVINO repository
@@ -40,7 +40,7 @@ After installation POT is available as a Python library under `openvino.tools.po
OpenVINO provides several examples to demonstrate the POT optimization workflow:
* Command-line example:
-* [Quantization of Image Classification model](https://docs.openvino.ai/2023.1/pot_configs_examples_README.html)
+* [Quantization of Image Classification model](https://docs.openvino.ai/2023.2/pot_configs_examples_README.html)
* API tutorials:
* [Quantization of Image Classification model](https://github.com/openvinotoolkit/openvino_notebooks/tree/main/notebooks/301-tensorflow-training-openvino)
* [Quantization of Object Detection model from Model Zoo](https://github.com/openvinotoolkit/openvino_notebooks/tree/main/notebooks/111-yolov5-quantization-migration)
@@ -55,4 +55,4 @@ OpenVINO provides several examples to demonstrate the POT optimization workflow:
## See Also
-* [Performance Benchmarks](https://docs.openvino.ai/2023.1/openvino_docs_performance_benchmarks.html)
+* [Performance Benchmarks](https://docs.openvino.ai/2023.2/openvino_docs_performance_benchmarks.html)