openvino/docs/OV_Runtime_UG/ShapeInference.md
Sebastian Golebiewski b1dcb276da Proofreading-OV-Runtime (#11658), 2022-07-08 13:34:45 +02:00

# Changing Input Shapes

@sphinxdirective

.. raw:: html

    <div id="switcher-cpp" class="switcher-anchor">C++</div>

@endsphinxdirective

OpenVINO™ allows changing the model input shape at runtime. This is useful when you want to feed the model an input of a size different from the model input shape. If you need to do this only once, prepare a model with updated shapes via Model Optimizer. See [Specifying --input_shape Command-line Parameter](@ref when_to_specify_input_shapes) for more information. For all other cases, follow the instructions below.

## Setting a New Input Shape with Reshape Method

The `ov::Model::reshape` method updates input shapes and propagates them down to the outputs of the model through all intermediate layers. For example, you can change the batch size and spatial dimensions of the input of a model with an image input:

*Figure: shape_inference_explained*

Consider the code below to achieve that:

@snippet snippets/ShapeInference.cpp picture_snippet

## Setting a New Batch Size with the set_batch Method

The meaning of the model batch may vary depending on the model design. To change the batch dimension of the model, [set the ov::Layout](@ref declare_model_s_layout) and call the `ov::set_batch` method.

@snippet snippets/ShapeInference.cpp set_batch

The `ov::set_batch` method is a high-level API on top of the `ov::Model::reshape` functionality, so everything stated about the implications of the `ov::Model::reshape` method applies to `ov::set_batch` as well, including the troubleshooting section.

Once the input shape of ov::Model is set, call the ov::Core::compile_model method to get an ov::CompiledModel object for inference with updated shapes.

There are other approaches to change model input shapes during the stage of [IR generation](@ref when_to_specify_input_shapes) or ov::Model creation.

## Dynamic Shape Notice

Shape-changing functionality can be used to turn a dynamic model input into a static one and vice versa. It is recommended to always set static shapes when the shape of the data is not going to change from one inference to another. Setting static shapes avoids possible functional limitations as well as memory and runtime overheads of dynamic shapes, which may vary depending on the hardware plugin and the model used. To learn more about dynamic shapes in OpenVINO, see the Dynamic Shapes page.

## Usage of the Reshape Method

The primary method of the feature is ov::Model::reshape. It is overloaded to better serve two main use cases:

  1. To change the input shape of a model with a single input, you may pass a new shape to the method. See the example of adjusting spatial dimensions to the input image below:

    @snippet snippets/ShapeInference.cpp spatial_reshape

To do the opposite, that is, to resize the input image to match the input shape of the model, use the pre-processing API.

  2. Otherwise, you can express the reshape plan via a mapping of inputs to their new shapes:
  • `map<ov::Output<ov::Node>, ov::PartialShape>` specifies input by passing the actual input port
  • `map<size_t, ov::PartialShape>` specifies input by its index
  • `map<string, ov::PartialShape>` specifies input by its name

@sphinxdirective

.. tab:: Port

   .. doxygensnippet:: docs/snippets/ShapeInference.cpp
      :language: cpp
      :fragment: [obj_to_shape]

.. tab:: Index

   .. doxygensnippet:: docs/snippets/ShapeInference.cpp
      :language: cpp
      :fragment: [idx_to_shape]

.. tab:: Tensor Name

   .. doxygensnippet:: docs/snippets/ShapeInference.cpp
      :language: cpp
      :fragment: [name_to_shape]

@endsphinxdirective

The usage scenarios of the reshape feature can be found in OpenVINO Samples, starting with the Hello Reshape Sample.

In practice, some models are not ready to be reshaped. In such cases, a new input shape cannot be set with Model Optimizer or the ov::Model::reshape method.

@anchor troubleshooting_reshape_errors

## Troubleshooting Reshape Errors

Operation semantics may impose restrictions on the input shapes of the operation. A shape collision during shape propagation may be a sign that a new shape does not satisfy the restrictions. Changing the model input shape may result in a shape collision in intermediate operations.

Examples of such operations:

  • The Reshape operation with a hard-coded output shape value.
  • The MatMul operation with a constant second input, which cannot be resized along spatial dimensions due to operation semantics.

Model structure and logic should not change significantly after model reshaping.

  • The Global Pooling operation is commonly used to reduce the output feature map of classification models. Having an input of shape [N, C, H, W], Global Pooling returns an output of shape [N, C, 1, 1]. Model architects usually express Global Pooling with the help of a Pooling operation with the fixed kernel size [H, W]. During a spatial reshape, given an input of shape [N, C, H1, W1], Pooling with the fixed kernel size [H, W] returns an output of shape [N, C, H2, W2], where H2 and W2 are commonly not equal to 1. This breaks the classification model structure. For example, the publicly available Inception family models from TensorFlow have this issue.

  • Changing the model input shape may significantly affect its accuracy. For example, Object Detection models from TensorFlow have resizing restrictions by design. To keep the model valid after the reshape, choose a new input shape that satisfies the conditions listed in the pipeline.config file. For details, refer to the [TensorFlow Object Detection API models resizing techniques](@ref custom-input-shape).
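The Global Pooling point above comes down to simple window arithmetic. A minimal sketch in plain Python (not an OpenVINO API; `pool_output_hw` is a hypothetical helper assuming valid padding and stride 1):

```python
def pool_output_hw(in_hw, kernel_hw, stride_hw=(1, 1)):
    """Spatial output size of a pooling window with valid (no) padding."""
    return tuple((i - k) // s + 1 for i, k, s in zip(in_hw, kernel_hw, stride_hw))

# Original model: a Pooling with the kernel fixed to the full [H, W] = [7, 7]
# feature map behaves as Global Pooling:
print(pool_output_hw((7, 7), (7, 7)))    # (1, 1)

# After a spatial reshape the feature map grows, but the kernel stays [7, 7],
# so the output is no longer [N, C, 1, 1] and the classifier head breaks:
print(pool_output_hw((10, 10), (7, 7)))  # (4, 4)
```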

@anchor how-to-fix-non-reshape-able-model

## How To Fix Non-Reshape-able Model

Some operators that prevent normal shape propagation can be fixed. To do so, you can:

  • see if the issue can be fixed by changing the values of some operators' inputs. For example, the most common problem of non-reshape-able models is a Reshape operator with a hard-coded output shape. You can cut off the hard-coded second input of Reshape and fill it with relaxed values. For the example in the picture below, the Model Optimizer command would be:
`mo --input_model path/to/model --input data[8,3,224,224],1:reshaped[2]->[0 -1]`

With 1:reshaped[2], it is requested to cut the second input (counting from zero, so 1: means the second input) of the operation named reshaped and replace it with a Parameter of shape [2]. With ->[0 -1], this new Parameter is replaced by a Constant operator with the value [0, -1]. Since the Reshape operator treats 0 and -1 as special values (see their meaning in this specification), shapes can propagate freely without losing the intended meaning of Reshape.
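The effect of those special values can be sketched in plain Python. This is a toy re-implementation of the Reshape output-shape rules from the specification, not OpenVINO code; `resolve_reshape` is a hypothetical helper:

```python
from math import prod

def resolve_reshape(input_shape, target):
    """Resolve Reshape's special output-shape values:
    0 copies the dimension at the same index from the input,
    -1 is inferred so that the total element count is preserved."""
    out = [input_shape[i] if t == 0 else t for i, t in enumerate(target)]
    if -1 in out:
        known = prod(d for d in out if d != -1)
        out[out.index(-1)] = prod(input_shape) // known
    return out

# [0, -1] keeps the batch dimension and flattens the rest, whatever the
# spatial size is, which is what makes the model reshape-able:
print(resolve_reshape([8, 3, 224, 224], [0, -1]))  # [8, 150528]
print(resolve_reshape([8, 3, 448, 448], [0, -1]))  # [8, 602112]
```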

*Figure: batch_relaxed*

  • transform the model during Model Optimizer conversion on the back phase. For more information, see the Model Optimizer extension.
  • transform OpenVINO Model during the runtime. For more information, see OpenVINO Runtime Transformations.
  • modify the original model with the help of the original framework.

## Extensibility

OpenVINO provides a special mechanism that allows adding support for shape inference of custom operations. This mechanism is described in the Extensibility documentation.

## Introduction (Python)

@sphinxdirective

.. raw:: html

    <div id="switcher-python" class="switcher-anchor">Python</div>

@endsphinxdirective

OpenVINO™ allows changing the model input shape at runtime. This is useful when you want to feed the model an input of a size different from the model input shape. If you need to do this only once, prepare a model with updated shapes via Model Optimizer. See [specifying input shapes](@ref when_to_specify_input_shapes) for more information. For all other cases, follow the instructions below.

## Setting a New Input Shape with Reshape Method

The Model.reshape method updates input shapes and propagates them down to the outputs of the model through all intermediate layers. For example, you can change the batch size and spatial dimensions of the input of a model with an image input:

*Figure: shape_inference_explained*

Consider the code below to achieve that:

@sphinxdirective

.. doxygensnippet:: docs/snippets/ShapeInference.py
   :language: python
   :fragment: [picture_snippet]

@endsphinxdirective

## Setting a New Batch Size with the set_batch Method

The meaning of the model batch may vary depending on the model design. To change the batch dimension of the model, [set the layout](@ref declare_model_s_layout) for inputs and call the set_batch method.

@sphinxdirective

.. doxygensnippet:: docs/snippets/ShapeInference.py
   :language: python
   :fragment: [set_batch]

@endsphinxdirective

The set_batch method is a high-level API on top of the Model.reshape functionality, so everything stated about the implications of the Model.reshape method applies to set_batch as well, including the troubleshooting section.
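Conceptually, set_batch finds the batch dimension through the layout and rewrites only that dimension before delegating to Model.reshape. A rough plain-Python illustration of that lookup (not the actual implementation; `set_batch_dim` is a hypothetical helper):

```python
def set_batch_dim(shape, layout, new_batch):
    """Return a copy of `shape` with the batch ('N') dimension replaced,
    where `layout` is a layout string such as "NCHW"."""
    out = list(shape)
    out[layout.index("N")] = new_batch
    return out

print(set_batch_dim([1, 3, 224, 224], "NCHW", 8))  # [8, 3, 224, 224]
print(set_batch_dim([224, 224, 3, 1], "HWCN", 8))  # [224, 224, 3, 8]
```

This is also why a layout must be set first: without knowing which dimension is 'N', the batch cannot be located reliably.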

Once the input shape of Model is set, call the compile_model method to get a CompiledModel object for inference with updated shapes.

There are other approaches to change model input shapes during the stage of [IR generation](@ref when_to_specify_input_shapes) or Model creation.

Dynamic Shape Notice

Shape-changing functionality can be used to turn a dynamic model input into a static one, and vice versa. It is recommended to always set static shapes when the shape of data is not going to change from one inference to another. Setting static shapes avoids possible functional limitations as well as memory and runtime overheads of dynamic shapes, which may vary depending on the hardware plugin and the model used. To learn more about dynamic shapes in OpenVINO, see the Dynamic Shapes article.

Usage of the Reshape Method

The primary method of the feature is Model.reshape. It is overloaded to better serve two main use cases:

  1. To change the input shape of a model with a single input, pass the new shape to the method. See the example of adjusting the spatial dimensions to the input image:

@sphinxdirective

.. doxygensnippet:: docs/snippets/ShapeInference.py :language: python :fragment: [simple_spatials_change]

@endsphinxdirective

To do the opposite, that is, to resize the input image to match the input shape of the model, use the pre-processing API.

  2. Otherwise, you can express the reshape plan as a dictionary mapping each input to its new shape. Dictionary keys can be:
  • a str key, specifying the input by its name.
  • an int key, specifying the input by its index.
  • an openvino.runtime.Output key, specifying the input by passing the actual input object.

Dictionary values (representing the new shapes) can be:

  • list
  • tuple
  • PartialShape

@sphinxdirective

.. tab:: Port

.. doxygensnippet:: docs/snippets/ShapeInference.py
   :language: python
   :fragment: [obj_to_shape]

.. tab:: Index

.. doxygensnippet:: docs/snippets/ShapeInference.py
   :language: python
   :fragment: [idx_to_shape]

.. tab:: Tensor Name

.. doxygensnippet:: docs/snippets/ShapeInference.py
   :language: python
   :fragment: [name_to_shape]

@endsphinxdirective

The usage scenarios of the reshape feature can be found in OpenVINO Samples, starting with the Hello Reshape Sample.

In practice, some models are not ready to be reshaped. In such cases, a new input shape cannot be set with either Model Optimizer or the Model.reshape method.

Troubleshooting Reshape Errors

Operation semantics may impose restrictions on the input shapes of an operation. A shape collision during shape propagation may be a sign that a new shape does not satisfy these restrictions; changing the model input shape may therefore cause shape collisions in intermediate operations.

Examples of such operations:

  • a Reshape operation with a hard-coded output shape value
  • a MatMul operation with a constant second input, which cannot be resized by spatial dimensions due to the operation semantics

Model structure and logic should not change significantly after model reshaping.

  • The Global Pooling operation is commonly used to reduce the output feature maps of classification models. Having the input of the shape [N, C, H, W], Global Pooling returns the output of the shape [N, C, 1, 1]. Model architects usually express Global Pooling with the help of the Pooling operation with a fixed kernel size of [H, W]. During a spatial reshape, having the input of the shape [N, C, H1, W1], Pooling with the fixed kernel size [H, W] returns the output of the shape [N, C, H2, W2], where H2 and W2 are commonly not equal to 1. It breaks the classification model structure. For example, the publicly available Inception family models from TensorFlow have this issue.

  • Changing the model input shape may significantly affect its accuracy. For example, Object Detection models from TensorFlow have resizing restrictions by design. To keep the model valid after the reshape, choose a new input shape that satisfies the conditions listed in the pipeline.config file. For details, refer to the [TensorFlow Object Detection API models resizing techniques](@ref custom-input-shape).
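
The Global Pooling pitfall above is plain arithmetic, which a few lines of pure Python make explicit. The helper below is hypothetical, computing the output spatial size of a pooling with a fixed kernel, stride 1, and no padding:

```python
# Pure-Python check of the Global Pooling pitfall: output spatial size of
# a pooling with a fixed kernel, stride 1, no padding.
def pool_out(dim, kernel, stride=1):
    return (dim - kernel) // stride + 1

# Original model: 224x224 input with kernel fixed to [224, 224] gives a
# 1x1 output, i.e. Global Pooling behaviour.
print(pool_out(224, 224))  # 1

# After a spatial reshape to 448x448 the kernel stays [224, 224], so the
# "global" pooling no longer returns 1x1 and the classifier head breaks.
print(pool_out(448, 224))  # 225
```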

How to Fix a Non-Reshape-able Model

Some operators that prevent normal shape propagation can be fixed. To do so, you can:

  • see if the issue can be fixed by changing the values of some operator inputs. For example, the most common problem of non-reshape-able models is a Reshape operator with a hard-coded output shape. You can cut off the hard-coded second input of Reshape and fill it in with relaxed values. For the example in the picture below, the Model Optimizer CLI command would be:

```sh
mo --input_model path/to/model --input data[8,3,224,224],1:reshaped[2]->[0 -1]
```

With 1:reshaped[2], it is requested to cut the second input (counting from zero, so 1: means the second input) of the operation named reshaped and replace it with a Parameter of shape [2]. With ->[0 -1], this new Parameter is then replaced by a Constant operator with the value [0, -1]. Since the Reshape operator treats 0 and -1 as special values (see their meaning in this specification), shapes can propagate freely without losing the intended meaning of Reshape.
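
These special values can be illustrated with a small pure-Python helper that mirrors the Reshape semantics described above: 0 copies the corresponding input dimension, and -1 is inferred from the remaining element count.

```python
# Pure-Python illustration of Reshape's special target values:
# 0 copies the input dimension at the same position, -1 is inferred.
from functools import reduce

def resolve_reshape(input_shape, target):
    # Replace each 0 with the input dimension at the same index.
    out = [input_shape[i] if t == 0 else t for i, t in enumerate(target)]
    total = reduce(lambda x, y: x * y, input_shape, 1)
    # Infer the single -1 dimension from the total element count.
    if -1 in out:
        known = reduce(lambda x, y: x * y, (d for d in out if d != -1), 1)
        out[out.index(-1)] = total // known
    return out

# [0, -1] keeps the batch dimension and flattens everything else, so the
# Reshape stays valid for any batch size:
print(resolve_reshape([8, 3, 224, 224], [0, -1]))  # [8, 150528]
```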

batch_relaxed

Extensibility

OpenVINO provides a special mechanism that enables adding shape inference support for custom operations. This mechanism is described in the Extensibility documentation.