[DOCS] minor post release tweaks (#19914)

Karol Blaszczak 2023-09-18 16:24:26 +02:00 committed by GitHub
parent 6df420ed67
commit dbab89f047
7 changed files with 53 additions and 34 deletions


@@ -15,18 +15,37 @@
openvino_docs_OV_Converter_UG_prepare_model_convert_model_Converting_Model
Every deep learning workflow begins with obtaining a model. You can choose to prepare a custom one, use a ready-made solution and adjust it to your needs, or even download and run a pre-trained network from an online database, such as `TensorFlow Hub <https://tfhub.dev/>`__, `Hugging Face <https://huggingface.co/>`__, or `Torchvision models <https://pytorch.org/hub/>`__.
Every deep learning workflow begins with obtaining a model. You can choose to prepare
a custom one, use a ready-made solution and adjust it to your needs, or even download
and run a pre-trained network from an online database, such as
`TensorFlow Hub <https://tfhub.dev/>`__, `Hugging Face <https://huggingface.co/>`__,
or `Torchvision models <https://pytorch.org/hub/>`__.
OpenVINO™ :doc:`supports several model formats <Supported_Model_Formats>` and can convert them into its own representation, `openvino.Model <api/ie_python_api/_autosummary/openvino.Model.html>`__ (`ov.Model <api/ie_python_api/_autosummary/openvino.runtime.Model.html>`__), providing a conversion API. Converted models can be used for inference with one or multiple OpenVINO Hardware plugins. There are two ways to use the conversion API: using a Python script or calling the ``ovc`` command line tool.
If your selected model is in one of the :doc:`OpenVINO™ supported model formats <Supported_Model_Formats>`,
you can use it directly, without the need to save it as the OpenVINO IR
(`openvino.Model <api/ie_python_api/_autosummary/openvino.Model.html>`__ /
`ov.Model <api/ie_python_api/_autosummary/openvino.runtime.Model.html>`__).
For this purpose, you can use the ``openvino.Core.read_model`` and ``openvino.Core.compile_model``
methods, so that conversion is performed automatically before inference, for
maximum convenience. Note that working with PyTorch differs slightly, as the Python API is
the only option, while TensorFlow may involve additional considerations, described in
:doc:`TensorFlow Frontend Capabilities and Limitations <openvino_docs_MO_DG_TensorFlow_Frontend>`.
For better performance and more optimization options, OpenVINO offers a conversion
API with two possible approaches: the Python API functions (``openvino.convert_model``
and ``openvino.save_model``) and the ``ovc`` command line tool, which are described in detail in this article.
.. note::
Prior to OpenVINO 2023.1, model conversion API was exposed as the ``openvino.tools.mo.convert_model``
function and the ``mo`` command line tool. Now, a new and simplified API is used: the
``openvino.convert_model`` function and the ``ovc`` command line tool.
Model conversion API prior to OpenVINO 2023.1 is considered deprecated.
Both existing and new projects are recommended to transition to the new
solutions, keeping in mind that they are not fully backwards compatible
with ``openvino.tools.mo.convert_model`` or the ``mo`` CLI tool.
For more details, see the :doc:`Model Conversion API Transition Guide <openvino_docs_OV_Converter_UG_prepare_model_convert_model_MO_OVC_transition>`.
All new projects are recommended to use the new tools, keeping in mind that they are not fully
backwards compatible. For more details, consult the :doc:`Model Conversion API Transition Guide <openvino_docs_OV_Converter_UG_prepare_model_convert_model_MO_OVC_transition>`.
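
As a brief illustration of the API change, using the same hypothetical ``model.onnx`` file
(the legacy calls are shown only as comments, since they are deprecated):

.. code-block:: python

   # Legacy API, deprecated since OpenVINO 2023.1:
   # from openvino.tools.mo import convert_model
   # ov_model = convert_model("model.onnx")

   # New, simplified API:
   import openvino as ov
   ov_model = ov.convert_model("model.onnx")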
Convert a Model in Python: ``convert_model``
##############################################
@@ -209,12 +228,8 @@ Another option for model conversion is to use ``ovc`` command-line tool, which s
The results of the ``ovc`` and ``openvino.convert_model``/``openvino.save_model`` conversion methods are the same, so you can choose whichever is more convenient. As long as the same set of parameters is used and the model is saved as OpenVINO IR, there is no difference in the resulting model.
Cases when Model Preparation is not Required
############################################
If a model is represented as a single file in the ONNX, PaddlePaddle, TensorFlow, or TensorFlow Lite format (check :doc:`TensorFlow Frontend Capabilities and Limitations <openvino_docs_MO_DG_TensorFlow_Frontend>`), it does not require a separate conversion and IR-saving step, that is, ``openvino.convert_model`` and ``openvino.save_model``, or ``ovc``.
OpenVINO provides C++ and Python APIs for reading such models simply by calling the ``openvino.Core.read_model`` or ``openvino.Core.compile_model`` methods. These methods convert the model from its original representation on the fly. While this conversion may take extra time compared to using a prepared OpenVINO IR, it is convenient when you need to read a model in its original format in C++, since ``openvino.convert_model`` is only available in Python. However, for efficient model deployment with the OpenVINO Runtime, it is still recommended to prepare the OpenVINO IR in advance and use it in your inference application.
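
A rough sketch of this workflow in Python, with a placeholder file name; the equivalent
``Core::read_model`` and ``Core::compile_model`` calls are also available in the C++ API:

.. code-block:: python

   import openvino as ov

   core = ov.Core()

   # Read a model in its original format; conversion happens on the fly.
   model = core.read_model("model.tflite")
   compiled_model = core.compile_model(model, "CPU")

   # For deployment, reading a previously saved OpenVINO IR ("model.xml")
   # avoids repeating the conversion step on every run.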
Additional Resources
####################


@@ -58,7 +58,7 @@ parameter to be set, for example:
Sometimes ``convert_model`` will produce a model whose inputs have dynamic rank or dynamic type.
Such a model may not be supported by the hardware chosen for inference. To avoid this issue,
use the ``input`` argument of ``convert_model``. For more information, refer to `Convert Models Represented as Python Objects <openvino_docs_MO_DG_Python_API>`.
use the ``input`` argument of ``convert_model``. For more information, refer to :doc:`Convert Models Represented as Python Objects <openvino_docs_MO_DG_Python_API>`.
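
For example, a static shape may be assigned to the model input at conversion time; the shape
below is a placeholder for a hypothetical single-input image model (input names and types can
be specified through the same argument as well):

.. code-block:: python

   import openvino as ov

   # Fixing the input to a static shape avoids dynamic rank in the converted
   # model; a static type could be specified through "input" too, if needed.
   ov_model = ov.convert_model("model.onnx", input=[1, 3, 224, 224])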
.. important::


@@ -13,8 +13,8 @@
API Reference <api/api_reference>
OpenVINO IR format and Operation Sets <openvino_ir>
Tool Ecosystem <openvino_ecosystem>
Legacy Features <openvino_legacy_features>
Tool Ecosystem <openvino_ecosystem>
OpenVINO Extensibility <openvino_docs_Extensibility_UG_Intro>
Media Processing and CV Libraries <media_processing_cv_libraries>
OpenVINO™ Security <openvino_docs_security_guide_introduction>


@@ -24,10 +24,10 @@ OpenVINO 2023.0
<ul class="splide__list">
<li class="splide__slide">An open-source toolkit for optimizing and deploying deep learning models.<br>Boost your AI deep-learning inference performance!</li>
<li class="splide__slide"Better OpenVINO integration with PyTorch!<br>Use PyTorch models directly, without converting them first.
<li class="splide__slide"Better OpenVINO integration with PyTorch!<br>Use PyTorch models directly, without converting them first.<br>
<a href="https://docs.openvino.ai/2023.1/openvino_docs_MO_DG_prepare_model_convert_model_Convert_Model_From_PyTorch.html">Learn more...</a>
</li>
<li class="splide__slide">OpenVINO via PyTorch 2.0 torch.compile()<br>Use OpenVINO directly in PyTorch-native applications!
<li class="splide__slide">OpenVINO via PyTorch 2.0 torch.compile()<br>Use OpenVINO directly in PyTorch-native applications!<br>
<a href="https://docs.openvino.ai/2023.1/pytorch_2_0_torch_compile.html">Learn more...</a>
</li>
<li class="splide__slide">Do you like Generative AI? You will love how it performs with OpenVINO!<br>


@@ -22,14 +22,15 @@
Use Docker <openvino_docs_install_guides_installing_openvino_docker>
If you want to install OpenVINO™ Runtime on your Linux machine, these are your options:
If you want to install OpenVINO™ Runtime on Linux, you have the following options:
* :doc:`Install OpenVINO Runtime using an Archive File <openvino_docs_install_guides_installing_openvino_from_archive_linux>`
* :doc:`Install OpenVINO using an Archive File <openvino_docs_install_guides_installing_openvino_from_archive_linux>`
* :doc:`Install OpenVINO using PyPI <openvino_docs_install_guides_installing_openvino_pip>`
* :doc:`Install OpenVINO Runtime using APT <openvino_docs_install_guides_installing_openvino_apt>`
* :doc:`Install OpenVINO Runtime using YUM <openvino_docs_install_guides_installing_openvino_yum>`
* :doc:`Install OpenVINO Runtime using Conda Forge <openvino_docs_install_guides_installing_openvino_conda>`
* :doc:`Install OpenVINO Runtime using Homebrew <openvino_docs_install_guides_installing_openvino_brew>`
* :doc:`Install OpenVINO using APT <openvino_docs_install_guides_installing_openvino_apt>`
* :doc:`Install OpenVINO using YUM <openvino_docs_install_guides_installing_openvino_yum>`
* :doc:`Install OpenVINO using Conda Forge <openvino_docs_install_guides_installing_openvino_conda>`
* :doc:`Install OpenVINO using vcpkg <openvino_docs_install_guides_installing_openvino_vcpkg>`
* :doc:`Install OpenVINO using Homebrew <openvino_docs_install_guides_installing_openvino_brew>`
* :doc:`Install OpenVINO using Docker <openvino_docs_install_guides_installing_openvino_docker>`


@@ -12,19 +12,21 @@
:maxdepth: 3
:hidden:
From Archive <openvino_docs_install_guides_installing_openvino_from_archive_macos>
Using Homebrew <openvino_docs_install_guides_installing_openvino_brew>
From PyPI <openvino_docs_install_guides_installing_openvino_pip>
Using Conda Forge <openvino_docs_install_guides_installing_openvino_conda>
Use Archive <openvino_docs_install_guides_installing_openvino_from_archive_macos>
Use Homebrew <openvino_docs_install_guides_installing_openvino_brew>
Use PyPI <openvino_docs_install_guides_installing_openvino_pip>
Use Conda Forge <openvino_docs_install_guides_installing_openvino_conda>
Use vcpkg <openvino_docs_install_guides_installing_openvino_vcpkg>
If you want to install OpenVINO™ Runtime on macOS, there are a few ways to accomplish this. We prepared following options for you:
If you want to install OpenVINO™ Runtime on macOS, you have the following options:
* :doc:`Install OpenVINO Runtime from an Archive File <openvino_docs_install_guides_installing_openvino_from_archive_macos>`
* :doc:`Install OpenVINO from PyPI <openvino_docs_install_guides_installing_openvino_pip>`
* :doc:`Install OpenVINO Runtime using Conda Forge <openvino_docs_install_guides_installing_openvino_conda>`
* :doc:`Install OpenVINO Runtime via Homebrew <openvino_docs_install_guides_installing_openvino_brew>`
* :doc:`Install OpenVINO using an Archive File <openvino_docs_install_guides_installing_openvino_from_archive_macos>`
* :doc:`Install OpenVINO using PyPI <openvino_docs_install_guides_installing_openvino_pip>`
* :doc:`Install OpenVINO using Conda Forge <openvino_docs_install_guides_installing_openvino_conda>`
* :doc:`Install OpenVINO using Homebrew <openvino_docs_install_guides_installing_openvino_brew>`
* :doc:`Install OpenVINO using vcpkg <openvino_docs_install_guides_installing_openvino_vcpkg>`


@@ -22,9 +22,10 @@
If you want to install OpenVINO™ Runtime on Windows, you have the following options:
* :doc:`Install OpenVINO Runtime from an Archive File <openvino_docs_install_guides_installing_openvino_from_archive_windows>`
* :doc:`Install OpenVINO Runtime using PyPI <openvino_docs_install_guides_installing_openvino_pip>`
* :doc:`Install OpenVINO Runtime using Conda Forge <openvino_docs_install_guides_installing_openvino_conda>`
* :doc:`Install OpenVINO using an Archive File <openvino_docs_install_guides_installing_openvino_from_archive_windows>`
* :doc:`Install OpenVINO using PyPI <openvino_docs_install_guides_installing_openvino_pip>`
* :doc:`Install OpenVINO using Conda Forge <openvino_docs_install_guides_installing_openvino_conda>`
* :doc:`Install OpenVINO using vcpkg <openvino_docs_install_guides_installing_openvino_vcpkg>`
* :doc:`Install OpenVINO using Docker <openvino_docs_install_guides_installing_openvino_docker>`