From dbab89f04706c597c6a08ab68e91ae1ec2eaaeb2 Mon Sep 17 00:00:00 2001 From: Karol Blaszczak Date: Mon, 18 Sep 2023 16:24:26 +0200 Subject: [PATCH] [DOCS] minor post release tweaks (#19914) --- docs/Documentation/model_introduction.md | 37 +++++++++++++------ .../Convert_Model_From_PyTorch.md | 2 +- docs/documentation.md | 4 +- docs/home.rst | 4 +- .../installing-openvino-linux-header.md | 13 ++++--- .../installing-openvino-macos-header.md | 20 +++++----- .../installing-openvino-windows-header.md | 7 ++-- 7 files changed, 53 insertions(+), 34 deletions(-) diff --git a/docs/Documentation/model_introduction.md b/docs/Documentation/model_introduction.md index ad38d118a01..cd24fccbec8 100644 --- a/docs/Documentation/model_introduction.md +++ b/docs/Documentation/model_introduction.md @@ -15,18 +15,37 @@ openvino_docs_OV_Converter_UG_prepare_model_convert_model_Converting_Model -Every deep learning workflow begins with obtaining a model. You can choose to prepare a custom one, use a ready-made solution and adjust it to your needs, or even download and run a pre-trained network from an online database, such as `TensorFlow Hub `__, `Hugging Face `__, or `Torchvision models `__. +Every deep learning workflow begins with obtaining a model. You can choose to prepare +a custom one, use a ready-made solution and adjust it to your needs, or even download +and run a pre-trained network from an online database, such as +`TensorFlow Hub `__, `Hugging Face `__, +or `Torchvision models `__. -OpenVINO™ :doc:`supports several model formats ` and can convert them into its own representation, `openvino.Model `__ (`ov.Model `__), providing a conversion API. Converted models can be used for inference with one or multiple OpenVINO Hardware plugins. There are two ways to use the conversion API: using a Python script or calling the ``ovc`` command line tool. 
+If your selected model is in one of the :doc:`OpenVINO™ supported model formats `,
+you can use it directly, without the need to save it as OpenVINO IR
+(`openvino.Model `__, also shortened to
+`ov.Model `__).
+For this purpose, you can use the ``openvino.Core.read_model`` and ``openvino.Core.compile_model``
+methods, so that conversion is performed automatically before inference, for
+maximum convenience. Note that working with PyTorch differs slightly, as the Python API
+is the only option, while TensorFlow may present additional considerations:
+:doc:`TensorFlow Frontend Capabilities and Limitations `.
+
+
+For better performance and more optimization options, OpenVINO offers a conversion
+API with two possible approaches: the Python API functions (``openvino.convert_model``
+and ``openvino.save_model``) and the ``ovc`` command line tool, both described in detail in this article.

.. note::

-   Prior to OpenVINO 2023.1, model conversion API was exposed as the ``openvino.tools.mo.convert_model``
-   function and the ``mo`` command line tool. Now, a new and simplified API is used: the
-   ``openvino.convert_model`` function and the ``ovc`` command line tool.
+   The model conversion API offered prior to OpenVINO 2023.1 is considered deprecated.
+   Existing and new projects alike are advised to transition to the new
+   solutions, keeping in mind that they are not fully backwards compatible
+   with ``openvino.tools.mo.convert_model`` or the ``mo`` CLI tool.
+   For more details, see the :doc:`Model Conversion API Transition Guide `.
+
+

-   All new projects are recommended to use the new tools, keeping in mind that they are not fully
-   backwards compatible. For more details, consult the :doc:`Model Conversion API Transition Guide `.
Convert a Model in Python: ``convert_model`` ############################################## @@ -209,12 +228,8 @@ Another option for model conversion is to use ``ovc`` command-line tool, which s The results of both ``ovc`` and ``openvino.convert_model``/``openvino.save_model`` conversion methods are the same. You can choose either of them based on your convenience. Note that there should not be any differences in the results of model conversion if the same set of parameters is used and the model is saved into OpenVINO IR. -Cases when Model Preparation is not Required -############################################ -If a model is represented as a single file from ONNX, PaddlePaddle, TensorFlow and TensorFlow Lite (check :doc:`TensorFlow Frontend Capabilities and Limitations `), it does not require a separate conversion and IR-saving step, that is ``openvino.convert_model`` and ``openvino.save_model``, or ``ovc``. -OpenVINO provides C++ and Python APIs for reading such models by just calling the ``openvino.Core.read_model`` or ``openvino.Core.compile_model`` methods. These methods perform conversion of the model from the original representation. While this conversion may take extra time compared to using prepared OpenVINO IR, it is convenient when you need to read a model in the original format in C++, since ``openvino.convert_model`` is only available in Python. However, for efficient model deployment with the OpenVINO Runtime, it is still recommended to prepare OpenVINO IR and then use it in your inference application. 
Additional Resources #################### diff --git a/docs/MO_DG/prepare_model/convert_model/Convert_Model_From_PyTorch.md b/docs/MO_DG/prepare_model/convert_model/Convert_Model_From_PyTorch.md index 055e94049a7..0cafd306653 100644 --- a/docs/MO_DG/prepare_model/convert_model/Convert_Model_From_PyTorch.md +++ b/docs/MO_DG/prepare_model/convert_model/Convert_Model_From_PyTorch.md @@ -58,7 +58,7 @@ parameter to be set, for example: Sometimes ``convert_model`` will produce inputs of the model with dynamic rank or dynamic type. Such model may not be supported by the hardware chosen for inference. To avoid this issue, -use the ``input`` argument of ``convert_model``. For more information, refer to `Convert Models Represented as Python Objects `. +use the ``input`` argument of ``convert_model``. For more information, refer to :doc:`Convert Models Represented as Python Objects `. .. important:: diff --git a/docs/documentation.md b/docs/documentation.md index a25e784165b..276e4e6e093 100644 --- a/docs/documentation.md +++ b/docs/documentation.md @@ -12,9 +12,9 @@ :hidden: API Reference - OpenVINO IR format and Operation Sets + OpenVINO IR format and Operation Sets + Legacy Features Tool Ecosystem - Legacy Features OpenVINO Extensibility Media Processing and CV Libraries OpenVINO™ Security diff --git a/docs/home.rst b/docs/home.rst index d8f359e65aa..4ed32d3aea2 100644 --- a/docs/home.rst +++ b/docs/home.rst @@ -24,10 +24,10 @@ OpenVINO 2023.0
  • An open-source toolkit for optimizing and deploying deep learning models.
    Boost your AI deep-learning inference performance!
  • -
  • Use PyTorch models directly, without converting them first. +
  • Use PyTorch models directly, without converting them first.
    Learn more...
  • -
  • OpenVINO via PyTorch 2.0 torch.compile()
    Use OpenVINO directly in PyTorch-native applications! +
  • OpenVINO via PyTorch 2.0 torch.compile()
    Use OpenVINO directly in PyTorch-native applications!
    Learn more...
  • Do you like Generative AI? You will love how it performs with OpenVINO!
    diff --git a/docs/install_guides/installing-openvino-linux-header.md b/docs/install_guides/installing-openvino-linux-header.md index f0bb87d87f0..a45b11d20e2 100644 --- a/docs/install_guides/installing-openvino-linux-header.md +++ b/docs/install_guides/installing-openvino-linux-header.md @@ -22,14 +22,15 @@ Use Docker -If you want to install OpenVINO™ Runtime on your Linux machine, these are your options: +If you want to install OpenVINO™ Runtime on Linux, you have the following options: -* :doc:`Install OpenVINO Runtime using an Archive File ` +* :doc:`Install OpenVINO using an Archive File ` * :doc:`Install OpenVINO using PyPI ` -* :doc:`Install OpenVINO Runtime using APT ` -* :doc:`Install OpenVINO Runtime using YUM ` -* :doc:`Install OpenVINO Runtime using Conda Forge ` -* :doc:`Install OpenVINO Runtime using Homebrew ` +* :doc:`Install OpenVINO using APT ` +* :doc:`Install OpenVINO using YUM ` +* :doc:`Install OpenVINO using Conda Forge ` +* :doc:`Install OpenVINO using vcpkg ` +* :doc:`Install OpenVINO using Homebrew ` * :doc:`Install OpenVINO using Docker ` diff --git a/docs/install_guides/installing-openvino-macos-header.md b/docs/install_guides/installing-openvino-macos-header.md index dff827ce9a8..2e0d70b61d0 100644 --- a/docs/install_guides/installing-openvino-macos-header.md +++ b/docs/install_guides/installing-openvino-macos-header.md @@ -12,19 +12,21 @@ :maxdepth: 3 :hidden: - From Archive - Using Homebrew - From PyPI - Using Conda Forge + Use Archive + Use Homebrew + Use PyPI + Use Conda Forge Use vcpkg -If you want to install OpenVINO™ Runtime on macOS, there are a few ways to accomplish this. 
We prepared following options for you: +If you want to install OpenVINO™ Runtime on macOS, you have the following options: -* :doc:`Install OpenVINO Runtime from an Archive File ` -* :doc:`Install OpenVINO from PyPI ` -* :doc:`Install OpenVINO Runtime using Conda Forge ` -* :doc:`Install OpenVINO Runtime via Homebrew ` + +* :doc:`Install OpenVINO using an Archive File ` +* :doc:`Install OpenVINO using PyPI ` +* :doc:`Install OpenVINO using Conda Forge ` +* :doc:`Install OpenVINO using Homebrew ` +* :doc:`Install OpenVINO using vcpkg ` diff --git a/docs/install_guides/installing-openvino-windows-header.md b/docs/install_guides/installing-openvino-windows-header.md index 3044c2accef..65b1803ec71 100644 --- a/docs/install_guides/installing-openvino-windows-header.md +++ b/docs/install_guides/installing-openvino-windows-header.md @@ -22,9 +22,10 @@ If you want to install OpenVINO™ Runtime on Windows, you have the following options: -* :doc:`Install OpenVINO Runtime from an Archive File ` -* :doc:`Install OpenVINO Runtime using PyPI ` -* :doc:`Install OpenVINO Runtime using Conda Forge ` +* :doc:`Install OpenVINO using an Archive File ` +* :doc:`Install OpenVINO using PyPI ` +* :doc:`Install OpenVINO using Conda Forge ` +* :doc:`Install OpenVINO using vcpkg ` * :doc:`Install OpenVINO using Docker `