[DOCS] release adjustments pass 3 - conversion port to master (#19846)
authored-by: Tatiana Savina <tatiana.savina@intel.com>
parent 53fef5f558
commit 7445a9c77b
@ -17,7 +17,7 @@
Every deep learning workflow begins with obtaining a model. You can choose to prepare a custom one, use a ready-made solution and adjust it to your needs, or even download and run a pre-trained network from an online database, such as `TensorFlow Hub <https://tfhub.dev/>`__, `Hugging Face <https://huggingface.co/>`__, or `Torchvision models <https://pytorch.org/hub/>`__.
OpenVINO™ :doc:`supports several model formats <Supported_Model_Formats>` and can convert them into its own representation, `openvino.Model <api/ie_python_api/_autosummary/openvino.Model.html>`__ (`ov.Model <api/ie_python_api/_autosummary/openvino.runtime.Model.html>`__), providing a conversion API. Converted models can be used for inference with one or multiple OpenVINO Hardware plugins. There are two ways to use the conversion API: using a Python program or calling the ``ovc`` command line tool.
OpenVINO™ :doc:`supports several model formats <Supported_Model_Formats>` and can convert them into its own representation, `openvino.Model <api/ie_python_api/_autosummary/openvino.Model.html>`__ (`ov.Model <api/ie_python_api/_autosummary/openvino.runtime.Model.html>`__), providing a conversion API. Converted models can be used for inference with one or multiple OpenVINO Hardware plugins. There are two ways to use the conversion API: using a Python script or calling the ``ovc`` command line tool.
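For illustration, a minimal sketch of the Python route (all file names here are placeholders):

.. code-block:: py

   import openvino as ov

   # Convert an original model file into an openvino.Model object
   ov_model = ov.convert_model("model.onnx")

   # Save the converted model as OpenVINO IR (model.xml + model.bin) for later use
   ov.save_model(ov_model, "model.xml")

The command-line route is a single call on the same file, such as ``ovc model.onnx``.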
.. note::
@ -31,7 +31,7 @@ OpenVINO™ :doc:`supports several model formats <Supported_Model_Formats>` and
Convert a Model in Python: ``convert_model``
##############################################
You can use the Model conversion API in Python with the ``openvino.convert_model`` function. This function converts a model from its original framework representation, for example Pytorch or TensorFlow, to the object of type ``openvino.Model``. The resulting ``openvino.Model`` can be inferred in the same application (Python script or Jupyter Notebook) or saved into a file using ``openvino.save_model`` for future use. Below, there are examples of how to use the ``openvino.convert_model`` with models from popular public repositories:
You can use the Model conversion API in Python with the ``openvino.convert_model`` function. This function converts a model from its original framework representation, for example PyTorch or TensorFlow, to the object of type ``openvino.Model``. The resulting ``openvino.Model`` can be inferred in the same application (Python script or Jupyter Notebook) or saved into a file using ``openvino.save_model`` for future use. Below, there are examples of how to use the ``openvino.convert_model`` with models from popular public repositories:
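For instance, a converted model can be compiled and run in the same script without being saved first; a minimal sketch, in which the file name, input shape, and ``CPU`` device are assumptions:

.. code-block:: py

   import numpy as np
   import openvino as ov

   # Convert the original model (placeholder file name) to an in-memory openvino.Model
   ov_model = ov.convert_model("model.onnx")

   # Compile the in-memory model for a device and run inference on dummy data
   core = ov.Core()
   compiled_model = core.compile_model(ov_model, "CPU")
   results = compiled_model(np.zeros((1, 3, 224, 224), dtype=np.float32))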
.. tab-set::
@ -57,7 +57,7 @@ Mapping from Framework Operation
Mapping of custom operations is implemented differently, depending on the model format used for import. You may choose one of the following:
1. If a model is represented in the ONNX (including models exported from Pytorch in ONNX), TensorFlow Lite, PaddlePaddle or TensorFlow formats, then one of the classes from :doc:`Frontend Extension API <openvino_docs_Extensibility_UG_Frontend_Extensions>` should be used. It consists of several classes available in C++ which can be used with the ``--extensions`` option in Model Optimizer or when a model is imported directly to OpenVINO runtime using the ``read_model`` method. Python API is also available for runtime model import.
1. If a model is represented in the ONNX (including models exported from PyTorch in ONNX), TensorFlow Lite, PaddlePaddle or TensorFlow formats, then one of the classes from :doc:`Frontend Extension API <openvino_docs_Extensibility_UG_Frontend_Extensions>` should be used. It consists of several classes available in C++ which can be used with the ``--extensions`` option in Model Optimizer or when a model is imported directly to OpenVINO runtime using the ``read_model`` method. Python API is also available for runtime model import.
2. If a model is represented in the Caffe, Kaldi or MXNet formats (as legacy frontends), then :doc:`[Legacy] Model Optimizer Extensions <openvino_docs_MO_DG_prepare_model_customize_model_optimizer_Customize_Model_Optimizer>` should be used. This approach is available for model conversion in Model Optimizer only.
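For option 1 above, a minimal Python sketch of registering a compiled extension library before importing a model directly into OpenVINO runtime might look as follows (the library and model paths are placeholders):

.. code-block:: py

   import openvino as ov

   core = ov.Core()

   # Register a compiled extension library with the custom operation mappings
   core.add_extension("path/to/libcustom_extensions.so")

   # The custom operations are then resolved when the model is read directly
   model = core.read_model("model_with_custom_ops.onnx")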
@ -4,7 +4,7 @@
Model conversion API is represented by the ``convert_model()`` method in the ``openvino.tools.mo`` namespace. ``convert_model()`` is compatible with types from ``openvino.runtime``, like ``PartialShape``, ``Layout``, ``Type``, etc.
``convert_model()`` provides the same conversion capabilities as the command-line tool, plus the ability to pass Python model objects directly, such as a Pytorch model or a TensorFlow Keras model, without saving them into files and without leaving the training environment (Jupyter Notebook or training scripts). In addition to input models consumed directly from Python, ``convert_model`` can take OpenVINO extension objects constructed directly in Python for easier conversion of operations that are not supported in OpenVINO.
``convert_model()`` provides the same conversion capabilities as the command-line tool, plus the ability to pass Python model objects directly, such as a PyTorch model or a TensorFlow Keras model, without saving them into files and without leaving the training environment (Jupyter Notebook or training scripts). In addition to input models consumed directly from Python, ``convert_model`` can take OpenVINO extension objects constructed directly in Python for easier conversion of operations that are not supported in OpenVINO.
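As a rough illustration (not taken from the documentation itself), passing an in-memory model object could look like the following sketch; the toy PyTorch module and the ``example_input`` argument are assumptions:

.. code-block:: py

   import torch
   from openvino.tools.mo import convert_model

   # A framework model object defined in the same Python session (toy stand-in)
   torch_model = torch.nn.Sequential(torch.nn.Conv2d(3, 8, 3), torch.nn.ReLU())

   # Convert the object directly, without exporting it to a file first
   ov_model = convert_model(torch_model, example_input=torch.zeros(1, 3, 224, 224))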
.. note::
@ -6,19 +6,11 @@
:description: Learn how to convert a model from the
ONNX format to the OpenVINO Intermediate Representation.

Introduction to ONNX
####################
`ONNX <https://github.com/onnx/onnx>`__ is a representation format for deep learning models that allows AI developers to easily transfer models between different frameworks. It is hugely popular among deep learning tools, like PyTorch, Caffe2, Apache MXNet, Microsoft Cognitive Toolkit, and many others.
.. note:: ONNX models are supported via the FrontEnd API. You may skip conversion to IR and read models directly with the OpenVINO Runtime API. Refer to the :doc:`inference example <openvino_docs_OV_UG_Integrate_OV_with_your_application>` for more details. Using ``convert_model`` is still necessary in more complex cases, such as new custom inputs/outputs in model pruning, adding pre-processing, or using Python conversion extensions.
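For example, skipping IR conversion can be as simple as the following sketch, where the ONNX file name and the ``CPU`` device are placeholders:

.. code-block:: py

   import openvino as ov

   core = ov.Core()

   # Read the ONNX model directly and compile it, without converting to IR first
   model = core.read_model("model.onnx")
   compiled_model = core.compile_model(model, "CPU")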
Converting an ONNX Model
########################
This page provides instructions on model conversion from the ONNX format to the OpenVINO IR format.
The model conversion process assumes you have an ONNX model that was directly downloaded from a public repository or converted from any framework that supports exporting to the ONNX format.
.. tab-set::
@ -7,10 +7,10 @@
TensorFlow format to the OpenVINO Intermediate Representation.
This page provides general instructions on how to run model conversion from a TensorFlow format to the OpenVINO IR format. The instructions are different depending on whether your model was created with TensorFlow v1.X or TensorFlow v2.X.
.. note:: TensorFlow models are supported via the :doc:`FrontEnd API <openvino_docs_MO_DG_TensorFlow_Frontend>`. You may skip conversion to IR and read models directly with the OpenVINO Runtime API. Refer to the :doc:`inference example <openvino_docs_OV_UG_Integrate_OV_with_your_application>` for more details. Using ``convert_model`` is still necessary in more complex cases, such as new custom inputs/outputs in model pruning, adding pre-processing, or using Python conversion extensions.
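For instance, a TensorFlow model can be read in its original format; a minimal sketch, where the SavedModel directory and the ``CPU`` device are placeholders:

.. code-block:: py

   import openvino as ov

   core = ov.Core()

   # Read a TensorFlow SavedModel directory directly, skipping conversion to IR
   model = core.read_model("path/to/saved_model_dir")
   compiled_model = core.compile_model(model, "CPU")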
The conversion instructions are different depending on whether your model was created with TensorFlow v1.X or TensorFlow v2.X.
Converting TensorFlow 1 Models
###############################
@ -4,7 +4,7 @@
.. meta::
:description: Learn how to convert a BERT-NER model
from Pytorch to the OpenVINO Intermediate Representation.
from PyTorch to the OpenVINO Intermediate Representation.
The goal of this article is to present a step-by-step guide on how to convert a PyTorch BERT-NER model to OpenVINO IR. First, you need to download the model and convert it to ONNX.
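As a rough illustration of those two steps (not the exact commands from this guide), the flow could look like the sketch below; the toy module stands in for the downloaded BERT-NER model and all names are placeholders:

.. code-block:: py

   import torch
   import openvino as ov

   # Toy stand-in for the downloaded BERT-NER PyTorch model
   model = torch.nn.Sequential(torch.nn.Embedding(30522, 64), torch.nn.Linear(64, 9))
   model.eval()

   # Step 1: export the PyTorch model to ONNX with a dummy input of token ids
   dummy_input = torch.zeros(1, 128, dtype=torch.long)
   torch.onnx.export(model, dummy_input, "bert_ner.onnx")

   # Step 2: convert the exported ONNX file to OpenVINO IR
   ov.save_model(ov.convert_model("bert_ner.onnx"), "bert_ner.xml")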
@ -4,7 +4,7 @@
.. meta::
:description: Learn how to convert a Cascade RCNN R-101
model from Pytorch to the OpenVINO Intermediate Representation.
model from PyTorch to the OpenVINO Intermediate Representation.
The goal of this article is to present a step-by-step guide on how to convert a PyTorch Cascade RCNN R-101 model to OpenVINO IR. First, you need to download the model and convert it to ONNX.
@ -4,7 +4,7 @@
.. meta::
:description: Learn how to convert a F3Net model
from Pytorch to the OpenVINO Intermediate Representation.
from PyTorch to the OpenVINO Intermediate Representation.
`F3Net <https://github.com/weijun88/F3Net>`__ : Fusion, Feedback and Focus for Salient Object Detection
@ -4,7 +4,7 @@
.. meta::
:description: Learn how to convert a QuartzNet model
from Pytorch to the OpenVINO Intermediate Representation.
from PyTorch to the OpenVINO Intermediate Representation.
`NeMo project <https://github.com/NVIDIA/NeMo>`__ provides the QuartzNet model.
@ -4,7 +4,7 @@
.. meta::
:description: Learn how to convert a RCAN model
from Pytorch to the OpenVINO Intermediate Representation.
from PyTorch to the OpenVINO Intermediate Representation.
`RCAN <https://github.com/yulunzhang/RCAN>`__ : Image Super-Resolution Using Very Deep Residual Channel Attention Networks
@ -4,7 +4,7 @@
.. meta::
:description: Learn how to convert a RNN-T model
from Pytorch to the OpenVINO Intermediate Representation.
from PyTorch to the OpenVINO Intermediate Representation.
This guide covers conversion of the RNN-T model from the `MLCommons <https://github.com/mlcommons>`__ repository. Follow
@ -4,7 +4,7 @@
.. meta::
:description: Learn how to convert a YOLACT model
from Pytorch to the OpenVINO Intermediate Representation.
from PyTorch to the OpenVINO Intermediate Representation.
You Only Look At CoefficienTs (YOLACT) is a simple, fully convolutional model for real-time instance segmentation.
@ -6,11 +6,8 @@
:description: Learn how to convert a model from the
PyTorch format to the OpenVINO Model.
This page provides instructions on how to convert a model from the PyTorch format to the OpenVINO Model using the ``openvino.convert_model`` function.
.. note::
In the examples below the ``openvino.save_model`` function is not used because there are no PyTorch-specific details regarding the usage of this function. In all examples, the converted OpenVINO model can be saved to IR by calling ``ov.save_model(ov_model, 'model.xml')`` as usual.
To convert a PyTorch model, use the ``openvino.convert_model`` function.
Here is the simplest example of PyTorch model conversion using a model from ``torchvision``:
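A minimal sketch of such a conversion, assuming ``torchvision`` is installed and using ``resnet50`` as an arbitrary choice, might look like this:

.. code-block:: py

   import openvino as ov
   from torchvision.models import resnet50

   # Instantiate a torchvision model with pre-trained weights
   model = resnet50(weights="DEFAULT")
   model.eval()

   # Convert the in-memory PyTorch module to openvino.Model
   ov_model = ov.convert_model(model)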
@ -87,6 +84,10 @@ In practice, the code to evaluate or test the PyTorch model is usually provided
Check out more examples in :doc:`interactive Python tutorials <tutorials>`.
.. note::
In the examples above the ``openvino.save_model`` function is not used because there are no PyTorch-specific details regarding the usage of this function. In all examples, the converted OpenVINO model can be saved to IR by calling ``ov.save_model(ov_model, 'model.xml')`` as usual.
Supported Input Parameter Types
###############################
@ -15,7 +15,7 @@
**OpenVINO IR (Intermediate Representation)** - the proprietary format of OpenVINO™, benefiting from the full extent of its features. The result of running the ``ovc`` CLI tool or ``openvino.save_model`` is OpenVINO IR. All other supported formats can be converted to the IR; refer to the following articles for details on conversion (a minimal loading sketch follows the list):
* :doc:`How to convert Pytorch <openvino_docs_OV_Converter_UG_prepare_model_convert_model_Convert_Model_From_PyTorch>`
* :doc:`How to convert PyTorch <openvino_docs_OV_Converter_UG_prepare_model_convert_model_Convert_Model_From_PyTorch>`
* :doc:`How to convert ONNX <openvino_docs_OV_Converter_UG_prepare_model_convert_model_Convert_Model_From_ONNX>`
* :doc:`How to convert TensorFlow <openvino_docs_OV_Converter_UG_prepare_model_convert_model_Convert_Model_From_TensorFlow>`
* :doc:`How to convert TensorFlow Lite <openvino_docs_OV_Converter_UG_prepare_model_convert_model_Convert_Model_From_TensorFlow_Lite>`
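Once saved as OpenVINO IR, a model can be loaded back without the original framework; a minimal sketch, where the file name and the ``CPU`` device are placeholders:

.. code-block:: py

   import openvino as ov

   # Load and compile a previously saved IR in one step
   # (the accompanying model.bin is expected next to model.xml)
   core = ov.Core()
   compiled_model = core.compile_model("model.xml", "CPU")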