diff --git a/docs/Documentation/model_introduction.md b/docs/Documentation/model_introduction.md
index 26038599b83..c4de8fb4820 100644
--- a/docs/Documentation/model_introduction.md
+++ b/docs/Documentation/model_introduction.md
@@ -13,7 +13,6 @@
Supported_Model_Formats
openvino_docs_OV_Converter_UG_Conversion_Options
openvino_docs_OV_Converter_UG_prepare_model_convert_model_Converting_Model
- openvino_docs_OV_Converter_UG_prepare_model_convert_model_MO_OVC_transition
Every deep learning workflow begins with obtaining a model. You can choose to prepare a custom one, use a ready-made solution and adjust it to your needs, or even download and run a pre-trained network from an online database, such as `TensorFlow Hub `__, `Hugging Face `__, or `Torchvision models `__.
@@ -21,16 +20,18 @@ Every deep learning workflow begins with obtaining a model. You can choose to pr
OpenVINO™ :doc:`supports several model formats ` and can convert them into its own representation, `openvino.Model `__ (`ov.Model `__), providing a conversion API. Converted models can be used for inference with one or multiple OpenVINO Hardware plugins. There are two ways to use the conversion API: using a Python program or calling the ``ovc`` command line tool.
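+
+The snippet below is a minimal sketch of the Python entry point, assuming an ONNX file named
+``model.onnx`` is available locally (the file name is only an illustration); the equivalent
+command line call would be ``ovc model.onnx``.
+
+.. code-block:: py
+
+   import openvino as ov
+
+   # Convert the original model file into an in-memory openvino.Model object.
+   ov_model = ov.convert_model("model.onnx")
+
+   # Optionally serialize it to OpenVINO IR (model.xml + model.bin) for later use.
+   ov.save_model(ov_model, "model.xml")
+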
.. note::
+
+ Prior to OpenVINO 2023.1, model conversion API was exposed as the ``openvino.tools.mo.convert_model``
+ function and the ``mo`` command line tool. Now, a new and simplified API is used: the
+ ``openvino.convert_model`` function and the ``ovc`` command line tool.
- Prior OpenVINO 2023.1 release, model conversion API was exposed as ``openvino.tools.mo.convert_model`` function and ``mo`` command line tool.
- Starting from 2023.1 release, a new simplified API was introduced: ``openvino.convert_model`` function and ``ovc`` command line tool as a replacement for ``openvino.tools.mo.convert_model``
- and ``mo`` correspondingly, which are considered to be legacy now. All new users are recommended to use these new methods instead of the old methods. Please note that the new API and old API do not
- provide the same level of features, that means the new tools are not always backward compatible with the old ones. Please consult with :doc:`Model Conversion API Transition Guide `.
+ It is recommended that all new projects use the new tools, keeping in mind that they are not
+ fully backwards compatible. For more details, consult the :doc:`Model Conversion API Transition Guide `.
Convert a Model in Python: ``convert_model``
-############################################
+##############################################
-You can use Model conversion API in Python with the ``openvino.convert_model`` function. This function converts a model from its original framework representation, for example Pytorch or TensorFlow, to the object of type ``openvino.Model``. The resulting ``openvino.Model`` can be inferred in the same application (Python script or Jupyter Notebook) or saved into a file using``openvino.save_model`` for future use. Below, there are examples on how to use the ``openvino.convert_model`` with models from popular public repositories:
+You can use the model conversion API in Python with the ``openvino.convert_model`` function. This function converts a model from its original framework representation, for example PyTorch or TensorFlow, to an object of type ``openvino.Model``. The resulting ``openvino.Model`` can be used for inference in the same application (Python script or Jupyter Notebook) or saved into a file using ``openvino.save_model`` for future use. Below, there are examples of how to use ``openvino.convert_model`` with models from popular public repositories:
.. tab-set::
@@ -188,7 +189,7 @@ Option 2, where ``openvino.compile_model`` is used, provides a convenient way to
Option 1 separates model conversion and model inference into two different applications. This approach is useful for deployment scenarios requiring fewer extra dependencies and faster model loading in the end inference application.
-For example, converting a PyTorch model to OpenVINO usually demands the ``torch`` Python module and Python. This process can take extra time and memory. But, after the converted model is saved as IR with ``openvino.save_model``, it can be loaded in a separate application without requiring the ``torch`` dependency and the time-consuming conversion. The inference application can be written in other languages supported by OpenVINO, for example, in C++, and Python installation is not necessary for it to run.
+For example, converting a PyTorch model to OpenVINO usually requires the ``torch`` Python module and a Python environment. This process can take extra time and memory. However, once the converted model is saved as OpenVINO IR with ``openvino.save_model``, it can be loaded in a separate application without requiring the ``torch`` dependency and without repeating the time-consuming conversion. The inference application can be written in other languages supported by OpenVINO, for example, in C++, and does not need a Python installation to run.
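+
+As a sketch of this split (the file names and the torchvision model used here are only
+illustrations, not part of the documented flow), the two applications could look like this:
+
+.. code-block:: py
+
+   # Application 1: conversion script - run once, requires torch and torchvision.
+   import torch
+   import torchvision
+   import openvino as ov
+
+   pt_model = torchvision.models.resnet50(weights="DEFAULT")
+   ov_model = ov.convert_model(pt_model, example_input=torch.rand(1, 3, 224, 224))
+   ov.save_model(ov_model, "resnet50.xml")
+
+   # Application 2 (a separate script, or a C++ program): loads the saved IR
+   # without any torch dependency.
+   core = ov.Core()
+   compiled_model = core.compile_model("resnet50.xml", "CPU")
+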
Before saving the model to OpenVINO IR, consider applying :doc:`Post-training Optimization ` to enable more efficient inference and smaller model size.
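+
+As an illustration only, a basic post-training quantization pass with NNCF might look like the
+sketch below; it assumes the IR has a single image-like input and uses random data purely as a
+stand-in for a real calibration dataset.
+
+.. code-block:: py
+
+   import numpy as np
+   import nncf
+   import openvino as ov
+
+   ov_model = ov.Core().read_model("model.xml")
+
+   # A small representative dataset is enough for post-training quantization;
+   # replace the random tensors with real samples from your data pipeline.
+   calibration_items = [np.random.rand(1, 3, 224, 224).astype(np.float32) for _ in range(10)]
+   calibration_dataset = nncf.Dataset(calibration_items)
+
+   quantized_model = nncf.quantize(ov_model, calibration_dataset)
+   ov.save_model(quantized_model, "model_int8.xml")
+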
@@ -232,4 +233,10 @@ If you are using legacy conversion API (``mo`` or ``openvino.tools.mo.convert_mo
* :doc:`Transition from legacy mo and ov.tools.mo.convert_model `
* :doc:`Legacy Model Conversion API `
+
+
+
+.. api/ie_python_api/_autosummary/openvino.Model.html is a broken link for some reason - need to investigate python api article generation
+
+
@endsphinxdirective
diff --git a/docs/Documentation/openvino_legacy_features.md b/docs/Documentation/openvino_legacy_features.md
index c034dd31d29..dad31043b8b 100644
--- a/docs/Documentation/openvino_legacy_features.md
+++ b/docs/Documentation/openvino_legacy_features.md
@@ -7,6 +7,7 @@
:hidden:
OpenVINO Development Tools package
+ Model Optimizer / Conversion API
OpenVINO API 2.0 transition
Open Model ZOO
Apache MXNet, Caffe, and Kaldi
@@ -36,16 +37,17 @@ offering.
| :doc:`See how to install Development Tools `
-| **Model Optimizer**
+| **Model Optimizer / Conversion API**
| *New solution:* Direct model support and OpenVINO Converter (OVC)
-| *Old solution:* Model Optimizer discontinuation planned for OpenVINO 2025.0
+| *Old solution:* Legacy Conversion API discontinuation planned for OpenVINO 2025.0
|
-| Model Optimizer's role was largely reduced when all major model frameworks became
- supported directly. For the sole purpose of converting model files explicitly,
- it has been replaced with a more light-weight and efficient solution, the
- OpenVINO Converter (launched with OpenVINO 2023.1).
+| The role of Model Optimizer and, later, the Conversion API was largely reduced
+  when all major model frameworks became supported directly. For converting model
+  files explicitly, it has been replaced with a more lightweight and efficient
+  solution, the OpenVINO Converter (launched with OpenVINO 2023.1).
-.. :doc:`See how to use OVC ????????>`
+| :doc:`See how to use OVC `
+| :doc:`See how to transition from the legacy solution `
| **Open Model ZOO**
diff --git a/docs/Extensibility_UG/Intro.md b/docs/Extensibility_UG/Intro.md
index 319a415403e..401e2f155e4 100644
--- a/docs/Extensibility_UG/Intro.md
+++ b/docs/Extensibility_UG/Intro.md
@@ -22,11 +22,6 @@
openvino_docs_transformations
OpenVINO Plugin Developer Guide
-.. toctree::
- :maxdepth: 1
- :hidden:
-
- openvino_docs_MO_DG_prepare_model_customize_model_optimizer_Customize_Model_Optimizer
The Intel® Distribution of OpenVINO™ toolkit supports neural-network models trained with various frameworks, including
TensorFlow, PyTorch, ONNX, TensorFlow Lite, and PaddlePaddle (OpenVINO support for Apache MXNet, Caffe, and Kaldi is currently
diff --git a/docs/MO_DG/prepare_model/customize_model_optimizer/Customize_Model_Optimizer.md b/docs/MO_DG/prepare_model/customize_model_optimizer/Customize_Model_Optimizer.md
index b96c23beed1..3d792f6c394 100644
--- a/docs/MO_DG/prepare_model/customize_model_optimizer/Customize_Model_Optimizer.md
+++ b/docs/MO_DG/prepare_model/customize_model_optimizer/Customize_Model_Optimizer.md
@@ -1,4 +1,4 @@
-# [LEGACY] Model Optimizer Extensibility {#openvino_docs_MO_DG_prepare_model_customize_model_optimizer_Customize_Model_Optimizer}
+# Legacy Model Optimizer Extensibility {#openvino_docs_MO_DG_prepare_model_customize_model_optimizer_Customize_Model_Optimizer}
@sphinxdirective
diff --git a/docs/OV_Converter_UG/prepare_model/convert_model/MO_OVC_transition.md b/docs/OV_Converter_UG/prepare_model/convert_model/MO_OVC_transition.md
index e550d515b75..9de12249a34 100644
--- a/docs/OV_Converter_UG/prepare_model/convert_model/MO_OVC_transition.md
+++ b/docs/OV_Converter_UG/prepare_model/convert_model/MO_OVC_transition.md
@@ -10,6 +10,7 @@
:hidden:
openvino_docs_MO_DG_Deep_Learning_Model_Optimizer_DevGuide
+ openvino_docs_MO_DG_prepare_model_customize_model_optimizer_Customize_Model_Optimizer
In 2023.1 OpenVINO release a new OVC (OpenVINO Model Converter) tool has been introduced with the corresponding Python API: ``openvino.convert_model`` method. ``ovc`` and ``openvino.convert_model`` represent
a lightweight alternative of ``mo`` and ``openvino.tools.mo.convert_model`` which are considered legacy API now. In this article, all the differences between ``mo`` and ``ovc`` are summarized and the transition guide from the legacy API to the new API is provided.
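+
+For a quick orientation, the same conversion expressed with the legacy API and with the new API
+might look like the sketch below (``model.onnx`` is only an illustrative input). On the command
+line, ``mo --input_model model.onnx`` is replaced by ``ovc model.onnx``.
+
+.. code-block:: py
+
+   # Legacy API, considered deprecated.
+   from openvino.tools.mo import convert_model
+   ov_model = convert_model("model.onnx")
+
+   # New API, recommended for all new code.
+   import openvino as ov
+   ov_model = ov.convert_model("model.onnx")
+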
diff --git a/docs/home.rst b/docs/home.rst
index 92079e97058..d8f359e65aa 100644
--- a/docs/home.rst
+++ b/docs/home.rst
@@ -14,7 +14,7 @@ OpenVINO 2023.0
.. container::
:name: ov-homepage-banner
- OpenVINO 2023.0
+ OpenVINO 2023.1
.. raw:: html