[DOCS] feature transition section (#19506)

* [DOCS] legacy features section

* pass 2 of extensions

* Apply suggestions from code review

---------

Co-authored-by: Tatiana Savina <tatiana.savina@intel.com>
Karol Blaszczak 2023-09-09 20:30:51 +02:00 committed by GitHub
parent 8eb165021c
commit 932ba63744
35 changed files with 287 additions and 184 deletions

View File

@ -14,7 +14,6 @@
Interactive Tutorials (Python) <tutorials>
Sample Applications (Python & C++) <openvino_docs_OV_UG_Samples_Overview>
OpenVINO API 2.0 Transition <openvino_2_0_transition_guide>
This section will help you get hands-on experience with OpenVINO even if you are just starting

View File

@ -13,7 +13,6 @@
Supported_Model_Formats
openvino_docs_MO_DG_Deep_Learning_Model_Optimizer_DevGuide
omz_tools_downloader
Every deep learning workflow begins with obtaining a model. You can choose to prepare a custom one, use a ready-made solution and adjust it to your needs, or even download and run a pre-trained network from an online database, such as `TensorFlow Hub <https://tfhub.dev/>`__, `Hugging Face <https://huggingface.co/>`__, or `Torchvision models <https://pytorch.org/hub/>`__.

View File

@ -3,7 +3,7 @@
@sphinxdirective
.. meta::
:description: OpenVINO™ is an ecosystem of utilities that have advanced capabilities, which help develop deep learning solutions.
:description: OpenVINO™ ecosystem offers various resources for developing deep learning solutions.
.. toctree::
@ -13,7 +13,6 @@
ote_documentation
datumaro_documentation
ovsa_get_started
openvino_docs_tuning_utilities
OpenVINO™ is not just one tool. It is an expansive ecosystem of utilities, providing a comprehensive workflow for deep learning solution development. Learn more about each of them to reach the full potential of OpenVINO™ Toolkit.
@ -60,39 +59,6 @@ More resources:
* `GitHub <https://github.com/openvinotoolkit/datumaro>`__
* `Documentation <https://openvinotoolkit.github.io/datumaro/stable/docs/get-started/introduction.html>`__
**Compile Tool**
Compile tool is now deprecated. If you need to compile a model for inference on a specific device, use the following script:
.. tab-set::

   .. tab-item:: Python
      :sync: py

      .. doxygensnippet:: docs/snippets/export_compiled_model.py
         :language: python
         :fragment: [export_compiled_model]

   .. tab-item:: C++
      :sync: cpp

      .. doxygensnippet:: docs/snippets/export_compiled_model.cpp
         :language: cpp
         :fragment: [export_compiled_model]
To learn which device supports the import / export functionality, see the :doc:`feature support matrix <openvino_docs_OV_UG_Working_with_devices>`.
For more details on preprocessing steps, refer to the :doc:`Optimize Preprocessing <openvino_docs_OV_UG_Preprocessing_Overview>`. To compile the model with advanced preprocessing capabilities, refer to the :doc:`Use Case - Integrate and Save Preprocessing Steps Into OpenVINO IR <openvino_docs_OV_UG_Preprocess_Usecase_save>`, which shows how to have all the preprocessing in the compiled blob.
**DL Workbench**
A web-based tool for deploying deep learning models. Built on the core of OpenVINO and equipped with a graphical user interface, DL Workbench is a great way to explore the possibilities of the OpenVINO workflow, and to import, analyze, optimize, and build your pre-trained models. You can do all that by visiting `Intel® Developer Cloud <https://software.intel.com/content/www/us/en/develop/tools/devcloud.html>`__ and launching DL Workbench online.
**OpenVINO™ integration with TensorFlow (OVTF)**
OpenVINO™ Integration with TensorFlow will no longer be supported as of OpenVINO release 2023.0. As part of the 2023.0 release, OpenVINO will feature a significantly enhanced TensorFlow user experience within native OpenVINO without needing offline model conversions. :doc:`Learn more <openvino_docs_MO_DG_TensorFlow_Frontend>`.
@endsphinxdirective

View File

@ -0,0 +1,139 @@
# Legacy Features and Components {#openvino_legacy_features}
@sphinxdirective
.. toctree::
:maxdepth: 1
:hidden:
OpenVINO Development Tools package <openvino_docs_install_guides_install_dev_tools>
OpenVINO API 2.0 transition <openvino_2_0_transition_guide>
Open Model ZOO <model_zoo>
Apache MXNet, Caffe, and Kaldi <mxnet_caffe_kaldi>
Post-training Optimization Tool <pot_introduction>
Since OpenVINO has grown rapidly in recent years, some of its features
and components have been replaced by other solutions. Some of them are still
supported to ensure that OpenVINO users have enough time to adjust their projects
before the features are fully discontinued.
This section gives you an overview of these major changes and explains how to
proceed to get the best experience and results with the current OpenVINO
offering.
| **OpenVINO Development Tools Package**
| *New solution:* OpenVINO Runtime includes all supported components
| *Old solution:* discontinuation planned for OpenVINO 2025.0
|
| OpenVINO Development Tools used to be the OpenVINO package with tools for
advanced operations on models, such as Model conversion API, Benchmark Tool,
Accuracy Checker, Annotation Converter, Post-Training Optimization Tool,
and Open Model Zoo tools. Most of these tools have been either removed,
replaced by other solutions, or moved to the OpenVINO Runtime package.
| :doc:`See how to install Development Tools <openvino_docs_install_guides_install_dev_tools>`
| **Model Optimizer**
| *New solution:* Direct model support and OpenVINO Converter (OVC)
| *Old solution:* Model Optimizer discontinuation planned for OpenVINO 2025.0
|
| Model Optimizer's role was largely reduced when all major model frameworks became
supported directly. For the sole purpose of converting model files explicitly,
it has been replaced with a more lightweight and efficient solution, the
OpenVINO Converter (launched with OpenVINO 2023.1).
.. :doc:`See how to use OVC <?????????>`
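For illustration, a minimal conversion with the new OVC command line could look like the following sketch (``my_model.onnx`` is a placeholder for your model file; OpenVINO 2023.1 or later is assumed):

```shell
# Convert a model file to OpenVINO IR (an .xml/.bin pair) with the
# OpenVINO Converter CLI; by default the IR is written next to the input.
ovc my_model.onnx
```

The same conversion is also available from Python through ``ov.convert_model()`` and ``ov.save_model()``.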
| **Open Model ZOO**
| *New solution:* users are encouraged to use public model repositories
| *Old solution:* discontinuation planned for OpenVINO 2024.0
|
| Open Model ZOO provided a collection of models prepared for use with OpenVINO,
and a small set of tools enabling a level of automation for the process.
Since the tools have been mostly replaced by other solutions and several
other model repositories have recently grown in size and popularity,
Open Model ZOO will no longer be maintained. You may still use its resources
until they are fully removed.
| :doc:`See the Open Model ZOO documentation <model_zoo>`
| `Check the OMZ GitHub project <https://github.com/openvinotoolkit/open_model_zoo>`__
| **Apache MXNet, Caffe, and Kaldi model formats**
| *New solution:* conversion to ONNX via external tools
| *Old solution:* model support will be discontinued with OpenVINO 2024.0
|
| Since these three model formats proved to be far less popular among OpenVINO users
than the remaining ones, their support is being discontinued. Converting them to the
ONNX format is a possible way of retaining them in an OpenVINO-based pipeline.
| :doc:`See the previous conversion instructions <mxnet_caffe_kaldi>`
| :doc:`See the currently supported frameworks <Supported_Model_Formats>`
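As a sketch of the ONNX route for MXNet (assuming MXNet 1.9+, which bundles the ``mx2onnx`` exporter; all file names below are placeholders):

```python
# Hypothetical sketch: export a trained MXNet model to ONNX so the
# resulting file can be used with OpenVINO directly.
import numpy as np
import mxnet as mx

mx.onnx.export_model(
    "net-symbol.json",       # symbol file saved by MXNet (placeholder)
    "net-0000.params",       # parameters file (placeholder)
    [(1, 3, 224, 224)],      # input shape(s)
    np.float32,              # input dtype(s)
    "net.onnx",              # output ONNX file
)
```

For Caffe and Kaldi, third-party converters exist; check their documentation for an equivalent workflow.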
| **Post-training Optimization Tool (POT)**
| *New solution:* NNCF extended in OpenVINO 2023.0
| *Old solution:* POT discontinuation planned for 2024
|
| Neural Network Compression Framework (NNCF) now offers the same functionality as POT,
on top of its original feature set. It is currently the default tool for both
post-training and training-time optimization, while POT is considered deprecated.
| :doc:`See the deprecated POT documentation <pot_introduction>`
| :doc:`See how to use NNCF for model optimization <openvino_docs_model_optimization_guide>`
| `Check the NNCF GitHub project, including documentation <https://github.com/openvinotoolkit/nncf>`__
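As an illustration, a minimal post-training quantization with NNCF might look like the following sketch (the model path and the random calibration data are placeholders; ``nncf`` and ``openvino`` 2023.x are assumed to be installed):

```python
import numpy as np
import nncf                # Neural Network Compression Framework
import openvino as ov

core = ov.Core()
model = core.read_model("model.xml")   # placeholder IR path

# A calibration dataset: any iterable plus a transform to model inputs;
# random data here only keeps the sketch self-contained.
items = [np.random.rand(1, 3, 224, 224).astype(np.float32) for _ in range(8)]
calibration = nncf.Dataset(items, lambda x: x)

quantized = nncf.quantize(model, calibration)   # post-training INT8
ov.save_model(quantized, "model_int8.xml")
```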
| **Old Inference API 1.0**
| *New solution:* API 2.0 launched in OpenVINO 2022.1
| *Old solution:* discontinuation planned for OpenVINO 2024.0
|
| API 1.0 (Inference Engine and nGraph) is now deprecated. It can still be
used but is not recommended. Its discontinuation is planned for 2024.
| :doc:`See how to transition to API 2.0 <openvino_2_0_transition_guide>`
| **Compile tool**
| *New solution:* the tool is no longer needed
| *Old solution:* deprecated in OpenVINO 2023.0
|
| Compile tool is now deprecated. If you need to compile a model for inference on
a specific device, use the following script:
.. tab-set::

   .. tab-item:: Python
      :sync: py

      .. doxygensnippet:: docs/snippets/export_compiled_model.py
         :language: python
         :fragment: [export_compiled_model]

   .. tab-item:: C++
      :sync: cpp

      .. doxygensnippet:: docs/snippets/export_compiled_model.cpp
         :language: cpp
         :fragment: [export_compiled_model]
| :doc:`See which devices support import / export <openvino_docs_OV_UG_Working_with_devices>`
| :doc:`Learn more on preprocessing steps <openvino_docs_OV_UG_Preprocessing_Overview>`
| :doc:`See how to integrate and save preprocessing steps into OpenVINO IR <openvino_docs_OV_UG_Preprocess_Usecase_save>`
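For reference, a hypothetical sketch of the flow these snippets implement, using the Python API (the model path is a placeholder):

```python
import openvino as ov

core = ov.Core()
compiled = core.compile_model("model.xml", "CPU")   # placeholder IR path

# Serialize the compiled model to a blob, replacing the old compile tool.
with open("model.blob", "wb") as f:
    f.write(compiled.export_model())

# On the target machine, import the blob instead of recompiling.
with open("model.blob", "rb") as f:
    restored = core.import_model(f.read(), "CPU")
```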
| **DL Workbench**
| *New solution:* DevCloud version
| *Old solution:* local distribution discontinued in OpenVINO 2022.3
|
| The stand-alone version of DL Workbench, a GUI tool for previewing and benchmarking
deep learning models, has been discontinued. You can use its cloud version:
| `Intel® Developer Cloud for the Edge <https://www.intel.com/content/www/us/en/developer/tools/devcloud/edge/overview.html>`__.
| **OpenVINO™ integration with TensorFlow (OVTF)**
| *New solution:* Direct model support and OpenVINO Converter (OVC)
| *Old solution:* discontinued in OpenVINO 2023.0
|
| OpenVINO™ Integration with TensorFlow is no longer supported, as OpenVINO now features
native TensorFlow support, significantly enhancing the user experience with no need for
explicit model conversion.
| :doc:`Learn more <openvino_docs_MO_DG_TensorFlow_Frontend>`
@endsphinxdirective

View File

@ -0,0 +1,31 @@
# Apache MXNet, Caffe, and Kaldi model formats {#mxnet_caffe_kaldi}
@sphinxdirective
.. toctree::
:maxdepth: 1
:hidden:
openvino_docs_MO_DG_prepare_model_convert_model_Convert_Model_From_MxNet
openvino_docs_MO_DG_prepare_model_convert_model_Convert_Model_From_Caffe
openvino_docs_MO_DG_prepare_model_convert_model_Convert_Model_From_Kaldi
openvino_docs_MO_DG_prepare_model_convert_model_mxnet_specific_Convert_GluonCV_Models
openvino_docs_MO_DG_prepare_model_convert_model_mxnet_specific_Convert_Style_Transfer_From_MXNet
openvino_docs_MO_DG_prepare_model_convert_model_kaldi_specific_Aspire_Tdnn_Model
The following articles present the deprecated conversion methods for the Apache MXNet,
Caffe, and Kaldi model formats.
:doc:`Apache MXNet conversion <openvino_docs_MO_DG_prepare_model_convert_model_Convert_Model_From_MxNet>`
:doc:`Caffe conversion <openvino_docs_MO_DG_prepare_model_convert_model_Convert_Model_From_Caffe>`
:doc:`Kaldi conversion <openvino_docs_MO_DG_prepare_model_convert_model_Convert_Model_From_Kaldi>`
Here are three examples of conversion for particular models.
:doc:`MXNet GluonCV conversion <openvino_docs_MO_DG_prepare_model_convert_model_mxnet_specific_Convert_GluonCV_Models>`
:doc:`MXNet Style Transfer Model conversion <openvino_docs_MO_DG_prepare_model_convert_model_mxnet_specific_Convert_Style_Transfer_From_MXNet>`
:doc:`Kaldi ASpIRE Chain TDNN Model conversion <openvino_docs_MO_DG_prepare_model_convert_model_kaldi_specific_Aspire_Tdnn_Model>`
@endsphinxdirective

View File

@ -31,25 +31,26 @@
openvino_docs_MO_DG_prepare_model_convert_model_pytorch_specific_Convert_RCAN
openvino_docs_MO_DG_prepare_model_convert_model_pytorch_specific_Convert_RNNT
openvino_docs_MO_DG_prepare_model_convert_model_pytorch_specific_Convert_YOLACT
openvino_docs_MO_DG_prepare_model_convert_model_mxnet_specific_Convert_GluonCV_Models
openvino_docs_MO_DG_prepare_model_convert_model_mxnet_specific_Convert_Style_Transfer_From_MXNet
openvino_docs_MO_DG_prepare_model_convert_model_kaldi_specific_Aspire_Tdnn_Model
.. meta::
:description: Get to know conversion methods for specific TensorFlow, ONNX, PyTorch, MXNet, and Kaldi models.
This section provides a set of tutorials that demonstrate conversion methods for specific
TensorFlow, ONNX, PyTorch, MXNet, and Kaldi models, which does not necessarily cover your case.
TensorFlow, ONNX, and PyTorch models. Note that these instructions do not cover all use
cases and may not reflect your particular needs.
Before studying the tutorials, try to convert the model out-of-the-box by specifying only the
``--input_model`` parameter in the command line.
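For example, an out-of-the-box conversion with the legacy Model Optimizer (the file name is a placeholder):

```shell
# Out-of-the-box conversion: mo infers the framework from the file and
# writes model.xml / model.bin to the current directory.
mo --input_model model.onnx
```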
.. warning::
Note that OpenVINO support for Apache MXNet, Caffe, and Kaldi is currently being deprecated and will be removed entirely in the future.
.. note::
Apache MXNet, Caffe, and Kaldi are no longer directly supported by OpenVINO.
They will remain available for some time, so make sure to transition to other
frameworks before they are fully discontinued.
You will find a collection of :doc:`Python tutorials <tutorials>` written for running on Jupyter notebooks
that provide an introduction to the OpenVINO™ toolkit and explain how to use the Python API and tools for
optimized deep learning inference.
@endsphinxdirective
@endsphinxdirective

View File

@ -11,9 +11,6 @@
openvino_docs_MO_DG_prepare_model_convert_model_Convert_Model_From_PyTorch
openvino_docs_MO_DG_prepare_model_convert_model_Convert_Model_From_TensorFlow_Lite
openvino_docs_MO_DG_prepare_model_convert_model_Convert_Model_From_Paddle
openvino_docs_MO_DG_prepare_model_convert_model_Convert_Model_From_MxNet
openvino_docs_MO_DG_prepare_model_convert_model_Convert_Model_From_Caffe
openvino_docs_MO_DG_prepare_model_convert_model_Convert_Model_From_Kaldi
openvino_docs_MO_DG_prepare_model_convert_model_tutorials
.. meta::

View File

@ -24,7 +24,7 @@ process even simpler.
These instructions are largely deprecated and should be used for versions prior to 2023.1.
OpenVINO Development Tools is being deprecated and will be discontinued entirely in 2025.
The OpenVINO Development Tools package is being deprecated and will be discontinued entirely in 2025.
With this change, the OpenVINO Runtime package has become the default choice for installing the
software. It now includes all components necessary to utilize OpenVINO's functionality.

View File

@ -12,9 +12,10 @@
:hidden:
API Reference <api/api_reference>
OpenVINO IR format and Operation Sets <openvino_ir>
Tool Ecosystem <openvino_ecosystem>
Legacy Features <openvino_legacy_features>
OpenVINO Extensibility <openvino_docs_Extensibility_UG_Intro>
OpenVINO IR format and Operation Sets <openvino_ir>
Media Processing and CV Libraries <media_processing_cv_libraries>
OpenVINO™ Security <openvino_docs_security_guide_introduction>

View File

@ -131,7 +131,6 @@ Feature Overview
LEARN OPENVINO <learn_openvino>
OPENVINO WORKFLOW <openvino_workflow>
DOCUMENTATION <documentation>
MODEL ZOO <model_zoo>
RESOURCES <resources>
RELEASE NOTES <release_notes>

View File

@ -14,7 +14,7 @@ OpenVINO Development Tools is a set of utilities that make it easy to develop an
* Post-Training Optimization Tool
* Model Downloader and other Open Model Zoo tools
The instructions on this page show how to install OpenVINO Development Tools. If you are a Python developer, it only takes a few simple steps to install the tools with PyPI. If you are developing in C++, OpenVINO Runtime must be installed separately before installing OpenVINO Development Tools.
The instructions on this page show how to install OpenVINO Development Tools. If you are a Python developer, it only takes a few simple steps to install the tools with PyPI. If you are developing in C/C++, OpenVINO Runtime must be installed separately before installing OpenVINO Development Tools.
In both cases, Python 3.7 - 3.11 needs to be installed on your machine before starting.
@ -31,10 +31,10 @@ If you are a Python developer, follow the steps in the :ref:`Installing OpenVINO
.. _cpp_developers:
For C++ Developers
##################
For C/C++ Developers
#######################
If you are a C++ developer, you must first install OpenVINO Runtime separately to set up the C++ libraries, sample code, and dependencies for building applications with OpenVINO. These files are not included with the PyPI distribution. See the :doc:`Selector Tool <openvino_docs_install_guides_overview>` page to install OpenVINO Runtime from an archive file for your operating system.
If you are a C/C++ developer, you must first install OpenVINO Runtime separately to set up the C/C++ libraries, sample code, and dependencies for building applications with OpenVINO. These files are not included with the PyPI distribution. See the :doc:`Selector Tool <openvino_docs_install_guides_overview>` page to install OpenVINO Runtime from an archive file for your operating system.
Once OpenVINO Runtime is installed, you may install OpenVINO Development Tools for access to tools like ``mo``, Model Downloader, Benchmark Tool, and other utilities that will help you optimize your model and develop your application. Follow the steps in the :ref:`Installing OpenVINO Development Tools <install_dev_tools>` section on this page to install it.
@ -162,7 +162,7 @@ To verify the package is properly installed, run the command below (this may tak
You will see the help message for ``mo`` if installation finished successfully. If you get an error, refer to the :doc:`Troubleshooting Guide <openvino_docs_get_started_guide_troubleshooting>` for possible solutions.
Congratulations! You finished installing OpenVINO Development Tools with C++ capability. Now you can start exploring OpenVINO's functionality through example C++ applications. See the "What's Next?" section to learn more!
Congratulations! You finished installing OpenVINO Development Tools with C/C++ capability. Now you can start exploring OpenVINO's functionality through example C/C++ applications. See the "What's Next?" section to learn more!
What's Next?
############

View File

@ -72,7 +72,7 @@ Intel® GNA driver for Windows is available through Windows Update.
What's Next?
####################
Now you are ready to try out OpenVINO™. You can use the following tutorials to write your applications using Python and C++.
Now you are ready to try out OpenVINO™. You can use the following tutorials to write your applications using Python and C/C++.
* Developing in Python:
@ -80,7 +80,7 @@ Now you are ready to try out OpenVINO™. You can use the following tutorials to
* `Start with ONNX and PyTorch models with OpenVINO™ <notebooks/102-pytorch-onnx-to-openvino-with-output.html>`__
* `Start with PaddlePaddle models with OpenVINO™ <notebooks/103-paddle-to-openvino-classification-with-output.html>`__
* Developing in C++:
* Developing in C/C++:
* :doc:`Image Classification Async C++ Sample <openvino_inference_engine_samples_classification_sample_async_README>`
* :doc:`Hello Classification C++ Sample <openvino_inference_engine_samples_hello_classification_README>`

View File

@ -53,7 +53,7 @@ Intel® NPU driver for Windows is available through Windows Update.
What's Next?
####################
Now you are ready to try out OpenVINO™. You can use the following tutorials to write your applications using Python and C++.
Now you are ready to try out OpenVINO™. You can use the following tutorials to write your applications using Python and C/C++.
* Developing in Python:
@ -61,7 +61,7 @@ Now you are ready to try out OpenVINO™. You can use the following tutorials to
* `Start with ONNX and PyTorch models with OpenVINO™ <notebooks/102-pytorch-onnx-to-openvino-with-output.html>`__
* `Start with PaddlePaddle models with OpenVINO™ <notebooks/103-paddle-to-openvino-classification-with-output.html>`__
* Developing in C++:
* Developing in C/C++:
* :doc:`Image Classification Async C++ Sample <openvino_inference_engine_samples_classification_sample_async_README>`
* :doc:`Hello Classification C++ Sample <openvino_inference_engine_samples_hello_classification_README>`

View File

@ -17,26 +17,34 @@
For GNA <openvino_docs_install_guides_configurations_for_intel_gna>
For certain use cases, you may need to install additional software, to get the full
potential of OpenVINO™. Check the following list for components pertaining to your
workflow:
| **Open Computer Vision Library**
| OpenCV is used to extend the capabilities of some models, for example to enhance some of the
OpenVINO samples, when used as a dependency in compilation. To install OpenCV for OpenVINO, see the
`instructions on GitHub <https://github.com/opencv/opencv/wiki/BuildOpenCV4OpenVINO>`__.
For certain use cases, you may need to install additional software to use the full
potential of OpenVINO™. Check the following list for the components used in your
workflow:
| **GPU drivers**
| If you want to run inference on a GPU, make sure your GPU's drivers are properly installed.
See the :doc:`guide on GPU configuration <openvino_docs_install_guides_configurations_for_intel_gpu>`
for details.
| **NPU drivers**
| Intel's Neural Processing Unit, introduced with the Intel® Core™ Ultra generation of CPUs
(code-named Meteor Lake), is a low-power solution for offloading neural network computation.
If you want to run inference on an NPU, make sure your NPU's drivers are properly installed.
See the :doc:`guide on NPU configuration <openvino_docs_install_guides_configurations_for_intel_npu>`
for details.
| **GNA drivers**
| If you want to run inference on a GNA (note that it is currently being deprecated and will no longer
be supported beyond 2023.2), make sure your GNA drivers are properly installed. See the
:doc:`guide on GNA configuration <openvino_docs_install_guides_configurations_for_intel_gna>`
for details.
| **Open Computer Vision Library**
| OpenCV is used to extend the capabilities of some models, for example to enhance some of the
OpenVINO samples, when used as a dependency in compilation. To install OpenCV for OpenVINO, see the
`instructions on GitHub <https://github.com/opencv/opencv/wiki/BuildOpenCV4OpenVINO>`__.
@endsphinxdirective

View File

@ -10,7 +10,7 @@
Note that the APT distribution:
* offers both C++ and Python APIs
* offers both C/C++ and Python APIs
* does not offer support for GNA and NPU inference
* additionally includes code samples
* is dedicated to Linux users.

View File

@ -10,9 +10,9 @@
Note that the `Homebrew <https://brew.sh/>`__ distribution:
* offers both C++ and Python APIs
* offers both C/C++ and Python APIs
* does not offer support for GNA and NPU inference
* is dedicated to macOS users.
* is dedicated to macOS and Linux users.
.. tab-set::

View File

@ -11,7 +11,7 @@
Note that the Conda Forge distribution:
* offers both C++ and Python APIs
* offers both C/C++ and Python APIs
* does not offer support for GNA and NPU inference
* is dedicated to users of all major OSs: Windows, Linux, macOS.

View File

@ -11,7 +11,7 @@
Note that the Archive distribution:
* offers both C++ and Python APIs
* offers both C/C++ and Python APIs
* additionally includes code samples
* is dedicated to users of all major OSs: Windows, Linux, macOS
* may offer different hardware support under different operating systems
@ -19,20 +19,17 @@
.. dropdown:: Inference Options
=================== ===== ===== ===== ===== ======== ============= ======== ========
Operating System CPU GPU GNA NPU AUTO Auto-batch HETERO MULTI
=================== ===== ===== ===== ===== ======== ============= ======== ========
Debian9 armhf V n/a n/a n/a V V V n/a
Debian9 arm64 V n/a n/a n/a V V V n/a
CentOS7 x86_64 V V V n/a V V V V
Ubuntu18 x86_64 V V V n/a V V V V
Ubuntu20 x86_64 V V V V V V V V
Ubuntu22 x86_64 V V V V V V V V
RHEL8 x86_64 V V V n/a V V V V
Windows x86_64 V V V V V V V V
MacOS x86_64 V n/a n/a n/a V V V n/a
MacOS arm64 V n/a n/a n/a V V V n/a
=================== ===== ===== ===== ===== ======== ============= ======== ========
=================== ===== ===== ===== =====
Operating System CPU GPU GNA NPU
=================== ===== ===== ===== =====
Debian9 armhf V n/a n/a n/a
Debian9 arm64 V n/a n/a n/a
CentOS7 x86_64 V V n/a n/a
Ubuntu18 x86_64 V V V n/a
Ubuntu20 x86_64 V V V V
Ubuntu22 x86_64 V V V V
RHEL8 x86_64 V V V n/a
=================== ===== ===== ===== =====
@ -228,13 +225,15 @@ Step 1: Download and Install the OpenVINO Core Components
Unlink the previous link with ``sudo unlink openvino_2023``, and then re-run the command above.
Congratulations, you have finished the installation! The ``/opt/intel/openvino_2023`` folder now contains
the core components for OpenVINO. If you used a different path in Step 2, for example, ``/home/<USER>/intel/``,
OpenVINO is now in ``/home/<USER>/intel/openvino_2023``. The path to the ``openvino_2023`` directory is
also referred to as ``<INSTALL_DIR>`` throughout the OpenVINO documentation.
Congratulations, you have finished the installation! For some use cases you may still
need to install additional components. Check the description below, as well as the
:doc:`list of additional configurations <openvino_docs_install_guides_configurations_header>`
to see if your case needs any of them.
The ``/opt/intel/openvino_2023`` folder now contains the core components for OpenVINO.
If you used a different path in Step 2, for example, ``/home/<USER>/intel/``,
OpenVINO is now in ``/home/<USER>/intel/openvino_2023``. The path to the ``openvino_2023``
directory is also referred to as ``<INSTALL_DIR>`` throughout the OpenVINO documentation.
Step 2: Configure the Environment
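On Linux, this step typically means sourcing the setup script from the install directory (the default path is shown below; adjust it if you installed elsewhere):

```shell
source /opt/intel/openvino_2023/setupvars.sh
```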
@ -263,6 +262,7 @@ The environment variables are set.
What's Next?
############################################################

View File

@ -11,28 +11,9 @@
Note that the Archive distribution:
* offers both C++ and Python APIs
* offers both C/C++ and Python APIs
* additionally includes code samples
* is dedicated to users of all major OSs: Windows, Linux, macOS
* may offer different hardware support under different operating systems
(see the drop-down below for more details)
.. dropdown:: Inference Options
=================== ===== ===== ===== ===== ======== ============= ======== ========
Operating System CPU GPU GNA NPU AUTO Auto-batch HETERO MULTI
=================== ===== ===== ===== ===== ======== ============= ======== ========
Debian9 armhf V n/a n/a n/a V V V n/a
Debian9 arm64 V n/a n/a n/a V V V n/a
CentOS7 x86_64 V V V n/a V V V V
Ubuntu18 x86_64 V V V n/a V V V V
Ubuntu20 x86_64 V V V V V V V V
Ubuntu22 x86_64 V V V V V V V V
RHEL8 x86_64 V V V n/a V V V V
Windows x86_64 V V V V V V V V
MacOS x86_64 V n/a n/a n/a V V V n/a
MacOS arm64 V n/a n/a n/a V V V n/a
=================== ===== ===== ===== ===== ======== ============= ======== ========
.. tab-set::
@ -126,10 +107,15 @@ Step 1: Install OpenVINO Core Components
If you have already installed a previous release of OpenVINO 2023, a symbolic link to the ``openvino_2023`` folder may already exist. Unlink the previous link with ``sudo unlink openvino_2023``, and then re-run the command above.
Congratulations, you have finished the installation! The ``/opt/intel/openvino_2023`` folder now contains
the core components for OpenVINO. If you used a different path in Step 2, for example, ``/home/<USER>/intel/``,
OpenVINO is now in ``/home/<USER>/intel/openvino_2023``. The path to the ``openvino_2023`` directory is
also referred to as ``<INSTALL_DIR>`` throughout the OpenVINO documentation.
Congratulations, you have finished the installation! For some use cases you may still
need to install additional components. Check the description below, as well as the
:doc:`list of additional configurations <openvino_docs_install_guides_configurations_header>`
to see if your case needs any of them.
The ``/opt/intel/openvino_2023`` folder now contains the core components for OpenVINO.
If you used a different path in Step 2, for example, ``/home/<USER>/intel/``,
OpenVINO is now in ``/home/<USER>/intel/openvino_2023``. The path to the ``openvino_2023``
directory is also referred to as ``<INSTALL_DIR>`` throughout the OpenVINO documentation.
Step 2: Configure the Environment

View File

@ -11,28 +11,9 @@
Note that the Archive distribution:
* offers both C++ and Python APIs
* offers both C/C++ and Python APIs
* additionally includes code samples
* is dedicated to users of all major OSs: Windows, Linux, macOS
* may offer different hardware support under different operating systems
(see the drop-down below for more details)
.. dropdown:: Inference Options
=================== ===== ===== ===== ===== ======== ============= ======== ========
Operating System CPU GPU GNA NPU AUTO Auto-batch HETERO MULTI
=================== ===== ===== ===== ===== ======== ============= ======== ========
Debian9 armhf V n/a n/a n/a V V V n/a
Debian9 arm64 V n/a n/a n/a V V V n/a
CentOS7 x86_64 V V V n/a V V V V
Ubuntu18 x86_64 V V V n/a V V V V
Ubuntu20 x86_64 V V V V V V V V
Ubuntu22 x86_64 V V V V V V V V
RHEL8 x86_64 V V V n/a V V V V
Windows x86_64 V V V V V V V V
MacOS x86_64 V n/a n/a n/a V V V n/a
MacOS arm64 V n/a n/a n/a V V V n/a
=================== ===== ===== ===== ===== ======== ============= ======== ========
System Requirements
@ -148,7 +129,17 @@ Step 1: Download and Install OpenVINO Core Components
If you have already installed a previous release of OpenVINO 2022, a symbolic link to the ``openvino_2023`` folder may already exist. If you want to override it, navigate to the ``C:\Program Files (x86)\Intel`` folder and delete the existing linked folder before running the ``mklink`` command.
Congratulations, you finished the installation! The ``C:\Program Files (x86)\Intel\openvino_2023`` folder now contains the core components for OpenVINO. If you used a different path in Step 1, you will find the ``openvino_2023`` folder there. The path to the ``openvino_2023`` directory is also referred to as ``<INSTALL_DIR>`` throughout the OpenVINO documentation.
Congratulations, you have finished the installation! For some use cases you may still
need to install additional components. Check the description below, as well as the
:doc:`list of additional configurations <openvino_docs_install_guides_configurations_header>`
to see if your case needs any of them.
The ``C:\Program Files (x86)\Intel\openvino_2023`` folder now contains the core components for OpenVINO.
If you used a different path in Step 1, you will find the ``openvino_2023`` folder there.
The path to the ``openvino_2023`` directory is also referred to as ``<INSTALL_DIR>``
throughout the OpenVINO documentation.
.. _set-the-environment-variables-windows:

View File

@ -17,7 +17,7 @@
Use APT <openvino_docs_install_guides_installing_openvino_apt>
Use YUM <openvino_docs_install_guides_installing_openvino_yum>
Use Conda Forge <openvino_docs_install_guides_installing_openvino_conda>
Use VCPKG <openvino_docs_install_guides_installing_openvino_vcpkg>
Use vcpkg <openvino_docs_install_guides_installing_openvino_vcpkg>
Use Homebrew <openvino_docs_install_guides_installing_openvino_brew>
Use Docker <openvino_docs_install_guides_installing_openvino_docker>

View File

@ -16,7 +16,7 @@
Using Homebrew <openvino_docs_install_guides_installing_openvino_brew>
From PyPI <openvino_docs_install_guides_installing_openvino_pip>
Using Conda Forge <openvino_docs_install_guides_installing_openvino_conda>
Use VCPKG <openvino_docs_install_guides_installing_openvino_vcpkg>
Use vcpkg <openvino_docs_install_guides_installing_openvino_vcpkg>
If you want to install OpenVINO™ Runtime on macOS, there are a few ways to accomplish this. We have prepared the following options for you:

View File

@ -1,4 +1,4 @@
# Install Intel® Distribution of OpenVINO™ Toolkit {#openvino_docs_install_guides_overview}
# Install OpenVINO™ 2023.1 {#openvino_docs_install_guides_overview}
@sphinxdirective
@ -13,8 +13,7 @@
OpenVINO Runtime on Linux <openvino_docs_install_guides_installing_openvino_linux_header>
OpenVINO Runtime on Windows <openvino_docs_install_guides_installing_openvino_windows_header>
OpenVINO Runtime on macOS <openvino_docs_install_guides_installing_openvino_macos_header>
-OpenVINO Development Tools <openvino_docs_install_guides_install_dev_tools>
Create a Yocto Image <openvino_docs_install_guides_installing_openvino_yocto>
@@ -49,16 +48,12 @@
.. dropdown:: Distribution Comparison for OpenVINO 2023.1
=============== ========== ====== ========= ======== ============ ==========
-Device           Archives   PyPI   APT/YUM   Conda    Homebrew     VCPKG
+Device           Archives   PyPI   APT/YUM   Conda    Homebrew     vcpkg
=============== ========== ====== ========= ======== ============ ==========
CPU              V          V      V         V        V            V
GPU              V          V      V         V        V            V
-GNA              V          V      V         V        V            V
-NPU              V          V      V         V        V            V
Auto             V          V      V         V        V            V
Auto-Batch       V          V      V         V        V            V
Hetero           V          n/a    n/a       n/a      n/a          n/a
Multi            V          n/a    n/a       n/a      n/a          n/a
+GNA              V          n/a    n/a       n/a      n/a          n/a
+NPU              V          n/a    n/a       n/a      n/a          n/a
=============== ========== ====== ========= ======== ============ ==========
| **Build OpenVINO from source**

View File

@@ -1,19 +1,18 @@
-# Install OpenVINO™ Runtime via VCPKG {#openvino_docs_install_guides_installing_openvino_vcpkg}
+# Install OpenVINO™ Runtime via vcpkg {#openvino_docs_install_guides_installing_openvino_vcpkg}
@sphinxdirective
.. meta::
:description: Learn how to install OpenVINO™ Runtime on Windows, Linux, and macOS
-operating systems, using VCPKG.
+operating systems, using vcpkg.
.. note::
-Note that the VCPKG distribution:
+Note that the vcpkg distribution:
-* offers C++ API only
+* offers C/C++ API only
* does not offer support for GNA and NPU inference
* is dedicated to users of all major OSs: Windows, Linux, macOS.
* may offer different hardware support under different operating systems.
.. tab-set::
@@ -39,8 +38,8 @@
Installing OpenVINO Runtime
###########################
-1. Make sure that you have installed VCPKG on your system. If not, follow the
-   `VCPKG installation instructions <https://vcpkg.io/en/getting-started>`__.
+1. Make sure that you have installed vcpkg on your system. If not, follow the
+   `vcpkg installation instructions <https://vcpkg.io/en/getting-started>`__.
2. Install OpenVINO using the following terminal command:
@@ -49,29 +48,35 @@ Installing OpenVINO Runtime
vcpkg install openvino
-VCPKG also enables you to install only selected components, by specifying them in the command.
+vcpkg also enables you to install only selected components, by specifying them in the command.
See the list of `available features <https://vcpkg.link/ports/openvino>`__. For example:
.. code-block:: sh
vcpkg install openvino[cpu,ir]
-Note that the VCPKG installation means building all packages and dependencies from source,
+Note that the vcpkg installation means building all packages and dependencies from source,
which means the compiler stage will require additional time to complete the process.
After installation, you can use OpenVINO in your product by running:
.. code-block:: sh
find_package(OpenVINO)
.. code-block:: sh
cmake -B [build directory] -S . -DCMAKE_TOOLCHAIN_FILE=[path to vcpkg]/scripts/buildsystems/vcpkg.cmake
Congratulations! You've just installed OpenVINO! For some use cases you may still
need to install additional components. Check the
:doc:`list of additional configurations <openvino_docs_install_guides_configurations_header>`
to see if your case needs any of them.
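The ``find_package`` call above is usually consumed from a small ``CMakeLists.txt``. A minimal sketch, where the project name, target name, and ``main.cpp`` are hypothetical placeholders (``openvino::runtime`` is the target name OpenVINO's CMake package typically exports, but verify it for your version):

```cmake
cmake_minimum_required(VERSION 3.13)
project(ov_hello)

# Resolved via the vcpkg toolchain file passed with -DCMAKE_TOOLCHAIN_FILE
find_package(OpenVINO REQUIRED)

add_executable(ov_hello main.cpp)
target_link_libraries(ov_hello PRIVATE openvino::runtime)
```

Configure with the ``cmake -B ... -DCMAKE_TOOLCHAIN_FILE=...`` command shown above so CMake can locate the vcpkg-built package.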
Uninstalling OpenVINO
#####################
-To uninstall OpenVINO via VCPKG, use the following command:
+To uninstall OpenVINO via vcpkg, use the following command:
.. code-block:: sh

View File

@@ -15,7 +15,7 @@
Use Archive <openvino_docs_install_guides_installing_openvino_from_archive_windows>
Use PyPI <openvino_docs_install_guides_installing_openvino_pip>
Use Conda Forge <openvino_docs_install_guides_installing_openvino_conda>
-Use VCPKG <openvino_docs_install_guides_installing_openvino_vcpkg>
+Use vcpkg <openvino_docs_install_guides_installing_openvino_vcpkg>
Use Docker <openvino_docs_install_guides_installing_openvino_docker>

View File

@@ -10,7 +10,7 @@
Note that the YUM distribution:
-* offers both C++ and Python APIs
+* offers C/C++ APIs only
* does not offer support for GNA and NPU inference
* additionally includes code samples
* is dedicated to Linux users.
@@ -129,7 +129,9 @@ Run the following command:
yum list installed 'openvino*'
+.. note::
+   You can additionally install Python API using one of the alternative methods (:doc:`conda <openvino_docs_install_guides_installing_openvino_conda>` or :doc:`pip <openvino_docs_install_guides_installing_openvino_pip>`).
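As the note above suggests, layering the Python API on top of a YUM-installed runtime comes down to an ordinary pip installation. A minimal sketch (the smoke-test import path follows the 2023.x Python API and is an assumption here):

```sh
# Sketch: add the OpenVINO Python API next to a YUM-installed runtime
python3 -m pip install openvino

# Optional smoke test: print the runtime version string
python3 -c "from openvino.runtime import get_version; print(get_version())"
```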
Congratulations! You've just installed OpenVINO! For some use cases you may still
need to install additional components. Check the

View File

@@ -7,7 +7,6 @@
.. toctree::
:maxdepth: 1
:hidden:
-:caption: Pre-Trained Models
omz_models_group_intel
omz_models_group_public
@@ -15,14 +14,15 @@
.. toctree::
:maxdepth: 1
:hidden:
-:caption: Demo Applications
+omz_tools_downloader
+omz_tools_accuracy_checker
+omz_data_datasets
omz_demos
.. toctree::
:maxdepth: 1
:hidden:
-:caption: Model API
omz_model_api_ovms_adapter

View File

@@ -8,7 +8,6 @@
basic_quantization_flow
quantization_w_accuracy_control
-pot_introduction
Post-training model optimization is the process of applying special methods that transform the model into a more hardware-friendly representation without retraining or fine-tuning. The most popular and widely-spread method here is 8-bit post-training quantization because it is:
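The 8-bit idea above can be made concrete with a deliberately simplified sketch: symmetric per-tensor quantization derives one scale from the largest absolute value and maps every float to an integer in [-128, 127]. This illustrates the concept only; it is not the actual algorithm OpenVINO or NNCF applies:

```python
# Toy symmetric 8-bit quantization -- illustrates the concept only,
# not OpenVINO's actual post-training quantization algorithm.

def quantize_int8(values):
    """Map floats to int8 codes using one shared (per-tensor) scale."""
    max_abs = max(abs(v) for v in values) or 1.0
    scale = max_abs / 127.0
    codes = [max(-128, min(127, round(v / scale))) for v in values]
    return codes, scale

def dequantize(codes, scale):
    """Recover approximate float values from int8 codes."""
    return [c * scale for c in codes]

weights = [0.5, -1.27, 0.0, 1.27]
codes, scale = quantize_int8(weights)
approx = dequantize(codes, scale)  # close to the original weights
```

Production post-training quantization differs in important ways: scales for activations are estimated from calibration data, granularity may be per-channel, and accuracy-control modes can keep sensitive layers in floating point, which this toy version omits.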

View File

@@ -1,15 +0,0 @@
-# Tuning Utilities {#openvino_docs_tuning_utilities}
-@sphinxdirective
-.. meta::
-   :description: Get to know Accuracy Checker - a deep learning accuracy validation framework and other tuning utilities found in OpenVINO™ toolkit.
-.. toctree::
-   :maxdepth: 1
-   :caption: Tuning Utilities
-   omz_tools_accuracy_checker
-   omz_data_datasets
-@endsphinxdirective