[OV-DOC] Issues with pypi-openvino-dev.md (#6343)

* [OV-DOC] Issues with pypi-openvino-dev.md
* add example for pip install on Ubuntu
* Update docs/install_guides/pypi-openvino-dev.md

Co-authored-by: Tatiana Savina <tatiana.savina@intel.com>

This commit is contained in:
parent b893eae9e7
commit 79a6ad0c0f
@@ -15,11 +15,11 @@ The **developer package** includes the following components installed by default

| Component | Console Script | Description |
|------------------|---------------------------------------------------------------------------------|---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
-| [Model Optimizer](https://docs.openvinotoolkit.org/latest/openvino_docs_MO_DG_Deep_Learning_Model_Optimizer_DevGuide.html) | `mo` |**Model Optimizer** imports, converts, and optimizes models that were trained in popular frameworks to a format usable by Intel tools, especially the Inference Engine. <br>Popular frameworks include Caffe\*, TensorFlow\*, MXNet\*, and ONNX\*. |
+| [Model Optimizer](https://docs.openvinotoolkit.org/latest/openvino_docs_MO_DG_Deep_Learning_Model_Optimizer_DevGuide.html) | `mo` |**Model Optimizer** imports, converts, and optimizes models that were trained in popular frameworks to a format usable by Intel tools, especially the Inference Engine. <br>Supported frameworks include Caffe\*, TensorFlow\*, MXNet\*, and ONNX\*. |
| [Benchmark Tool](https://docs.openvinotoolkit.org/latest/openvino_inference_engine_tools_benchmark_tool_README.html)| `benchmark_app` | **Benchmark Application** allows you to estimate deep learning inference performance on supported devices for synchronous and asynchronous modes. |
-| [Accuracy Checker](https://docs.openvinotoolkit.org/latest/omz_tools_accuracy_checker.html) and <br> [Annotation Converter](https://docs.openvinotoolkit.org/latest/omz_tools_accuracy_checker_annotation_converters.html) | `accuracy_check` <br> `convert_annotation` |**Accuracy Checker** is a deep learning accuracy validation tool that allows you to collect accuracy metrics against popular datasets. The main advantages of the tool are the flexibility of configuration and an impressive set of supported datasets, preprocessing, postprocessing, and metrics. <br> **Annotation Converter** is a utility for offline conversion of datasets to the format suitable for metric evaluation used in Accuracy Checker. |
+| [Accuracy Checker](https://docs.openvinotoolkit.org/latest/omz_tools_accuracy_checker.html) and <br> [Annotation Converter](https://docs.openvinotoolkit.org/latest/omz_tools_accuracy_checker_annotation_converters.html) | `accuracy_check` <br> `convert_annotation` |**Accuracy Checker** is a deep learning accuracy validation tool that allows you to collect accuracy metrics against popular datasets. The main advantages of the tool are the flexibility of configuration and a set of supported datasets, preprocessing, postprocessing, and metrics. <br> **Annotation Converter** is a utility that prepares datasets for evaluation with Accuracy Checker. |
| [Post-Training Optimization Tool](https://docs.openvinotoolkit.org/latest/pot_README.html)| `pot` |**Post-Training Optimization Tool** allows you to optimize trained models with advanced capabilities, such as quantization and low-precision optimizations, without the need to retrain or fine-tune models. Optimizations are also available through the [API](https://docs.openvinotoolkit.org/latest/pot_compression_api_README.html). |
-| [Model Downloader and other Open Model Zoo tools](https://docs.openvinotoolkit.org/latest/omz_tools_downloader.html)| `omz_downloader` <br> `omz_converter` <br> `omz_quantizer` <br> `omz_info_dumper`| **Model Downloader** is a tool for getting access to the collection of high-quality and extremely fast pre-trained deep learning [public](https://docs.openvinotoolkit.org/latest/omz_models_group_public.html) and [intel](https://docs.openvinotoolkit.org/latest/omz_models_group_intel.html)-trained models. Use these free pre-trained models instead of training your own models to speed up the development and production deployment process. The principle of the tool is as follows: it downloads model files from online sources and, if necessary, patches them with Model Optimizer to make them more usable. A number of additional tools are also provided to automate the process of working with downloaded models:<br> **Model Converter** is a tool for converting the models stored in a format other than the Intermediate Representation (IR) into that format using Model Optimizer. <br> **Model Quantizer** is a tool for automatic quantization of full-precision IR models into low-precision versions using Post-Training Optimization Tool. <br> **Model Information Dumper** is a helper utility for dumping information about the models in a stable machine-readable format.|
+| [Model Downloader and other Open Model Zoo tools](https://docs.openvinotoolkit.org/latest/omz_tools_downloader.html)| `omz_downloader` <br> `omz_converter` <br> `omz_quantizer` <br> `omz_info_dumper`| **Model Downloader** is a tool for getting access to the collection of high-quality and extremely fast pre-trained deep learning [public](https://docs.openvinotoolkit.org/latest/omz_models_group_public.html) and [Intel](https://docs.openvinotoolkit.org/latest/omz_models_group_intel.html)-trained models. These free pre-trained models can be used to speed up the development and production deployment process without training your own models. The tool downloads model files from online sources and, if necessary, patches them to make them more usable with Model Optimizer. A number of additional tools are also provided to automate the process of working with downloaded models:<br> **Model Converter** is a tool for converting Open Model Zoo models that are stored in an original deep learning framework format into the Inference Engine Intermediate Representation (IR) using Model Optimizer. <br> **Model Quantizer** is a tool for automatic quantization of full-precision models in the IR format into low-precision versions using the Post-Training Optimization Tool. <br> **Model Information Dumper** is a helper utility for dumping information about the models to a stable, machine-readable format.|
**Developer package** also provides the **runtime package** installed as a dependency. The runtime package includes the following components:
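The console scripts listed in the table above become available on `PATH` once `openvino-dev` is installed; a minimal sanity-check sketch, assuming an activated environment and the usual `--help`/`-h` conventions of these tools:

```sh
# Verify that the developer-package entry points are installed
mo --help              # Model Optimizer
benchmark_app -h       # Benchmark Tool
accuracy_check -h      # Accuracy Checker
pot -h                 # Post-Training Optimization Tool
omz_downloader --help  # Open Model Zoo Model Downloader
```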
@@ -54,17 +54,14 @@ To avoid dependency conflicts, use a virtual environment. Skip this

Create virtual environment:

On Linux and macOS:
```sh
# Depending on your OS, this step may require installing python3-venv
python3 -m venv openvino_env
```

On Windows:
```sh
python -m pip install --user virtualenv
python -m venv openvino_env
```

> **NOTE**: On Linux and macOS, you may need to type `python3` instead of
`python`. You may also need to [install pip](https://pip.pypa.io/en/stable/installing/). For example, on Ubuntu execute the following command to get pip installed: `sudo apt install python3-venv python3-pip`.

### Step 2. Activate Virtual Environment

On Linux and macOS:
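Step 2 above activates the environment created in these commands; for reference, a minimal sketch of the standard `venv` activation commands, assuming the `openvino_env` directory shown above:

```sh
# On Linux and macOS
source openvino_env/bin/activate

# On Windows (Command Prompt)
openvino_env\Scripts\activate
```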
@@ -87,11 +84,11 @@ python -m pip install --upgrade pip

To install and configure the components of the development package for working with specific frameworks, use the `pip install openvino-dev[extras]` command, where `extras` is a list of extras from the table below:

| DL Framework                                                                       | Extra                            |
| :------------------------------------------------------------------------------- | :-------------------------------|
| [Caffe*](https://caffe.berkeleyvision.org/)                                        | caffe                            |
-| [Caffe2*](https://caffe2.ai/)                                                      | caffe2                           |
-| [Kaldi*](https://kaldi-asr.org/)                                                   | kaldi                            |
+| [Caffe2*](https://github.com/pytorch/pytorch)                                      | caffe2                           |
+| [Kaldi*](https://github.com/kaldi-asr/kaldi)                                       | kaldi                            |
| [MXNet*](https://mxnet.apache.org/)                                                | mxnet                            |
| [ONNX*](https://github.com/microsoft/onnxruntime/)                                 | onnx                             |
| [PyTorch*](https://pytorch.org/)                                                   | pytorch                          |
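As an illustration of the `pip install openvino-dev[extras]` syntax described above, with extras taken from the table (the quotes only guard against shell globbing, e.g. in zsh):

```sh
# Install the developer package with ONNX and PyTorch framework support
pip install "openvino-dev[onnx,pytorch]"
```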
@@ -51,7 +51,7 @@ python -m venv openvino_env
```

> **NOTE**: On Linux and macOS, you may need to type `python3` instead of
-`python`. You may also need to [install pip](https://pip.pypa.io/en/stable/installing/).
+`python`. You may also need to [install pip](https://pip.pypa.io/en/stable/installing/). For example, on Ubuntu execute the following command to get pip installed: `sudo apt install python3-venv python3-pip`.

### Step 2. Activate Virtual Environment