openvino/docs/MO_DG/prepare_model/convert_model/Converting_Model.md

# Converting a Model to Intermediate Representation (IR)

Use the `mo.py` script from the `<INSTALL_DIR>/deployment_tools/model_optimizer` directory to run the Model Optimizer and convert the model to the Intermediate Representation (IR). The simplest way to convert a model is to run `mo.py` with a path to the input model file and an output directory where you have write permissions:

```sh
python3 mo.py --input_model INPUT_MODEL --output_dir <OUTPUT_MODEL_DIR>
```

> **NOTE**: Some models require additional arguments to specify conversion parameters, such as `--scale`, `--scale_values`, `--mean_values`, and `--mean_file`. To learn when you need to use these parameters, refer to Converting a Model Using General Conversion Parameters.
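As a rough illustration of what the mean and scale parameters do: they describe a per-channel normalization that the Model Optimizer folds into the converted model, so the IR normalizes inputs itself instead of requiring extra preprocessing at inference time. The sketch below assumes the usual subtract-then-divide convention; the numeric values are illustrative and not part of this document.

```python
# Hedged sketch (NOT Model Optimizer code): the per-channel arithmetic
# implied by --mean_values and --scale_values, assuming the common
# (value - mean) / scale convention. Values below are illustrative only.
mean_values = [123.68, 116.78, 103.94]   # example per-channel means
scale_values = [58.395, 57.12, 57.375]   # example per-channel scales

def normalize(pixel, means=mean_values, scales=scale_values):
    """Normalize one pixel per channel, as the folded IR input stage would."""
    return [(p - m) / s for p, m, s in zip(pixel, means, scales)]

print(normalize([255.0, 116.78, 0.0]))
```

With parameters like these baked into the IR, the application can feed raw pixel values directly to the model.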

The `mo.py` script is the universal entry point: it deduces the framework that produced the input model from the standard extension of the model file:

* `.caffemodel` - Caffe* models
* `.pb` - TensorFlow* models
* `.params` - MXNet* models
* `.onnx` - ONNX* models
* `.nnet` - Kaldi* models

If the model files do not have standard extensions, you can use the `--framework {tf,caffe,kaldi,onnx,mxnet}` option to specify the framework type explicitly.

For example, the following commands are equivalent:

```sh
python3 mo.py --input_model /user/models/model.pb
python3 mo.py --framework tf --input_model /user/models/model.pb
```
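The extension-based deduction described above can be sketched as a simple lookup with an explicit override. This is an illustrative snippet, not the actual `mo.py` implementation; the function name is hypothetical.

```python
# Illustrative sketch (NOT the real Model Optimizer source): deducing the
# source framework from the model file extension, with --framework as an
# explicit override when the extension is non-standard.
import os

# Mapping taken from the extension list above.
EXTENSION_TO_FRAMEWORK = {
    ".caffemodel": "caffe",
    ".pb": "tf",
    ".params": "mxnet",
    ".onnx": "onnx",
    ".nnet": "kaldi",
}

def deduce_framework(model_path, explicit_framework=None):
    """Return the framework name, preferring an explicit --framework value."""
    if explicit_framework:
        return explicit_framework
    ext = os.path.splitext(model_path)[1].lower()
    try:
        return EXTENSION_TO_FRAMEWORK[ext]
    except KeyError:
        raise ValueError(
            f"Cannot deduce framework from '{ext}'; pass --framework explicitly")

print(deduce_framework("/user/models/model.pb"))  # -> tf
```

This mirrors the behavior of the two equivalent commands above: the first relies on the `.pb` extension, the second states the framework explicitly.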

To adjust the conversion process, you can use the general parameters described in Converting a Model Using General Conversion Parameters, as well as the framework-specific parameters for Caffe, TensorFlow, MXNet, ONNX, and Kaldi.

## See Also