# Model Creation C++ Sample {#openvino_inference_engine_samples_model_creation_sample_README}
This sample demonstrates how to run synchronous inference on a [model](../../../docs/OV_Runtime_UG/model_representation.md) built on the fly. The model uses weights from the LeNet classification model, which is known to work well on digit classification tasks.
You do not need an XML file to create a model. The `ov::Model` API allows you to create a model on the fly from source code.
The following C++ API is used in the application:
| Feature | API | Description |
| :--- | :--- | :--- |
| OpenVINO Runtime Info | `ov::Core::get_versions` | Get device plugin versions |
| Shape Operations | `ov::Output::get_shape` , `ov::Shape::size` , `ov::shape_size` | Operate with shape |
| Tensor Operations | `ov::Tensor::get_byte_size` , `ov::Tensor::data` | Get tensor byte size and its data |
| Model Operations | `ov::set_batch` | Operate with model batch size |
| Infer Request Operations | `ov::InferRequest::get_input_tensor` | Get an input tensor |
| Model creation objects | `ov::opset8::Parameter` , `ov::Node::output` , `ov::opset8::Constant` , `ov::opset8::Convolution` , `ov::opset8::Add` , `ov::opset1::MaxPool` , `ov::opset8::Reshape` , `ov::opset8::MatMul` , `ov::opset8::Relu` , `ov::opset8::Softmax` , `ov::descriptor::Tensor::set_names` , `ov::opset8::Result` , `ov::Model` , `ov::ParameterVector::vector` | Used to construct an OpenVINO model |
Basic OpenVINO™ Runtime API is covered by [Hello Classification C++ sample](../hello_classification/README.md).
| Options | Values |
| :--- | :--- |
| Validated Models | LeNet |
| Model Format | model weights file (\*.bin) |
| Validated images | single-channel `MNIST ubyte` images |
| Supported devices | [All](../../../docs/OV_Runtime_UG/supported_plugins/Supported_Devices.md) |
| Other language realization | [Python](../../../samples/python/model_creation_sample/README.md) |
## How It Works
At startup, the sample application does the following:
- Reads command line parameters
- [Builds a model](../../../docs/OV_Runtime_UG/model_representation.md) from the passed weights file
- Loads the model and input data to the OpenVINO™ Runtime plugin
- Performs synchronous inference and processes output data, logging each step in a standard output stream
For an explicit description of each sample step, see the [Integration Steps](../../../docs/OV_Runtime_UG/Integrate_with_customer_application_new_API.md) section of the "Integrate the OpenVINO™ Runtime with Your Application" guide.
## Building
To build the sample, use the instructions in the [Build the Sample Applications](../../../docs/OV_Runtime_UG/Samples_Overview.md) section of the OpenVINO™ Toolkit Samples guide.
## Running
```
model_creation_sample <path_to_lenet_weights> <device>
```
> **NOTES**:
>
> - You can use the LeNet model weights file `lenet.bin` (FP32) located in the sample folder.
> - The `lenet.bin` FP32 weights file was generated by the [Model Optimizer](../../../docs/MO_DG/Deep_Learning_Model_Optimizer_DevGuide.md) tool from the public LeNet model, with the `--input_shape [64,1,28,28]` parameter specified.
>
> The original model is available in the [Caffe* repository](https://github.com/BVLC/caffe/tree/master/examples/mnist) on GitHub\*.

You can perform inference on a GPU using the pre-trained LeNet weights with the following command:
```
model_creation_sample lenet.bin GPU
```
## Sample Output
The sample application logs each step in a standard output stream and prints the top-1 classification result for each image in the batch.
```
[ INFO ] OpenVINO Runtime version ......... <version>
[ INFO ] Build ........... <build>
[ INFO ]
[ INFO ] Device info:
[ INFO ] GPU
[ INFO ] Intel GPU plugin version ......... <version>
[ INFO ] Build ........... <build>
[ INFO ]
[ INFO ]
[ INFO ] Create model from weights: lenet.bin
[ INFO ] model name: lenet
[ INFO ] inputs
[ INFO ] input name: NONE
[ INFO ] input type: f32
[ INFO ] input shape: {64, 1, 28, 28}
[ INFO ] outputs
[ INFO ] output name: output_tensor
[ INFO ] output type: f32
[ INFO ] output shape: {64, 10}
[ INFO ] Batch size is 10
[ INFO ] model name: lenet
[ INFO ] inputs
[ INFO ] input name: NONE
[ INFO ] input type: u8
[ INFO ] input shape: {10, 28, 28, 1}
[ INFO ] outputs
[ INFO ] output name: output_tensor
[ INFO ] output type: f32
[ INFO ] output shape: {10, 10}
[ INFO ] Compiling a model for the GPU device
[ INFO ] Create infer request
[ INFO ] Combine images in batch and set to input tensor
[ INFO ] Start sync inference
[ INFO ] Processing output tensor
Top 1 results:
Image 0
classid probability label
------- ----------- -----
0 1.0000000 0
Image 1
classid probability label
------- ----------- -----
1 1.0000000 1
Image 2
classid probability label
------- ----------- -----
2 1.0000000 2
Image 3
classid probability label
------- ----------- -----
3 1.0000000 3
Image 4
classid probability label
------- ----------- -----
4 1.0000000 4
Image 5
classid probability label
------- ----------- -----
5 1.0000000 5
Image 6
classid probability label
------- ----------- -----
6 1.0000000 6
Image 7
classid probability label
------- ----------- -----
7 1.0000000 7
Image 8
classid probability label
------- ----------- -----
8 1.0000000 8
Image 9
classid probability label
------- ----------- -----
9 1.0000000 9
```
## Deprecation Notice
<table>
  <tr>
    <td><strong>Deprecation Begins</strong></td>
    <td>June 1, 2020</td>
  </tr>
  <tr>
    <td><strong>Removal Date</strong></td>
    <td>December 1, 2020</td>
  </tr>
</table>
## See Also
- [Integrate the OpenVINO™ Runtime with Your Application](../../../docs/OV_Runtime_UG/Integrate_with_customer_application_new_API.md)
- [Using OpenVINO™ Toolkit Samples](../../../docs/OV_Runtime_UG/Samples_Overview.md)
- [Model Optimizer](../../../docs/MO_DG/Deep_Learning_Model_Optimizer_DevGuide.md)