[C API DOC] update C API user guide pipeline (#13290)
* [C API DOC] update C API user guide pipeline
Signed-off-by: xuejun <Xuejun.Zhai@intel.com>
* [C API DOC] C API user guide pipeline add Link & Build part for C
Signed-off-by: xuejun <Xuejun.Zhai@intel.com>
* [C API DOC] C API user guide pipeline add objects release for C
Signed-off-by: xuejun <Xuejun.Zhai@intel.com>
* [C API DOC] C API user guide pipeline clear annotates
Signed-off-by: xuejun <Xuejun.Zhai@intel.com>
* [C API DOC] fix build error
Signed-off-by: xuejun <Xuejun.Zhai@intel.com>
* [C API DOC] Reconstruct the guide about integration with OpenVINO Runtime for C
Signed-off-by: xuejun <Xuejun.Zhai@intel.com>
* Revert "[C API DOC] Reconstruct the guide about integration with OpenVINO Runtime for C"
This reverts commit 9552054e7e.
* [C API DOC] split project structure & Cmake
Signed-off-by: xuejun <Xuejun.Zhai@intel.com>
* [C API DOC] using correct input shape
Signed-off-by: xuejun <Xuejun.Zhai@intel.com>
* [C API DOC] align with C++ code example
Signed-off-by: xuejun <Xuejun.Zhai@intel.com>
* [C API DOC] fix build error
Signed-off-by: xuejun <Xuejun.Zhai@intel.com>
This commit is contained in: parent dafa67cf27, commit 4c329fbe5a

@@ -36,6 +36,12 @@ Include next files to work with OpenVINO™ Runtime:

@endsphinxtab

@sphinxtab{C}

@snippet docs/snippets/src/main.c include

@endsphinxtab

@endsphinxtabset

Use the following code to create OpenVINO™ Core to manage available devices and read model objects:

@@ -54,6 +60,12 @@ Use the following code to create OpenVINO™ Core to manage available devices an

@endsphinxtab

@sphinxtab{C}

@snippet docs/snippets/src/main.c part1

@endsphinxtab

@endsphinxtabset

## Step 2. Compile the Model

@@ -128,6 +140,38 @@ Compile the model for a specific device using `ov::Core::compile_model()`:

@endsphinxtab

@sphinxtab{C}

@sphinxtabset

@sphinxtab{IR}

@snippet docs/snippets/src/main.c part2_1

@endsphinxtab

@sphinxtab{ONNX}

@snippet docs/snippets/src/main.c part2_2

@endsphinxtab

@sphinxtab{PaddlePaddle}

@snippet docs/snippets/src/main.c part2_3

@endsphinxtab

@sphinxtab{ov::Model}

@snippet docs/snippets/src/main.c part2_4

@endsphinxtab

@endsphinxtabset

@endsphinxtab

@endsphinxtabset

The `ov::Model` object represents any models inside the OpenVINO™ Runtime.

@@ -155,6 +199,12 @@ To learn how to change the device configuration, read the [Query device properti

@endsphinxtab

@sphinxtab{C}

@snippet docs/snippets/src/main.c part3

@endsphinxtab

@endsphinxtabset

## Step 4. Set Inputs

@@ -175,6 +225,12 @@ You can use external memory to create `ov::Tensor` and use the `ov::InferRequest

@endsphinxtab

@sphinxtab{C}

@snippet docs/snippets/src/main.c part4

@endsphinxtab

@endsphinxtabset
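
The part4 snippet binds a pre-allocated host buffer through `ov_tensor_create_from_host_ptr()`, but leaves `memory_ptr` as `NULL`. A minimal sketch of sizing such a buffer from the queried shape, assuming an f32 input and the `rank`/`dims` members of `ov_shape_t` (a hypothetical helper, not part of this PR):

```c
#include <stdlib.h>
#include <openvino/c/openvino.h>

// Hypothetical helper: allocate a host buffer large enough for an f32 input
// whose dimensions were queried with ov_port_get_shape(), as in part4.
static void* allocate_input_buffer(const ov_shape_t* input_shape) {
    // The element count is the product of all dimensions reported by the input port.
    int64_t element_count = 1;
    for (int64_t i = 0; i < input_shape->rank; ++i) {
        element_count *= input_shape->dims[i];
    }
    // Assumes an f32 element type; use the size matching ov_port_get_element_type() otherwise.
    return malloc((size_t)element_count * sizeof(float));
}
```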

## Step 5. Start Inference

@@ -195,6 +251,12 @@ OpenVINO™ Runtime supports inference in either synchronous or asynchronous mod

@endsphinxtab

@sphinxtab{C}

@snippet docs/snippets/src/main.c part5

@endsphinxtab

@endsphinxtabset
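
The part5 snippet covers the asynchronous mode (`ov_infer_request_start_async()` plus `ov_infer_request_wait()`). For the synchronous mode mentioned above, a single blocking call is enough; a minimal sketch, assuming `ov_infer_request_infer()` and the `OK` value of `ov_status_e` (not part of this PR):

```c
#include <openvino/c/openvino.h>

// Run one blocking inference on a request whose inputs are already set (as in part4).
// Returns 0 on success, non-zero on failure.
static int run_sync_inference(ov_infer_request_t* infer_request) {
    // ov_infer_request_infer() blocks until the results are ready, so no explicit
    // wait is needed, unlike the start_async/wait pair used in part5.
    ov_status_e status = ov_infer_request_infer(infer_request);
    return status == OK ? 0 : 1;
}
```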
This section demonstrates a simple pipeline. To get more information about other ways to perform inference, read the dedicated ["Run inference" section](./ov_infer_request.md).

@@ -217,28 +279,72 @@ Go over the output tensors and process the inference results.

@endsphinxtab

@sphinxtab{C}

@snippet docs/snippets/src/main.c part6

@endsphinxtab

@endsphinxtabset
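
The part6 snippet only fetches the output tensor; going over the results then means reading its host buffer. A minimal sketch, assuming an f32 output and the `ov_tensor_data()`/`ov_tensor_get_size()` calls (a hypothetical helper, not part of this PR):

```c
#include <stdio.h>
#include <openvino/c/openvino.h>

// Print the first few values of the output tensor obtained in part6.
static void print_first_results(ov_tensor_t* output_tensor) {
    // Pointer to the tensor's host memory; assumes an f32 output element type.
    void* data = NULL;
    ov_tensor_data(output_tensor, &data);

    // Total number of elements in the tensor.
    size_t element_count = 0;
    ov_tensor_get_size(output_tensor, &element_count);

    const float* results = (const float*)data;
    for (size_t i = 0; i < element_count && i < 10; ++i) {
        printf("output[%zu] = %f\n", i, results[i]);
    }
}
```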
## Step 7. Link and Build Your Application with OpenVINO™ Runtime (example)

## Step 7. Release the allocated objects (only for C)

This step may differ for different projects. In this example, a C++ application is used, together with CMake for project configuration.

To avoid memory leaks, applications developed with the C API need to release the allocated objects in order.

@sphinxtabset

@sphinxtab{C}

@snippet docs/snippets/src/main.c part8

@endsphinxtab

@endsphinxtabset

## Step 8. Link and Build Your Application with OpenVINO™ Runtime (example)

This step may differ for different projects. In this example, C++ and C applications are used, together with CMake for project configuration.

### Create a structure for the project:

@sphinxtabset

@sphinxtab{C++}

@snippet docs/snippets/src/main.cpp part7

@endsphinxtab

@sphinxtab{C}

@snippet docs/snippets/src/main.c part7

@endsphinxtab

@endsphinxtabset

### Create a CMake Script

For details on additional CMake build options, refer to the [CMake page](https://cmake.org/cmake/help/latest/manual/cmake.1.html#manual:cmake(1)).

### Create a structure for the project:

``` sh
project/
├── CMakeLists.txt - CMake file to build
├── ... - Additional folders like includes/
└── src/ - source folder
    └── main.cpp
build/ - build directory
...
```

@sphinxtabset

### Include OpenVINO™ Runtime libraries in `project/CMakeLists.txt`

@sphinxtab{C++}

@snippet snippets/CMakeLists.txt cmake:integration_example
@snippet snippets/CMakeLists.txt cmake:integration_example_cpp

@endsphinxtab

@sphinxtab{C}

@snippet snippets/CMakeLists.txt cmake:integration_example_c

@endsphinxtab

@endsphinxtabset

### Build Project

To build your project using CMake with the default build tools currently available on your machine, execute the following commands:
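
For the `project/` and `build/` layout shown above, a typical command sequence looks like this (an illustrative sketch, not quoted from the PR):

``` sh
cd build/
cmake ../project
cmake --build .
```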

snippets/CMakeLists.txt
@@ -115,7 +115,7 @@ target_link_libraries(${TARGET_NAME} PRIVATE openvino::runtime openvino::runtime
#

set(TARGET_NAME "ov_integration_snippet")
# [cmake:integration_example]
# [cmake:integration_example_cpp]
cmake_minimum_required(VERSION 3.10)
set(CMAKE_CXX_STANDARD 11)

@@ -124,4 +124,18 @@ find_package(OpenVINO REQUIRED)
add_executable(${TARGET_NAME} src/main.cpp)

target_link_libraries(${TARGET_NAME} PRIVATE openvino::runtime)
# [cmake:integration_example]

# [cmake:integration_example_cpp]

set(TARGET_NAME_C "ov_integration_snippet_c")
# [cmake:integration_example_c]
cmake_minimum_required(VERSION 3.10)
set(CMAKE_CXX_STANDARD 11)

find_package(OpenVINO REQUIRED)

add_executable(${TARGET_NAME_C} src/main.c)

target_link_libraries(${TARGET_NAME_C} PRIVATE openvino::runtime::c)

# [cmake:integration_example_c]

docs/snippets/src/main.c (new file, 100 lines)
@@ -0,0 +1,100 @@
// Copyright (C) 2018-2022 Intel Corporation
// SPDX-License-Identifier: Apache-2.0
//

//! [include]
#include <openvino/c/openvino.h>
//! [include]

int main() {
//! [part1]
ov_core_t* core = NULL;
ov_core_create(&core);
//! [part1]

{
//! [part2_1]
ov_compiled_model_t* compiled_model = NULL;
ov_core_compile_model_from_file(core, "model.xml", "AUTO", 0, &compiled_model);
//! [part2_1]
}
{
//! [part2_2]
ov_compiled_model_t* compiled_model = NULL;
ov_core_compile_model_from_file(core, "model.onnx", "AUTO", 0, &compiled_model);
//! [part2_2]
}
{
//! [part2_3]
ov_compiled_model_t* compiled_model = NULL;
ov_core_compile_model_from_file(core, "model.pdmodel", "AUTO", 0, &compiled_model);
//! [part2_3]
}

//! [part2_4]
// Construct a model
ov_model_t* model = NULL;
ov_core_read_model(core, "model.xml", NULL, &model);
ov_compiled_model_t* compiled_model = NULL;
ov_core_compile_model(core, model, "AUTO", 0, &compiled_model);
//! [part2_4]

//! [part3]
ov_infer_request_t* infer_request = NULL;
ov_compiled_model_create_infer_request(compiled_model, &infer_request);
//! [part3]

void* memory_ptr = NULL;
//! [part4]
// Get input port for model with one input
ov_output_const_port_t* input_port = NULL;
ov_compiled_model_input(compiled_model, &input_port);
// Get the input shape from input port
ov_shape_t input_shape;
ov_port_get_shape(input_port, &input_shape);
// Get the type of input
ov_element_type_e input_type;
ov_port_get_element_type(input_port, &input_type);
// Create tensor from external memory
ov_tensor_t* tensor = NULL;
ov_tensor_create_from_host_ptr(input_type, input_shape, memory_ptr, &tensor);
// Set input tensor for model with one input
ov_infer_request_set_input_tensor(infer_request, tensor);
//! [part4]

//! [part5]
ov_infer_request_start_async(infer_request);
ov_infer_request_wait(infer_request);
//! [part5]

//! [part6]
ov_tensor_t* output_tensor = NULL;
// Get output tensor by tensor index
ov_infer_request_get_output_tensor_by_index(infer_request, 0, &output_tensor);
//! [part6]

//! [part8]
ov_shape_free(&input_shape);
ov_tensor_free(output_tensor);
ov_output_const_port_free(input_port);
ov_tensor_free(tensor);
ov_infer_request_free(infer_request);
ov_compiled_model_free(compiled_model);
ov_model_free(model);
ov_core_free(core);
//! [part8]
return 0;
}
/*
//! [part7]
project/
├── CMakeLists.txt - CMake file to build
├── ... - Additional folders like includes/
└── src/ - source folder
    └── main.c
build/ - build directory
...

//! [part7]
*/
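
The snippet drops return codes for brevity. Each C API call returns an `ov_status_e`, so a real application would check it; a minimal sketch for the core-creation step, assuming the `OK` status value (a hypothetical helper, not part of this PR):

```c
#include <stdio.h>
#include <openvino/c/openvino.h>

// Hypothetical helper: create the core and report a failure instead of silently
// continuing with a NULL handle.
static ov_core_t* create_core_checked(void) {
    ov_core_t* core = NULL;
    ov_status_e status = ov_core_create(&core);
    if (status != OK) {
        fprintf(stderr, "ov_core_create failed with status %d\n", (int)status);
        return NULL;
    }
    return core;
}
```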

docs/snippets/src/main.cpp
@@ -67,3 +67,15 @@ const float *output_buffer = output.data<const float>();
//! [part6]
return 0;
}
/*
//! [part7]
project/
├── CMakeLists.txt - CMake file to build
├── ... - Additional folders like includes/
└── src/ - source folder
    └── main.cpp
build/ - build directory
...

//! [part7]
*/