diff --git a/docs/Doxyfile.config b/docs/Doxyfile.config
index 7c3614c5f10..056f005fc73 100644
--- a/docs/Doxyfile.config
+++ b/docs/Doxyfile.config
@@ -843,16 +843,6 @@ INPUT = "@MARKDOWN_INPUT@" \
 "@OpenVINO_SOURCE_DIR@/src/common/transformations/include/" \
 "@OpenVINO_SOURCE_DIR@/src/common/util/include/" \
 "@OpenVINO_SOURCE_DIR@/src/core/include/" \
-"@OpenVINO_SOURCE_DIR@/src/core/include/ngraph/" \
-"@OpenVINO_SOURCE_DIR@/src/core/include/ngraph/descriptor" \
-"@OpenVINO_SOURCE_DIR@/src/core/include/ngraph/op/" \
-"@OpenVINO_SOURCE_DIR@/src/core/include/ngraph/op/util" \
-"@OpenVINO_SOURCE_DIR@/src/core/include/ngraph/opsets/" \
-"@OpenVINO_SOURCE_DIR@/src/core/include/ngraph/pass/" \
-"@OpenVINO_SOURCE_DIR@/src/core/include/ngraph/pattern/" \
-"@OpenVINO_SOURCE_DIR@/src/core/include/ngraph/pattern/op/" \
-"@OpenVINO_SOURCE_DIR@/src/core/include/ngraph/runtime/" \
-"@OpenVINO_SOURCE_DIR@/src/core/include/ngraph/type/" \
 "@OpenVINO_SOURCE_DIR@/src/core/include/openvino/" \
 "@OpenVINO_SOURCE_DIR@/src/core/include/openvino/core/" \
 "@OpenVINO_SOURCE_DIR@/src/core/include/openvino/core/descriptor/" \
diff --git a/docs/documentation.md b/docs/documentation.md
index 8a3234ce8dc..ba726707a03 100644
--- a/docs/documentation.md
+++ b/docs/documentation.md
@@ -91,7 +91,7 @@ This section provides reference documents that guide you through developing your
 With the [Model Downloader](@ref omz_tools_downloader) and [Model Optimizer](MO_DG/Deep_Learning_Model_Optimizer_DevGuide.md) guides, you will learn to download pre-trained models and convert them for use with the OpenVINO™ toolkit. You can provide your own model or choose a public or Intel model from a broad selection provided in the [Open Model Zoo](model_zoo.md).
 
 ## Deploying Inference
-The [OpenVINO™ Runtime User Guide](OV_Runtime_UG/openvino_intro.md) explains the process of creating your own application that runs inference with the OpenVINO™ toolkit. The [API Reference](./api_references.html) defines the Inference Engine API for Python, C++, and C and the nGraph API for Python and C++. The Inference Engine API is what you'll use to create an OpenVINO™ application, while the nGraph API is available for using enhanced operations sets and other features. After writing your application, you can use the [Deployment Manager](install_guides/deployment-manager-tool.md) for deploying to target devices.
+The [OpenVINO™ Runtime User Guide](OV_Runtime_UG/openvino_intro.md) explains the process of creating your own application that runs inference with the OpenVINO™ toolkit. The [API Reference](./api_references.html) defines the OpenVINO Runtime API for Python, C++, and C. The OpenVINO Runtime API is what you'll use to create an OpenVINO™ inference application and to work with enhanced operation sets and other features. After writing your application, you can use the [Deployment Manager](install_guides/deployment-manager-tool.md) for deploying to target devices.
 
 ## Tuning for Performance
 The toolkit provides a [Performance Optimization Guide](optimization_guide/dldt_optimization_guide.md) and utilities for squeezing the best performance out of your application, including [Accuracy Checker](@ref omz_tools_accuracy_checker), [Post-Training Optimization Tool](@ref pot_README), and other tools for measuring accuracy, benchmarking performance, and tuning your application.
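(Context for the documentation.md change above, not part of the diff: the "OpenVINO Runtime API" it now points to is the `ov::Core`-based API. Below is a minimal sketch of the kind of inference application the guide describes; the model path, device name, and input handling are placeholder assumptions.)

```cpp
// Hypothetical minimal inference application using the OpenVINO Runtime C++ API.
// "model.xml" and the "CPU" device are illustrative placeholders.
#include <openvino/openvino.hpp>

int main() {
    ov::Core core;                                                  // entry point of the Runtime API
    auto model = core.read_model("model.xml");                      // IR produced by the Model Optimizer
    ov::CompiledModel compiled = core.compile_model(model, "CPU");  // compile for a target device
    ov::InferRequest request = compiled.create_infer_request();

    ov::Tensor input = request.get_input_tensor();
    // ... fill input.data<float>() with application data ...
    request.infer();                                                // synchronous inference
    ov::Tensor output = request.get_output_tensor();                // read results from output.data<float>()
    return 0;
}
```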
diff --git a/docs/ops/opset1.md b/docs/ops/opset1.md
index 3eee791065d..fc4db9ce049 100644
--- a/docs/ops/opset1.md
+++ b/docs/ops/opset1.md
@@ -3,7 +3,7 @@
 This specification document describes `opset1` operation set supported in OpenVINO.
 Support for each particular operation from the list below depends on the capabilities available in a inference plugin
 and may vary among different hardware platforms and devices. Examples of operation instances are expressed as IR V10 xml
-snippets. Such IR is generated by the Model Optimizer. The semantics match corresponding nGraph operation classes
+snippets. Such IR is generated by the Model Optimizer. The semantics match corresponding OpenVINO operation classes
 declared in `namespace opset1`.
 
 
diff --git a/docs/ops/opset2.md b/docs/ops/opset2.md
index 04f9dfe048f..3ff00c6b762 100644
--- a/docs/ops/opset2.md
+++ b/docs/ops/opset2.md
@@ -3,7 +3,7 @@
 This specification document describes `opset2` operation set supported in OpenVINO.
 Support for each particular operation from the list below depends on the capabilities available in a inference plugin
 and may vary among different hardware platforms and devices. Examples of operation instances are expressed as IR V10 xml
-snippets. Such IR is generated by the Model Optimizer. The semantics match corresponding nGraph operation classes
+snippets. Such IR is generated by the Model Optimizer. The semantics match corresponding OpenVINO operation classes
 declared in `namespace opset2`.
 
 
diff --git a/docs/ops/opset3.md b/docs/ops/opset3.md
index 525e25d3449..dfdf64710be 100644
--- a/docs/ops/opset3.md
+++ b/docs/ops/opset3.md
@@ -3,7 +3,7 @@
 This specification document describes `opset3` operation set supported in OpenVINO.
 Support for each particular operation from the list below depends on the capabilities available in a inference plugin
 and may vary among different hardware platforms and devices. Examples of operation instances are expressed as IR V10 xml
-snippets. Such IR is generated by the Model Optimizer. The semantics match corresponding nGraph operation classes
+snippets. Such IR is generated by the Model Optimizer. The semantics match corresponding OpenVINO operation classes
 declared in `namespace opset3`.
 
 
diff --git a/docs/ops/opset4.md b/docs/ops/opset4.md
index b20fe4ac619..96e864bb5ae 100644
--- a/docs/ops/opset4.md
+++ b/docs/ops/opset4.md
@@ -3,7 +3,7 @@
 This specification document describes `opset4` operation set supported in OpenVINO.
 Support for each particular operation from the list below depends on the capabilities available in a inference plugin
 and may vary among different hardware platforms and devices. Examples of operation instances are expressed as IR V10 xml
-snippets. Such IR is generated by the Model Optimizer. The semantics match corresponding nGraph operation classes
+snippets. Such IR is generated by the Model Optimizer. The semantics match corresponding OpenVINO operation classes
 declared in `namespace opset4`.
 
 
diff --git a/docs/ops/opset5.md b/docs/ops/opset5.md
index f980a96a043..d0c6653a0c6 100644
--- a/docs/ops/opset5.md
+++ b/docs/ops/opset5.md
@@ -3,7 +3,7 @@
 This specification document describes `opset5` operation set supported in OpenVINO.
 Support for each particular operation from the list below depends on the capabilities available in a inference plugin
 and may vary among different hardware platforms and devices. Examples of operation instances are expressed as IR V10 xml
-snippets. Such IR is generated by the Model Optimizer. The semantics match corresponding nGraph operation classes
+snippets. Such IR is generated by the Model Optimizer. The semantics match corresponding OpenVINO operation classes
 declared in `namespace opset5`.
 
 
diff --git a/docs/ops/opset6.md b/docs/ops/opset6.md
index 3154484d56e..a2f35e51834 100644
--- a/docs/ops/opset6.md
+++ b/docs/ops/opset6.md
@@ -3,7 +3,7 @@
 This specification document describes `opset6` operation set supported in OpenVINO.
 Support for each particular operation from the list below depends on the capabilities available in a inference plugin
 and may vary among different hardware platforms and devices. Examples of operation instances are expressed as IR V10 xml
-snippets. Such IR is generated by the Model Optimizer. The semantics match corresponding nGraph operation classes
+snippets. Such IR is generated by the Model Optimizer. The semantics match corresponding OpenVINO operation classes
 declared in `namespace opset6`.
 
 
diff --git a/docs/ops/opset7.md b/docs/ops/opset7.md
index 8a786fc2b39..95a0734fa89 100644
--- a/docs/ops/opset7.md
+++ b/docs/ops/opset7.md
@@ -3,7 +3,7 @@
 This specification document describes the `opset7` operation set supported in OpenVINO™.
 Support for each particular operation from the list below depends on the capabilities available in an inference plugin
 and may vary among different hardware platforms and devices. Examples of operation instances are provided as IR V10 xml
-snippets. Such IR is generated by the Model Optimizer. The semantics match corresponding nGraph operation classes
+snippets. Such IR is generated by the Model Optimizer. The semantics match corresponding OpenVINO operation classes
 declared in `namespace opset7`.
 
 
diff --git a/docs/ops/opset8.md b/docs/ops/opset8.md
index a6274cdf968..70a9e98fecb 100644
--- a/docs/ops/opset8.md
+++ b/docs/ops/opset8.md
@@ -3,7 +3,7 @@
 This specification document describes the `opset8` operation set supported in OpenVINO™.
 Support for each particular operation from the list below depends on the capabilities of an inference plugin
 and may vary among different hardware platforms and devices. Examples of operation instances are provided as IR V10 xml
-snippets. Such IR is generated by the Model Optimizer. The semantics match corresponding nGraph operation classes
+snippets. Such IR is generated by the Model Optimizer. The semantics match corresponding OpenVINO operation classes
 declared in `namespace opset8`.
 
 
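(Context for the opset specification changes above, not part of the diff: "OpenVINO operation classes declared in `namespace opsetN`" refers to the C++ classes shipped under openvino/opsets/. A rough sketch of that correspondence, using an arbitrary Parameter -> ReLU -> Result graph and assuming opset8:)

```cpp
// Hypothetical illustration: each <layer type="..."> in an IR snippet from the spec
// corresponds to a class in ov::opset8 (e.g. Parameter, Relu, Result).
#include <memory>
#include <openvino/openvino.hpp>
#include <openvino/opsets/opset8.hpp>

int main() {
    // <layer type="Parameter"> in IR  <->  ov::opset8::Parameter in C++
    auto param = std::make_shared<ov::opset8::Parameter>(ov::element::f32, ov::Shape{1, 3, 224, 224});
    // <layer type="ReLU"> in IR       <->  ov::opset8::Relu in C++
    auto relu = std::make_shared<ov::opset8::Relu>(param);
    auto result = std::make_shared<ov::opset8::Result>(relu);

    // The assembled ov::Model carries the same semantics the specification describes.
    auto model = std::make_shared<ov::Model>(ov::ResultVector{result}, ov::ParameterVector{param});
    return 0;
}
```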