[AUTO] Fix mess table in doc (#11158)

Signed-off-by: Peter Chen <peter.chen@intel.com>
This commit is contained in:
Chen Peter
2022-03-23 18:35:28 +08:00
committed by GitHub
parent b50079143f
commit 066882579d


@@ -27,17 +27,17 @@ The best device is chosen using the following logic:
| Choice || Supported || Supported |
| Priority || Device || model precision |
+==========+======================================================+=====================================+
-| 1 || dGPU | FP32, FP16, INT8, BIN |
-| || (e.g. Intel® Iris® Xe MAX) | |
+| 1 || dGPU || FP32, FP16, INT8, BIN |
+| || (e.g. Intel® Iris® Xe MAX) || |
+----------+------------------------------------------------------+-------------------------------------+
-| 2 || iGPU | FP32, FP16, BIN |
-| || (e.g. Intel® UHD Graphics 620 (iGPU)) | |
+| 2 || iGPU || FP32, FP16, BIN |
+| || (e.g. Intel® UHD Graphics 620 (iGPU)) || |
+----------+------------------------------------------------------+-------------------------------------+
-| 3 || Intel® Movidius™ Myriad™ X VPU | FP16 |
-| || (e.g. Intel® Neural Compute Stick 2 (Intel® NCS2)) | |
+| 3 || Intel® Movidius™ Myriad™ X VPU || FP16 |
+| || (e.g. Intel® Neural Compute Stick 2 (Intel® NCS2)) || |
+----------+------------------------------------------------------+-------------------------------------+
-| 4 || Intel® CPU | FP32, FP16, INT8, BIN |
-| || (e.g. Intel® Core™ i7-1165G7) | |
+| 4 || Intel® CPU || FP32, FP16, INT8, BIN |
+| || (e.g. Intel® Core™ i7-1165G7) || |
+----------+------------------------------------------------------+-------------------------------------+
Importantly, **AUTO starts inference with the CPU by default, unless a priority list is set and the CPU is not in it**. The CPU provides very low latency and can start inference with no additional delays. While it performs inference, the Auto-Device plugin continues loading the model to the device best suited for the purpose, and transfers the task to it when ready. This way, devices that are much slower to compile the model, the GPU being the best example, do not impede inference at its initial stages.
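The selection logic the table describes can be sketched in plain Python. This is an illustrative model only, not the plugin's actual implementation: the `PRIORITY` list and the `select_device` helper are hypothetical names, and the device labels are shorthand for the table's rows.

```python
# Hypothetical sketch of the device-selection order implied by the table
# above: lower priority number wins. Not the AUTO plugin's real code.
PRIORITY = ["dGPU", "iGPU", "MYRIAD", "CPU"]  # 1 = highest priority

def select_device(available):
    """Return the highest-priority device present in `available`."""
    for device in PRIORITY:
        if device in available:
            return device
    raise RuntimeError("no supported device found")

# On a machine with only an integrated GPU and a CPU, the iGPU is chosen,
# while the CPU still serves the first inference requests until the iGPU
# finishes compiling the model.
print(select_device(["CPU", "iGPU"]))  # iGPU
```

In the actual plugin, the same idea plays out asynchronously: the CPU path starts serving requests immediately, and execution migrates to the selected device once its compiled model is ready.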