diff --git a/README.md b/README.md
index 844e44a0ace..adc6f9f2b96 100644
--- a/README.md
+++ b/README.md
@@ -73,9 +73,9 @@ The OpenVINO™ Runtime can infer models on different hardware devices. This sec
Intel Xeon with Intel® Advanced Vector Extensions 2 (Intel® AVX2), Intel® Advanced Vector Extensions 512 (Intel® AVX-512), and AVX512_BF16, Intel Core Processors with Intel AVX2, Intel Atom Processors with Intel® Streaming SIMD Extensions (Intel® SSE) |
- ARM CPU | openvino_arm_cpu_plugin | Raspberry Pi™ 4 Model B, Apple® Mac mini with M1 chip, NVIDIA® Jetson Nano™, Android™ devices
+ | ARM CPU | openvino_arm_cpu_plugin | Raspberry Pi™ 4 Model B, Apple® Mac mini with Apple silicon
|
GPU |
diff --git a/docs/articles_en/about_openvino/compatibility_and_support/Supported_Devices.md b/docs/articles_en/about_openvino/compatibility_and_support/Supported_Devices.md
index 8f4f1833914..9d85463af54 100644
--- a/docs/articles_en/about_openvino/compatibility_and_support/Supported_Devices.md
+++ b/docs/articles_en/about_openvino/compatibility_and_support/Supported_Devices.md
@@ -30,7 +30,7 @@ Currently, processors of the 11th generation and later (up to the 13th generatio
|| | Intel® Core™ Processors with Intel® AVX2, |
|| | Intel® Atom® Processors with Intel® Streaming SIMD Extensions (Intel® SSE) |
|| | |
-|| (Arm®) | Raspberry Pi™ 4 Model B, Apple® Mac mini with M1 chip, NVIDIA® Jetson Nano™, Android™ devices |
+|| (Arm®) | Raspberry Pi™ 4 Model B, Apple® Mac mini with Apple silicon |
|| | |
+---------------------------------------------------------------------+------------------------------------------------------------------------------------------------------+
|| :doc:`GPU ` | Intel® Processor Graphics including Intel® HD Graphics and Intel® Iris® Graphics, |
diff --git a/docs/articles_en/openvino_workflow/openvino_intro/Device_Plugins.md b/docs/articles_en/openvino_workflow/openvino_intro/Device_Plugins.md
index 53778ab9b7f..3849109927c 100644
--- a/docs/articles_en/openvino_workflow/openvino_intro/Device_Plugins.md
+++ b/docs/articles_en/openvino_workflow/openvino_intro/Device_Plugins.md
@@ -24,6 +24,7 @@ OpenVINO™ Runtime can infer deep learning models using the following device ty
* :doc:`CPU `
* :doc:`GPU `
* :doc:`GNA `
+* :doc:`Arm® CPU `
For a more detailed list of hardware, see :doc:`Supported Devices `.
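
As an illustrative aside (not part of the patch), the sketch below shows how one of the device types listed in the hunk above is typically selected through the OpenVINO Python API. It is a minimal example under assumptions: `model.xml` is a placeholder IR path, and the `openvino.runtime` package layout is assumed from recent releases.

```python
# Illustrative sketch (not part of this diff): compiling a model for one of the
# listed device types. "model.xml" is a hypothetical IR file, not from this PR.
from openvino.runtime import Core

core = Core()
print(core.available_devices)                 # e.g. ['CPU', 'GPU'] on a given host

model = core.read_model("model.xml")          # placeholder model path
compiled = core.compile_model(model, "CPU")   # the CPU device covers x86-64 and Arm® hosts
```
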
diff --git a/docs/articles_en/openvino_workflow/openvino_intro/Device_Plugins/CPU.md b/docs/articles_en/openvino_workflow/openvino_intro/Device_Plugins/CPU.md
index 04ecd9b2222..320cec2583e 100644
--- a/docs/articles_en/openvino_workflow/openvino_intro/Device_Plugins/CPU.md
+++ b/docs/articles_en/openvino_workflow/openvino_intro/Device_Plugins/CPU.md
@@ -52,7 +52,7 @@ CPU plugin supports the following data types as inference precision of internal
- Floating-point data types:
- ``f32`` (Intel® x86-64, Arm®)
- - ``bf16``(Intel® x86-64)
+ - ``bf16`` (Intel® x86-64)
- Integer data types:
- ``i32`` (Intel® x86-64, Arm®)
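
As a hedged illustration of the precision list this hunk corrects (not part of the patch), the sketch below requests ``bf16`` inference precision from the CPU plugin via the public ``INFERENCE_PRECISION_HINT`` property. The behaviour on hosts without AVX512_BF16 support is an assumption and may differ between releases.

```python
# Illustrative sketch (not part of this diff): asking the CPU plugin to use bf16
# as its internal inference precision. Assumes a host with AVX512_BF16; on other
# hosts the plugin may ignore the hint or fall back to f32 (assumption).
from openvino.runtime import Core

core = Core()
core.set_property("CPU", {"INFERENCE_PRECISION_HINT": "bf16"})
print(core.get_property("CPU", "INFERENCE_PRECISION_HINT"))
```
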