#include <openvino/runtime/core.hpp>

int main() {
//! [part0]
    ov::Core core;
    // Query the CPU plugin for the optimization capabilities it supports;
    // the property is returned as a std::vector<std::string>.
    auto cpuOptimizationCapabilities = core.get_property("CPU", ov::device::capabilities);
//! [part0]
    return 0;
}