# Use OpenVINO Inference Engine in SGX with Occlum

This project demonstrates how Occlum enables the OpenVINO Inference Engine to run inside SGX enclaves.

Step 1: Install CMake (3.15.5), since OpenVINO requires a newer version of CMake

```
./install_cmake.sh
```
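If you prefer to install CMake by hand, one possible approach is Kitware's prebuilt installer. This is only an illustrative sketch; `install_cmake.sh` may use a different method or install location:

```bash
# Illustrative only: install prebuilt CMake 3.15.5 from Kitware's releases
# (install_cmake.sh may differ in method and prefix)
wget https://github.com/Kitware/CMake/releases/download/v3.15.5/cmake-3.15.5-Linux-x86_64.sh
sudo sh cmake-3.15.5-Linux-x86_64.sh --prefix=/usr/local --skip-license
cmake --version    # expect 3.15.5
```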

Step 2: Download OpenVINO and build the Inference Engine. This step also downloads and installs OpenCV.

```
./download_and_build_openvino.sh
```

When completed, the resulting OpenVINO build can be found in the `openvino_src` directory. Threading Building Blocks (TBB) is used by default; to use OpenMP instead, add the option `--threading OMP` when invoking the script above, as shown below.
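For example (assuming the script accepts the `--threading` option exactly as described above):

```bash
# Build the Inference Engine with OpenMP threading instead of the default TBB
./download_and_build_openvino.sh --threading OMP
```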

Step 3: Download example OpenVINO models from 01.org

```
./download_openvino_model.sh
```
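An OpenVINO model in IR format consists of an `.xml` topology file and a matching `.bin` weights file. Assuming the script places them under `./model` (the path used by the benchmark command in Step 5), you can verify the download with:

```bash
# Check that the IR model files were downloaded (the ./model path is assumed
# from the benchmark command in Step 5)
ls ./model
# expected: age-gender-recognition-retail-0013.xml and the matching .bin file
```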

Step 4: Run the OpenVINO Inference Engine benchmark app inside an SGX enclave with Occlum

```
./run_benchmark_on_occlum.sh
```
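Under the hood, the script follows the usual Occlum workflow: create an Occlum instance, copy the application into the enclave image, build the image, and run the app inside SGX. The sketch below is illustrative only; the exact paths and file lists used by `run_benchmark_on_occlum.sh` may differ:

```bash
# Illustrative sketch of a typical Occlum workflow (not the script's exact contents)
occlum new occlum_instance && cd occlum_instance

# Copy the benchmark app, its libraries, and the model into the enclave image;
# this demo provides a BOM file (openvino.yaml) for the copy_bom tool.
copy_bom -f ../openvino.yaml --root image --include-dir /opt/occlum/etc/template

# Build the enclave image and run the benchmark inside SGX
# (the in-image paths below are assumptions, not the demo's exact layout)
occlum build
occlum run /bin/benchmark_app -m /model/age-gender-recognition-retail-0013.xml
```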

Step 5 (Optional): Run the OpenVINO Inference Engine benchmark app directly on Linux, outside the enclave

```
./openvino_src/inference-engine/bin/intel64/Release/benchmark_app -m ./model/age-gender-recognition-retail-0013.xml
```
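For a quicker run, benchmark_app accepts additional options such as the target device and iteration count. This assumes the standard benchmark_app command-line interface shipped with OpenVINO's samples:

```bash
# Shorter benchmark run on CPU; -d and -niter are standard benchmark_app
# options and are assumed to be available in this build
./openvino_src/inference-engine/bin/intel64/Release/benchmark_app \
    -m ./model/age-gender-recognition-retail-0013.xml -d CPU -niter 100
```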