
# Use TensorFlow Lite in SGX with Occlum

This demo shows how to run unmodified TensorFlow Lite on Occlum inside an Intel SGX enclave.

Step 1: Download TensorFlow, build TensorFlow Lite, and download models

    ./download_and_build_tflite.sh

When the script completes, the TensorFlow source tree can be found in the `tensorflow_src` directory, and the downloaded TensorFlow Lite models can be found in the `models` directory.
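
For orientation, a build script like this one typically clones the TensorFlow sources, runs the TensorFlow Lite Makefile helpers, and fetches one or more pretrained models. The outline below is only a hedged sketch of that flow; it is not an excerpt from `download_and_build_tflite.sh`, and the actual script may pin a different TensorFlow version, apply the patch shipped in this directory, or fetch different models.

```bash
# Hedged outline of the usual TFLite Makefile build flow (illustrative only).
git clone https://github.com/tensorflow/tensorflow.git tensorflow_src
cd tensorflow_src

# Fetch third-party dependencies and build libtensorflow-lite.a plus the
# example binaries (e.g. benchmark_model) with the TFLite Makefile helpers.
./tensorflow/lite/tools/make/download_dependencies.sh
./tensorflow/lite/tools/make/build_lib.sh
cd ..

# The demo also needs at least one .tflite model under ./models; which model,
# and where it is downloaded from, is decided by the actual script.
mkdir -p models
```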

Step 2.1: To run the TensorFlow Lite inference demo in Occlum, run

    ./run_tflite_in_occlum.sh demo

Step 2.2: To run the TensorFlow Lite inference benchmark in Occlum, run

    ./run_tflite_in_occlum.sh benchmark
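
Under the hood, running an unmodified binary on Occlum generally follows the init/build/run workflow of the `occlum` command-line tool. The sketch below illustrates that workflow with made-up paths and model names; it is not the contents of `run_tflite_in_occlum.sh`.

```bash
# Rough sketch of the standard Occlum workflow that a script like
# run_tflite_in_occlum.sh is expected to wrap. The binary location and model
# file name below are illustrative assumptions, not taken from the script.
mkdir occlum_instance && cd occlum_instance
occlum init                          # create a new Occlum instance in the current directory

# Copy the TFLite binaries and models into the Occlum image directory,
# which becomes the enclave's trusted file system.
cp ../tensorflow_src/tensorflow/lite/tools/make/gen/linux_x86_64/bin/* image/bin/
cp -r ../models image/models

occlum build                         # package image/ into a secure FS image and build the enclave

# Run the TFLite benchmark inside the SGX enclave
occlum run /bin/benchmark_model --graph=/models/mobilenet_v1_1.0_224.tflite
```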

Step 3.1: To run the TensorFlow Lite inference demo in Linux, run

    ./run_tflite_in_linux.sh demo

Step 3.2: To run the TensorFlow Lite inference benchmark in Linux, run

    ./run_tflite_in_linux.sh benchmark
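
For comparison, the native-Linux runs amount to invoking the same TensorFlow Lite binaries directly on the host, outside any enclave, so their output and timings can be compared with the Occlum runs above. The commands below are an illustrative guess at what such a run looks like; the binary directory, model file, label file, and input image are assumptions rather than an excerpt from `run_tflite_in_linux.sh`.

```bash
# Plausible native-Linux equivalent of the demo and benchmark runs
# (all paths and file names are illustrative assumptions).
BIN=tensorflow_src/tensorflow/lite/tools/make/gen/linux_x86_64/bin

# Image-classification demo (TFLite label_image example)
$BIN/label_image \
    --tflite_model models/mobilenet_v1_1.0_224.tflite \
    --labels models/labels.txt \
    --image models/grace_hopper.bmp

# Benchmark: run the model repeatedly and report average latency
$BIN/benchmark_model \
    --graph=models/mobilenet_v1_1.0_224.tflite \
    --num_runs=50
```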