# Demos
This directory contains sample projects that demonstrate how Occlum can be used to build and run user applications.
* `hello_c/`: A sample C project built with Makefile/CMake.
* `hello_cc/`: A sample C++ project built with Makefile/CMake.
* `hello_bazel/`: A sample C++ project built with [Bazel](https://bazel.build).
* `https_server/`: An HTTPS file server based on the [Mongoose Embedded Web Server Library](https://github.com/cesanta/mongoose).
* `openvino/`: A benchmark of the [OpenVINO Inference Engine](https://docs.openvinotoolkit.org/2019_R3/_docs_IE_DG_inference_engine_intro.html).
* `tensorflow_lite/`: A demo and benchmark of the [TensorFlow Lite](https://www.tensorflow.org/lite) inference engine.
* `xgboost/`: A demo of [XGBoost](https://xgboost.readthedocs.io/en/latest/).