Polish the demos

1. Rename demo/ to demos/
2. Add demos/README.md
Tate, Hongliang Tian 2019-10-19 02:04:13 +00:00
parent 6dc9906c8e
commit f9376ec4ba
27 changed files with 19 additions and 8 deletions

@@ -111,7 +111,7 @@ Steps 4-5 are to be done on the guest OS running inside the Docker container:
```
cd /opt/intel/sgxsdk/SampleCode/SampleEnclave && make && ./app
```
5. Check out Occlum's demos preinstalled at `/root/occlum/demos`, whose README can be found [here](demo/README.md). Or you can try to build and run your own SGX-protected applications using Occlum like these demos.
5. Check out Occlum's demos preinstalled at `/root/occlum/demos`, whose README can be found [here](demos/README.md). Or you can try to build and run your own SGX-protected applications using Occlum as shown in the demos.
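As a rough illustration, building and running your own program might look like the sketch below. It assumes the `occlum-gcc` compiler wrapper and the `occlum init`/`occlum build`/`occlum run` subcommands of the Occlum command-line tool; exact commands and directory layout may differ between Occlum versions, so treat the demos' own READMEs as authoritative.
```
# Sketch only: compile a program with Occlum's toolchain wrapper,
# package it into an Occlum instance, and run it inside an SGX enclave.
occlum-gcc -o hello_world hello_world.c      # build against Occlum's toolchain
mkdir occlum_workspace && cd occlum_workspace
occlum init                                  # create an empty Occlum instance here
cp ../hello_world image/bin/                 # place the executable into the enclave image
occlum build                                 # build the trusted image and the enclave
occlum run /bin/hello_world                  # execute the program inside the enclave
```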
## How to Build and Install

demos/README.md (new file, 9 lines added)

@@ -0,0 +1,9 @@
# Demos
This directory contains sample projects that demonstrate how Occlum can be used to build and run user applications.
* `hello_c/`: A sample C project built with Makefile/CMake.
* `hello_cc/`: A sample C++ project built with Makefile/CMake.
* `hello_bazel/`: A sample C++ project built with [Bazel](https://bazel.build).
* `https_server/`: An HTTPS file server based on the [Mongoose Embedded Web Server Library](https://github.com/cesanta/mongoose).
* `tensorflow_lite/`: A demo and benchmark of the [TensorFlow Lite](https://www.tensorflow.org/lite) inference engine.

@@ -1,4 +1,4 @@
# C++ Sample Project with Bazel
# A Sample C++ Project with Bazel
This project demonstrates how to use Bazel to build C++ projects for Occlum. To install Bazel on Ubuntu, follow the instructions [here](https://docs.bazel.build/versions/master/install-ubuntu.html).

@@ -1,4 +1,4 @@
# C Sample Project with Makefile and CMake
# A Sample C Project with Makefile and CMake
This project demonstrates how to use Makefile/CMake to build C projects for Occlum.
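For illustration, one plausible way to point a plain Makefile build at Occlum's C toolchain is to override the compiler variable, as sketched below. This assumes the `occlum-gcc` wrapper is installed; the demo's own Makefile/CMake files may already configure the toolchain for you.
```
# Hypothetical invocation: reuse an ordinary Makefile but compile with occlum-gcc
make CC=occlum-gcc
```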

@@ -1,4 +1,4 @@
# C++ Sample Project with Makefile and CMake
# A Sample C++ Project with Makefile and CMake
This project demonstrates how to use Makefile/CMake to build C++ projects for Occlum.
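Similarly, a CMake build can be steered toward Occlum's C++ toolchain by selecting the compilers at configure time. The sketch below assumes an `occlum-g++` wrapper alongside `occlum-gcc`; the demo's actual CMakeLists.txt may already encode an equivalent setup.
```
# Hypothetical out-of-source CMake build using Occlum's compiler wrappers
mkdir build && cd build
cmake .. -DCMAKE_C_COMPILER=occlum-gcc -DCMAKE_CXX_COMPILER=occlum-g++
make
```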

@@ -1,4 +1,6 @@
# Use Tensorflow Lite in SGX with Occlum
# Use Tensorflow Lite with Occlum
This project demonstrates how Occlum enables [TensorFlow Lite](https://www.tensorflow.org/lite) in SGX enclaves.
Step 1: Download TensorFlow, build TensorFlow Lite, and download models
```
@@ -16,12 +18,12 @@ Step 2.2: To run the TensorFlow Lite inference benchmark in Occlum, run
./run_tflite_in_occlum.sh benchmark
```
Step 3.1: To run the TensorFlow Lite inference demo in Linux, run
Step 3.1 (Optional): To run the TensorFlow Lite inference demo in Linux, run
```
./run_tflite_in_linux.sh demo
```
Step 3.2: To run the TensorFlow Lite inference benchmark in Linux, run
Step 3.2 (Optional): To run the TensorFlow Lite inference benchmark in Linux, run
```
./run_tflite_in_linux.sh benchmark
```

@@ -1,6 +1,6 @@
[package]
name = "Occlum"
version = "0.5.0"
version = "0.6.0"
[lib]
name = "occlum_rs"