Add PaddlePaddle demo

sunhe05 2023-03-07 02:47:45 +00:00 committed by volcano
parent 79bbd2ad3e
commit 2b98e1a076
8 changed files with 167 additions and 0 deletions

@@ -831,3 +831,21 @@ jobs:
    - name: Run fio test
      run: docker exec ${{ github.job }} bash -c "cd /root/occlum/demos/benchmarks/fio && SGX_MODE=SIM ./run_fio_on_occlum.sh fio-seq-read.fio"

  PaddlePaddle_test:
    runs-on: ubuntu-20.04
    steps:
    - uses: actions/checkout@v1
      with:
        submodules: true

    - uses: ./.github/workflows/composite_action/sim
      with:
        container-name: ${{ github.job }}
        build-envs: 'OCCLUM_RELEASE_BUILD=1'

    - name: Build python and paddlepaddle
      run: docker exec ${{ github.job }} bash -c "cd /root/occlum/demos/paddlepaddle; ./install_python_with_conda.sh"

    - name: Run paddlepaddle test
      run: docker exec ${{ github.job }} bash -c "cd /root/occlum/demos/paddlepaddle; SGX_MODE=SIM ./run_paddlepaddle_on_occlum.sh"

demos/README.md
@@ -24,6 +24,7 @@ This set of demos shows how real-world apps can be easily run inside SGX enclave
* [mysql](mysql/): A demo of [MySQL](https://www.mysql.com/).
* [openvino](openvino/) A benchmark of [OpenVINO Inference Engine](https://docs.openvinotoolkit.org/2019_R3/_docs_IE_DG_inference_engine_intro.html).
* [pytorch](pytorch/): Demos of standalone and distributed [PyTorch](https://pytorch.org/).
* [paddlepaddle](paddlepaddle/): A demo of [PaddlePaddle](https://www.paddlepaddle.org.cn/).
* [redis](redis/): A demo of [Redis](https://redis.io).
* [sofaboot](sofaboot/): A demo of [SOFABoot](https://github.com/sofastack/sofa-boot), an open source Java development framework based on Spring Boot.
* [sqlite](sqlite/) A demo of [SQLite](https://www.sqlite.org) SQL database engine.

demos/paddlepaddle/.gitignore (new file)

@@ -0,0 +1,3 @@
occlum_instance/
miniconda/
Miniconda3*

demos/paddlepaddle/README.md (new file)
@@ -0,0 +1,32 @@
# Use PaddlePaddle with Python and Occlum
This project demonstrates how Occlum enables _unmodified_
[PaddlePaddle](https://www.paddlepaddle.org.cn/) programs to run in SGX
enclaves, on top of an _unmodified_ [Python](https://www.python.org). The
workload is a basic AI task from the [PaddlePaddle Quick
Start](https://www.paddlepaddle.org.cn/documentation/docs/zh/guides/beginner/quick_start_cn.html):
training the LeNet model on the MNIST dataset. The source code of the workload resides in `demo.py`.
## How to Run
This tutorial is written under the assumption that you have Docker installed and use Occlum in a Docker container.
Occlum is compatible with glibc-based Python, so we use Miniconda to install Python and to pull in the paddle packages. Here, Miniconda is installed automatically by the `install_python_with_conda.sh` script, which also installs the Python and PaddlePaddle packages required by this project, as sketched below.
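In essence, the script downloads Miniconda and then creates a dedicated conda environment with the required packages. A minimal sketch of the key step (paths shown relative to the demo directory; see `install_python_with_conda.sh` for the full version, which also applies two small patches to the paddle sources):
```
# Create a self-contained Python 3.8 environment with PaddlePaddle for Occlum to package
./miniconda/bin/conda create --prefix ./python-occlum -y \
    python=3.8.10 numpy matplotlib paddlepaddle==2.4.2 -c paddle
```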
Step 1 (on the host): Start an Occlum container
```
docker pull occlum/occlum:latest-ubuntu20.04
docker run -it --name=pythonDemo --device /dev/sgx/enclave occlum/occlum:latest-ubuntu20.04 bash
```
Step 2 (in the Occlum container): Download Miniconda and install Python into a prefix directory.
```
cd /root/demos/paddlepaddle
bash ./install_python_with_conda.sh
```
Step 3 (in the Occlum container): Run the sample code on Occlum
```
cd /root/demos/paddlepaddle
bash ./run_paddlepaddle_on_occlum.sh
```

demos/paddlepaddle/demo.py (new file)
@@ -0,0 +1,31 @@
import paddle
import numpy as np
from paddle.vision.transforms import Normalize
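# Map MNIST pixel values from [0, 255] to [-1, 1] (CHW layout)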
transform = Normalize(mean=[127.5], std=[127.5], data_format='CHW')
train_dataset = paddle.vision.datasets.MNIST(image_path='mnist/train-images-idx3-ubyte.gz',
                                              label_path='mnist/train-labels-idx1-ubyte.gz',
                                              mode='train', transform=transform)
test_dataset = paddle.vision.datasets.MNIST(image_path='mnist/t10k-images-idx3-ubyte.gz',
                                             label_path='mnist/t10k-labels-idx1-ubyte.gz',
                                             mode='test', transform=transform)
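# LeNet with 10 output classes, wrapped in paddle's high-level Model API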
lenet = paddle.vision.models.LeNet(num_classes=10)
model = paddle.Model(lenet)
model.prepare(paddle.optimizer.Adam(parameters=model.parameters()),
              paddle.nn.CrossEntropyLoss(),
              paddle.metric.Accuracy())
model.fit(train_dataset, epochs=5, batch_size=64, verbose=1)
model.evaluate(test_dataset, batch_size=64, verbose=1)
model.save('./output/mnist')
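# Reload the saved model and run a single-image prediction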
model.load('output/mnist')
img, label = test_dataset[0]
img_batch = np.expand_dims(img.astype('float32'), axis=0)
out = model.predict_batch(img_batch)[0]
pred_label = out.argmax()
print('true label: {}, pred label: {}'.format(label[0], pred_label))

demos/paddlepaddle/install_python_with_conda.sh (new file)
@@ -0,0 +1,36 @@
#!/bin/bash
set -e
script_dir="$( cd "$( dirname "${BASH_SOURCE[0]}" )" >/dev/null 2>&1 && pwd )"
# 1. Init occlum workspace
[ -d occlum_instance ] || occlum new occlum_instance
# 2. Install python and dependencies to specified position
[ -f Miniconda3-latest-Linux-x86_64.sh ] || wget https://repo.anaconda.com/miniconda/Miniconda3-latest-Linux-x86_64.sh
[ -d miniconda ] || bash ./Miniconda3-latest-Linux-x86_64.sh -b -p $script_dir/miniconda
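# Create a self-contained Python 3.8 environment with paddlepaddle 2.4.2 from the paddle conda channel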
$script_dir/miniconda/bin/conda create --prefix $script_dir/python-occlum -y matplotlib numpy python=3.8.10 paddlepaddle==2.4.2 -c paddle
CORE_PY=$script_dir/python-occlum/lib/python3.8/site-packages/paddle/fluid/core.py
IMAGE_PY=$script_dir/python-occlum/lib/python3.8/site-packages/paddle/dataset/image.py
# Adjust the source code to run in Occlum
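# core.py: add an `elif sysstr == 'occlum'` branch (returning True) to paddle's platform check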
sed -i "186 i \ elif sysstr == 'occlum':\n return True" $CORE_PY
sed -ie "37,64d" $IMAGE_PY
sed -i "37 i \try:\n import cv2\nexcept ImportError:\n cv2 = None" $IMAGE_PY
# Download the dataset
DATASET=$script_dir/mnist
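# Nothing left to do if the dataset directory already exists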
[ -d $DATASET ] && exit 0
TRAIN_IMAGE=train-images-idx3-ubyte.gz
TRAIN_LABEL=train-labels-idx1-ubyte.gz
TEST_IMAGE=t10k-images-idx3-ubyte.gz
TEST_LABEL=t10k-labels-idx1-ubyte.gz
URL=http://yann.lecun.com/exdb/mnist
mkdir $DATASET
wget $URL/$TRAIN_IMAGE -P $DATASET
wget $URL/$TRAIN_LABEL -P $DATASET
wget $URL/$TEST_IMAGE -P $DATASET
wget $URL/$TEST_LABEL -P $DATASET

demos/paddlepaddle/paddlepaddle.yaml (new file)
@@ -0,0 +1,19 @@
includes:
  - base.yaml
targets:
  - target: /bin
    createlinks:
      - src: /opt/python-occlum/bin/python3
        linkname: python3
  # python packages
  - target: /opt
    copy:
      - dirs:
          - ../python-occlum
  # python code and dataset
  - target: /
    copy:
      - files:
          - ../demo.py
      - dirs:
          - ../mnist

demos/paddlepaddle/run_paddlepaddle_on_occlum.sh (new file)
@@ -0,0 +1,27 @@
#!/bin/bash
set -e
BLUE='\033[1;34m'
NC='\033[0m'
script_dir="$( cd "$( dirname "${BASH_SOURCE[0]}" )" >/dev/null 2>&1 && pwd )"
python_dir="$script_dir/occlum_instance/image/opt/python-occlum"
cd occlum_instance && rm -rf image
copy_bom -f ../paddlepaddle.yaml --root image --include-dir /opt/occlum/etc/template
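# Copy the bundled Python, demo.py and the MNIST dataset into the Occlum image as specified in paddlepaddle.yaml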
if [ ! -d $python_dir ];then
echo "Error: cannot stat '$python_dir' directory"
exit 1
fi
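# Enlarge the enclave (6000MB user space, 256MB kernel heap, 64 threads) and point PYTHONHOME at the bundled Python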
new_json="$(jq '.resource_limits.user_space_size = "6000MB" |
.resource_limits.kernel_space_heap_size = "256MB" |
.resource_limits.max_num_of_threads = 64 |
.env.default += ["PYTHONHOME=/opt/python-occlum"]' Occlum.json)" && \
echo "${new_json}" > Occlum.json
occlum build
# Run the python demo
echo -e "${BLUE}occlum run /bin/python3 demo.py${NC}"
occlum run /bin/python3 demo.py