Analytics Zoo Cluster Serving Inference in SGX with Occlum

This example demonstrates how to use Analytics Zoo Cluster Serving for real-time inference in SGX. Analytics Zoo is an open-source Big Data AI platform, and Cluster Serving is its real-time serving solution that enables automatic model inference on a Flink cluster.

Note that in this example all components run on a single machine within one container. To run Cluster Serving with SGX on multiple nodes, please refer to the distributed-mode guide from Analytics Zoo.

Besides following the steps in this demo, users can also directly use the Docker image provided by Analytics Zoo for Cluster Serving with Occlum, which has all dependencies pre-installed. For a detailed guide to using that Docker image, please refer to the Analytics Zoo documentation.

Set up environment

Set the environment variables and install the dependencies (Redis, Flink, Analytics Zoo, and models)

source ./environment.sh
./install-dependencies.sh
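
After the two scripts finish, it is worth sanity-checking the setup before starting any services. The snippet below is only a sketch: the variable name FLINK_HOME and the jar location are assumptions for illustration, since the actual names and paths are defined by environment.sh and install-dependencies.sh.

# Sketch: variable names and paths are assumptions; adjust to what
# environment.sh actually exports and where install-dependencies.sh installs to.
echo "FLINK_HOME=${FLINK_HOME:-<not set>}"
command -v redis-server || echo "redis-server not found on PATH"
ls ./*.jar 2>/dev/null | head   # Analytics Zoo / Cluster Serving jars, if placed here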

Start Cluster Serving

Start Redis, Flink, and Cluster Serving all at once

./start-all.sh

Alternatively, you can start the components separately (a quick health check is sketched after the list):

  1. Start the Redis server

    ./start-redis.sh &

  2. Start Flink

    Start the Flink JobManager on the host

    ./start-flink-jobmanager.sh

    Initialize and start the Flink TaskManager with Occlum

    ./init-occlum-taskmanager.sh
    ./start-flink-taskmanager.sh
    
  3. Start the Cluster Serving job

    Start the HTTP frontend

    ./start-http-frontend.sh &

    Start the Cluster Serving job

    ./start-cluster-serving-job.sh
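
Whichever way you start the components, a quick health check helps confirm everything is up. The commands below are a minimal sketch: redis-cli ping verifies Redis, and Flink's REST API reports the registered TaskManagers (8081 is Flink's default REST port and may differ from the port this demo configures).

# Redis should answer with PONG
redis-cli ping
# Flink's REST overview lists registered TaskManagers and running jobs
# (8081 is Flink's default REST port; this demo may use a different one)
curl -s http://localhost:8081/overview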

Push inference image

Push an image into the queue via the RESTful API for inference. Users can modify the script to supply the base64 encoding of their own inference image (note that the image size must match the model input size, e.g. 224*224 for the ResNet-50 model used in this demo). Users can also use the Python API to push the image file directly; see the Analytics Zoo guide for details.

./push-image.sh
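
If you want to see what the script actually sends, the request is an HTTP POST to the Cluster Serving HTTP frontend with the base64-encoded image in the body. The sketch below is illustrative only: the port, endpoint path, and JSON layout are placeholders rather than the frontend's documented API; check push-image.sh for the exact request.

# Sketch only: URL, port, and payload layout are placeholders; see push-image.sh
# for the request the demo really sends.
IMG_BASE64=$(base64 -w 0 ./test_image.jpg)   # image must match the model input size, e.g. 224x224
curl -X POST "http://localhost:${FRONTEND_PORT:-10020}/predict" \
     -H "Content-Type: application/json" \
     -d "{\"image\": \"${IMG_BASE64}\"}"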

Stop Cluster Serving

Stop the Cluster Serving job and all components

./stop-all.sh
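
To confirm that everything has shut down, check that no Redis or Flink processes are left behind. This is a minimal sketch; the process-name patterns are assumptions about how the demo launches these components.

# Both commands should print nothing once all components are stopped
pgrep -af redis-server
pgrep -af org.apache.flink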