[example] Update docker image versions existing in Docker Hub
parent 0f789b49bc
commit e54a3d1844

There are prebuilt docker images that can be used for the examples, either in the following docker way or in the [`kubernetes`](./kubernetes/) way. Users can pull them directly and try the example.
```
docker pull occlum/init_ra_server:0.29.2-ubuntu20.04
docker pull occlum/tf_demo:0.29.2-ubuntu20.04
docker pull occlum/tf_demo_client:0.29.2-ubuntu20.04
```
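Optionally, a quick way to confirm the images were pulled is to list them locally. This is just an illustrative check; the exact output depends on what is already cached on your host.

```
docker images | grep occlum
```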
If users want to build or customize the images, please check the part below.

Our target is to deploy the demo in separate container images, so docker build is a necessary step. Thanks to the `docker run in docker` method, this example can be built inside the Occlum development container image.

First, please make sure `docker` is installed successfully on your host. Then start the Occlum container (using version `0.29.2-ubuntu20.04` as an example) as below.
```
$ sudo docker run --rm -itd --network host \
    -v $(which docker):/usr/bin/docker -v /var/run/docker.sock:/var/run/docker.sock \
    occlum/occlum:0.29.2-ubuntu20.04
```
All the following steps run in the above container.
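Since the Occlum container above is started in detached mode (`-itd`), one way to get a shell inside it is `docker exec`. This is a minimal sketch; `<container_id>` is a placeholder for whatever ID `docker ps` reports on your host.

```
$ docker ps --filter ancestor=occlum/occlum:0.29.2-ubuntu20.04 --format "{{.ID}}"
$ docker exec -it <container_id> bash   # replace <container_id> with the ID printed above
```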
The build is driven by the `build.sh` script. For example, the command below generates three container images.
```
# ./build.sh -r demo -g 0.29.2
```
* **`demo/init_ra_server:0.29.2`** acts as the key broker pod.
* **`demo/tf_demo:0.29.2`** acts as the tensorflow serving pod.
* **`demo/tf_demo_client:0.29.2`** acts as the client.
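Once the build finishes, the three images should appear in the local image list. A simple check, run in the same environment where `build.sh` was executed, could look like this (illustrative only):

```
# docker images | grep demo
```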
## How to test
### Try the inference request
```
$ docker run --rm --network host demo/tf_demo_client:0.29.2 python3 inception_client.py --server=localhost:31001 --crt server.crt --image cat.jpg
```
If successful, it prints the classification results.
The command below runs a benchmark test against the tensorflow serving service running in Occlum.
```
$ docker run --rm --network host demo/tf_demo_client:0.29.2 python3 benchmark.py --server localhost:31001 --crt server.crt --cnum 4 --loop 10 --image cat.jpg
```
Try scaling up the number of tensorflow serving pods; a better `tps` can be achieved.
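For the [`kubernetes`](./kubernetes/) way of deployment, scaling could be done with `kubectl scale`. The deployment name `tf-demo` below is only an assumption for illustration; replace it with the actual deployment name defined in the kubernetes manifests.

```
# scale the tensorflow serving deployment to 4 replicas (deployment name is assumed)
$ kubectl scale deployment tf-demo --replicas=4
$ kubectl get pods
```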