From 192a1a2f02c3dd728b1ae935c25be2c438548fba Mon Sep 17 00:00:00 2001
From: Qi Zheng
Date: Wed, 18 Oct 2023 11:28:39 +0800
Subject: [PATCH] [demos] Update readme of demo llm

---
 demos/bigdl-llm/README.md | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/demos/bigdl-llm/README.md b/demos/bigdl-llm/README.md
index 84cd3fba..cb677ebd 100644
--- a/demos/bigdl-llm/README.md
+++ b/demos/bigdl-llm/README.md
@@ -76,9 +76,9 @@ In order to load models using BigDL-LLM, the model name should include "bigdl".
 
 To serve using the Web UI, you need three main components: web servers that interface with users, model workers that host one or more models, and a controller to coordinate the web server and model workers.
 
-#### Launch the Controller
+#### Launch the Controller in non-TEE env
 ```bash
-./python-occlum/bin/python -m fastchat.serve.controller
+./python-occlum/bin/python -m fastchat.serve.controller --host 0.0.0.0
 ```
 
 This controller manages the distributed workers.
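
The controller launched in the patched section is only one of the three components the README describes. A minimal sketch of bringing up the other two (a model worker and the web server) in the same non-TEE environment might look like the following; the model path `./models/bigdl-7b` is a placeholder (the README only requires that the name include "bigdl"), and the controller address assumes FastChat's default port of 21001.

```bash
# Sketch: launch the remaining FastChat components alongside the controller.
# The model path below is a placeholder; per the README, its name must
# include "bigdl" so that BigDL-LLM is used to load the model.

# Model worker: hosts the model and registers itself with the controller.
./python-occlum/bin/python -m fastchat.serve.model_worker \
    --model-path ./models/bigdl-7b \
    --controller-address http://localhost:21001 \
    --host 0.0.0.0

# Gradio web server: the user-facing UI, which routes requests via the controller.
./python-occlum/bin/python -m fastchat.serve.gradio_web_server \
    --controller-url http://localhost:21001 \
    --host 0.0.0.0
```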