This project provides scripts for deploying LLM inference services, which power the platform's online LLM experience. Following the scripts and instructions in this repository, users can deploy the corresponding LLM API service via Cloud Brain - Online Inference. Note that a self-deployed service does not include a chat front-end interface.

The scripts start a FastAPI service inside an OpenI Cloud Brain task; the chat front end is provided by the platform. On the OpenI AI Collaboration Platform, the online experience is located at: Models -> LLM Base -> Natural Language Processing Models -> model card -> Online Experience.
A service deployed inside a debug task can only be tested within that environment and cannot serve an API externally. For external access, see Section III, Online Inference, below.

Use the "Test LLM API service" Cloud Brain task template above this README to launch a debug task with one click.
After entering the debug environment, create a new notebook and run the following code to initialize the environment and unpack the code and model files:

```python
from c2net.context import prepare

c2net_context = prepare()
```
Refresh the file navigation pane on the right. If the `openi_llm_example` folder is visible, the environment was initialized successfully. Enter the `openi_llm_example` directory and edit the script (`llama_fastapi.py` or `yi_fastapi.py`), changing the service port (the `port` parameter in the last line of the script) to 8000, since port 8888 conflicts with Jupyter Notebook:
```python
if __name__ == "__main__":
    uvicorn.run(app, host="0.0.0.0", port=8000)
```
Open a new Terminal and run the script to deploy the LLM API service locally inside the debug environment:

```shell
cd /tmp/code/openi_llm_example
python llama_fastapi.py  # or: python yi_fastapi.py
```
Once the service has fully started, open another Terminal, enter the `openi_llm_example` directory, and run the `client.py` script. If you receive streaming output from the model, the deployment works:

```shell
cd /tmp/code/openi_llm_example
python client.py
```
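The gist of such a client can be sketched with only the standard library. This is not the actual `client.py`; the endpoint path, payload shape, and function names here are assumptions for illustration:

```python
# Hypothetical streaming client (the real client.py may differ).
import json
import urllib.request

def build_request(prompt: str,
                  url: str = "http://127.0.0.1:8000/chat") -> urllib.request.Request:
    # POST the prompt as a JSON body, matching a {"prompt": ...} schema.
    payload = json.dumps({"prompt": prompt}).encode("utf-8")
    return urllib.request.Request(url, data=payload,
                                  headers={"Content-Type": "application/json"})

def stream_response(req: urllib.request.Request, chunk_size: int = 64):
    # Read the body in small chunks so tokens are printed as they arrive
    # rather than after the full response completes.
    with urllib.request.urlopen(req) as resp:
        while chunk := resp.read(chunk_size):
            yield chunk.decode("utf-8", errors="replace")

# Example (requires the service from the previous step to be running):
# for piece in stream_response(build_request("What is my name?")):
#     print(piece, end="", flush=True)
```

Reading in fixed-size chunks instead of calling `resp.read()` once is what makes the output appear incrementally on the client side.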
Example output:

```
prompt: What is my name?
response: From the earlier conversation, your name is George.
```