ChatPilot
0.1.2
中文 | English
ChatPilot: Chat Agent WebUI that implements agent-based chat, with support for Google search, file/URL chat (RAG), and a code interpreter. It reproduces the Kimi Chat experience (drag a file in, paste a URL in) and supports the OpenAI/Azure API.
Official Demo: https://chat.mulanai.com
export OPENAI_API_KEY=sk-xxx
export OPENAI_BASE_URL=https://xxx/v1
docker run -it \
    -e OPENAI_API_KEY=$OPENAI_API_KEY \
    -e OPENAI_BASE_URL=$OPENAI_BASE_URL \
    -e RAG_EMBEDDING_MODEL="text-embedding-ada-002" \
    -p 8080:8080 --name chatpilot-$(date +%Y%m%d%H%M%S) shibing624/chatpilot:0.0.1
You'll find ChatPilot running at http://0.0.0.0:8080. Enjoy!
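To confirm the container is serving requests, you can probe the mapped port from the host (a minimal check; it only verifies that the web server answers, nothing ChatPilot-specific is assumed):

# Expect an HTTP response header block if the container is up
curl -I http://localhost:8080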
git clone https://github.com/shibing624/ChatPilot.git
cd ChatPilot
pip install -r requirements.txt
# Copy the required .env file and fill in your LLM API key
cp .env.example .env
bash start.sh
Now your application is running at http://0.0.0.0:8080. Enjoy!
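The .env file copied above is where the LLM credentials live. A minimal sketch of its contents, assuming the variable names mirror the exports used elsewhere in this README (check .env.example for the authoritative list):

# .env (placeholder values)
OPENAI_API_KEY=sk-xxx
OPENAI_BASE_URL=https://api.openai.com/v1
MODEL_TYPE="openai"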
Two ways to build the frontend:
git clone https://github.com/shibing624/ChatPilot.git
cd ChatPilot/
# Building Frontend Using Node.js >= 20.10
cd web
npm install
npm run build
Output: the project's web directory now contains a build folder with the compiled frontend assets.
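To confirm the toolchain and the build output, a quick check from the repository root (a minimal sketch):

node --version   # should report >= 20.10
ls web/build     # compiled frontend assets produced by `npm run build`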
export OPENAI_API_KEY=xxx
export OPENAI_BASE_URL=https://api.openai.com/v1
export MODEL_TYPE="openai"
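Optionally, sanity-check the key and base URL before starting the app. This is a minimal sketch that calls the standard OpenAI-compatible /models listing endpoint:

# Should return a JSON list of models if the key and base URL are valid
curl -s "$OPENAI_BASE_URL/models" -H "Authorization: Bearer $OPENAI_API_KEY"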
export AZURE_OPENAI_API_KEY=
export AZURE_OPENAI_API_VERSION=
export AZURE_OPENAI_ENDPOINT=
export MODEL_TYPE="azure"
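For reference, filled-in values typically look like the following (placeholders only; the API version is just an example of Azure's dated version format):

export AZURE_OPENAI_API_KEY=your-azure-key
export AZURE_OPENAI_API_VERSION=2024-02-01
export AZURE_OPENAI_ENDPOINT=https://your-resource.openai.azure.com/
export MODEL_TYPE="azure"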
Start the Ollama service with ollama serve, then set OLLAMA_API_URL:
export OLLAMA_API_URL=http://localhost:11413
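A minimal sketch of the full flow (llama3 is only an example model; adjust the URL to the port your Ollama server actually listens on):

ollama serve &                                  # start the local Ollama server
ollama pull llama3                              # pull an example model to chat with
export OLLAMA_API_URL=http://localhost:11413    # point ChatPilot at the Ollama API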
Install the litellm package: pip install litellm -U
ChatPilot's default litellm config file is at ~/.cache/chatpilot/data/litellm/config.yaml; modify its contents as follows:
model_list:
#  - model_name: moonshot-v1-auto # show model name in the UI
#    litellm_params: # all params accepted by litellm.completion() - https://docs.litellm.ai/docs/completion/input
#      model: openai/moonshot-v1-auto # MODEL NAME sent to `litellm.completion()`
#      api_base: https://api.moonshot.cn/v1
#      api_key: sk-xx
#      rpm: 500 # [OPTIONAL] Rate limit for this deployment: in requests per minute (rpm)
  - model_name: deepseek-ai/DeepSeek-Coder # show model name in the UI
    litellm_params: # all params accepted by litellm.completion() - https://docs.litellm.ai/docs/completion/input
      model: openai/deepseek-coder # MODEL NAME sent to `litellm.completion()`
      api_base: https://api.deepseek.com/v1
      api_key: sk-xx
      rpm: 500
  - model_name: openai/o1-mini # show model name in the UI
    litellm_params: # all params accepted by litellm.completion() - https://docs.litellm.ai/docs/completion/input
      model: o1-mini # MODEL NAME sent to `litellm.completion()`
      api_base: https://api.61798.cn/v1
      api_key: sk-xxx
      rpm: 500
litellm_settings: # module level litellm settings - https://github.com/BerriAI/litellm/blob/main/litellm/__init__.py
  drop_params: True
  set_verbose: False
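After editing, it can help to confirm the file still parses as valid YAML before restarting ChatPilot (a minimal check; it assumes PyYAML is installed, which litellm itself depends on):

# Parse the config and report success
python -c "import yaml; yaml.safe_load(open('$HOME/.cache/chatpilot/data/litellm/config.yaml'))" && echo "config.yaml parses OK"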
If you use ChatPilot in your research, please cite it as follows:
APA:
Xu, M. ChatPilot: LLM agent toolkit (Version 0.0.2) [Computer software]. https://github.com/shibing624/ChatPilot
BibTeX:
@misc{ChatPilot,
author = {Ming Xu},
title = {ChatPilot: llm agent},
year = {2024},
publisher = {GitHub},
journal = {GitHub repository},
howpublished = {\url{https://github.com/shibing624/ChatPilot}},
}
Licensed under The Apache License 2.0, which allows free commercial use. Please include a link to ChatPilot and the license in your product description.
The project code is still quite rough. If you have improvements to the code, you are welcome to submit them back to this project. Before submitting, please note the following two points:
Add corresponding unit tests under tests.
Run python -m pytest -v to execute all unit tests and make sure every test passes, then submit the PR.