llm_model_dict handles some of the loader's preset behaviors, such as the load location, model name, model-processor instance, and the definitions of checkpoint names and remote paths.
loader.py: model reloading; defines generatorAnswer and adds AnswerResultStream; defines a generate_with_callback collector that syncs queue data into AnswerResult on each response. requirements.txt: updated project dependencies.
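The collector described above pairs a blocking generation call with a queue so partial responses can be streamed back to the caller. Below is a minimal sketch of that pattern; the `AnswerResult` fields and the `generate_with_callback` signature are assumptions for illustration, not the repository's actual API (which lives in models/base.py and models/loader/loader.py).

```python
import queue
import threading
from dataclasses import dataclass, field

@dataclass
class AnswerResult:
    # Hypothetical shape; the real class is defined in models/base.py.
    history: list = field(default_factory=list)
    llm_output: dict = field(default_factory=dict)

def generate_with_callback(generate_func, **kwargs):
    """Run a blocking generator on a worker thread and stream each
    partial result back through a queue (sketch of the collector)."""
    q = queue.Queue()

    def callback(result: AnswerResult):
        q.put(result)           # enqueue every partial response

    def worker():
        try:
            generate_func(callback=callback, **kwargs)
        finally:
            q.put(None)         # sentinel: generation finished

    threading.Thread(target=worker, daemon=True).start()
    while True:
        item = q.get()
        if item is None:
            break
        yield item              # sync queue data back to the caller
```

A caller can then iterate the generator and update its UI or API response as each `AnswerResult` arrives, without blocking on the full generation.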
Showing changed files:
fastchat/__init__.py
0 → 100644
fastchat/api/__init__.py
0 → 100644
fastchat/api/conversation.py
0 → 100644
fastchat/api/fastchat_api.py
0 → 100644
models/__main__.py
0 → 100644
models/base.py
0 → 100644
models/extensions/callback.py
0 → 100644
models/extensions/extensions.py
0 → 100644
models/extensions/thread_with_exception.py
0 → 100644
models/llama_llm.py
0 → 100644
models/loader/__init__.py
0 → 100644
models/loader/args.py
0 → 100644
models/loader/loader.py
0 → 100644
models/shared.py
0 → 100644
requirements.txt
@@ -17,6 +17,8 @@ fastapi
 uvicorn
 peft
 pypinyin
-bitsandbytes
 click~=8.1.3
 tabulate
+bitsandbytes; platform_system != "Windows"
+llama-cpp-python==0.1.34; platform_system != "Windows"
+https://github.com/abetlen/llama-cpp-python/releases/download/v0.1.34/llama_cpp_python-0.1.34-cp310-cp310-win_amd64.whl; platform_system == "Windows"
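The added dependency lines use PEP 508 environment markers: `bitsandbytes` and `llama-cpp-python` install from PyPI only when `platform_system != "Windows"`, while Windows pulls a prebuilt cp310 wheel directly from the release URL. As a rough sketch, the `platform_system` marker corresponds to Python's `platform.system()` at install time (the helper name below is hypothetical):

```python
import platform

def uses_pypi_llama_cpp() -> bool:
    # Runtime equivalent of the marker `platform_system != "Windows"`:
    # PEP 508's platform_system maps to platform.system(), which
    # returns "Windows", "Linux", or "Darwin".
    return platform.system() != "Windows"
```

Splitting the dependency this way avoids forcing Windows users to compile llama-cpp-python from source, at the cost of pinning them to the one wheel that was published for v0.1.34.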