(Update 2023/11/27) The original goal of this repo was to compare smaller models (7B and 13B) that can run on consumer hardware, so that every model had a score for a set of questions graded by GPT-4. But I realized that, as more capable models keep emerging, this evaluate-and-compare process may no longer suffice.
So for newer models I only provide Colab WebUI links that let you try them yourself in a few clicks; after all, a language model's usefulness depends largely on how well it fits your particular use case. Trying the models firsthand lets you judge their performance and decide which one best suits your needs.
These models can run on consumer hardware and are generally good (based on suggestions from Reddit and my own experience). Try them yourself (tap the "Open in Colab" button)!
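Whether a model fits on consumer hardware mostly comes down to its quantized size. The sketch below is a rough back-of-the-envelope estimate under an assumed ~4.5 bits per weight (approximately a Q4_K_M GGUF quant, not an exact figure); real files also carry metadata, and the KV cache needs extra room on top:

```python
def approx_quant_size_gb(params_billion: float, bits_per_weight: float = 4.5) -> float:
    """Rough on-disk size of a quantized model in GB.

    Assumes ~4.5 bits/weight (roughly a Q4_K_M quant); leave an extra
    1-2 GB of headroom for the KV cache and runtime buffers.
    """
    # params are in billions, so billions of bytes = GB directly
    return params_billion * bits_per_weight / 8

# A 7B model at ~4.5 bpw is about 3.9 GB (comfortable on an 8 GB GPU);
# a 13B model is about 7.3 GB (tight on 8 GB once context is added).
print(round(approx_quant_size_gb(7), 1), round(approx_quant_size_gb(13), 1))
```

This is why the tables below focus on the 7B-13B range: they are the sizes that fit an 8 GB card like the RTX 3060 Ti used for testing.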
Model_Name | Link | Colab_Link | Date_Added | Notes |
---|---|---|---|---|
zephyr-7B-beta-GGUF | https://huggingface.co/TheBloke/zephyr-7B-beta-GGUF | | 2023/11/27 | Roleplay is okay; uncensored |
OpenHermes-2.5-Mistral-7B-GGUF | https://huggingface.co/TheBloke/OpenHermes-2.5-Mistral-7B-GGUF | | 2023/11/27 | Roleplay is good; uncensored |
dolphin-2.2.1-mistral-7B-GGUF | https://huggingface.co/TheBloke/dolphin-2.2.1-mistral-7B-GGUF | | 2023/11/27 | Roleplay is okay; uncensored |
neural-chat-7B-v3-1-GGUF | https://huggingface.co/TheBloke/neural-chat-7B-v3-1-GGUF | | 2023/11/27 | Roleplay is okay; uncensored; some logical flaws |
openchat_3.5-16k-GGUF | https://huggingface.co/TheBloke/openchat_3.5-16k-GGUF | | 2023/11/27 | Censored |
Starling-LM-7B-alpha-GGUF | https://huggingface.co/TheBloke/Starling-LM-7B-alpha-GGUF | | 2023/11/29 | Censored; highly praised on Reddit |
Orca-2-7B-GGUF | https://huggingface.co/TheBloke/Orca-2-7B-GGUF | | 2023/11/29 | Censored |
Orca-2-13B-GGUF | https://huggingface.co/TheBloke/Orca-2-13B-GGUF | | 2023/11/29 | Censored; some weird logical flaws, worse than the 7B version |
MythoMist-7B-GGUF | https://huggingface.co/TheBloke/MythoMist-7B-GGUF | | 2023/11/29 | Roleplay is okay; uncensored; some logical flaws |
NeuralHermes-2.5-Mistral-7B-GGUF | https://huggingface.co/TheBloke/NeuralHermes-2.5-Mistral-7B-GGUF | | 2023/12/05 | Roleplay is good; uncensored |
stablelm-zephyr-3b-GGUF | https://huggingface.co/TheBloke/stablelm-zephyr-3b-GGUF | | 2023/12/11 | 3B; roleplay is okay; uncensored; some logical flaws |
deepseek-llm-7B-chat-GGUF | https://huggingface.co/TheBloke/deepseek-llm-7B-chat-GGUF | | 2023/12/11 | Censored |
Mistral-7B-Instruct-v0.2-GGUF | https://huggingface.co/TheBloke/Mistral-7B-Instruct-v0.2-GGUF | | 2023/12/13 | Partially censored; roleplay is okay; highly praised on Reddit |
Mixtral-8x7B-Instruct-v0.1-GGUF | https://huggingface.co/TheBloke/Mixtral-8x7B-Instruct-v0.1-GGUF | | 2023/12/13 | MoE model; partially censored; roleplay is okay |
deepsex-34b-GGUF | https://huggingface.co/TheBloke/deepsex-34b-GGUF | | 2023/12/14 | 34B; NSFW model |
phi-2 | https://huggingface.co/microsoft/phi-2 | | 2023/12/15 | 2.7B; base model |
Xwin-MLewd-13B-v0.2-GGUF | https://huggingface.co/TheBloke/Xwin-MLewd-13B-v0.2-GGUF | | 2023/12/15 | 13B; NSFW model |
MythoMax-L2-13B-GGUF | https://huggingface.co/TheBloke/MythoMax-L2-13B-GGUF | | 2023/12/15 | 13B; censored; roleplay is okay |
LLaMA2-13B-Tiefighter-GGUF | https://huggingface.co/TheBloke/LLaMA2-13B-Tiefighter-GGUF | | 2023/12/15 | 13B; roleplay is good |
LLaMA2-13B-Psyfighter2-GGUF | https://huggingface.co/TheBloke/LLaMA2-13B-Psyfighter2-GGUF | | 2023/12/15 | 13B; partially censored; roleplay is okay; recommended on Reddit |
Noromaid-13B-v0.1.1-GGUF | https://huggingface.co/TheBloke/Noromaid-13B-v0.1.1-GGUF | | 2023/12/15 | 13B; NSFW model; roleplay is good |
dolphin-2.5-mixtral-8x7b-GGUF | https://huggingface.co/TheBloke/dolphin-2.5-mixtral-8x7b-GGUF | | 2023/12/20 | MoE model; claims to be uncensored but refuses some requests; roleplay is unusable |
SOLAR-10.7B-Instruct-v1.0-GGUF | https://huggingface.co/TheBloke/SOLAR-10.7B-Instruct-v1.0-GGUF | | 2023/12/21 | 10.7B; censored; roleplay is okay |
Nous-Hermes-2-SOLAR-10.7B-GGUF | https://huggingface.co/TheBloke/Nous-Hermes-2-SOLAR-10.7B-GGUF | | 2024/01/08 | 10.7B; partially censored; roleplay is good |
openchat-3.5-0106-GGUF | https://huggingface.co/TheBloke/openchat-3.5-0106-GGUF | | 2024/01/12 | Roleplay and creative writing are good; uncensored |
Mistral-7B-Instruct-v0.2-code-ft-GGUF | https://huggingface.co/TheBloke/Mistral-7B-Instruct-v0.2-code-ft-GGUF | | 2024/01/12 | Coding model |
Panda-7B-v0.1-GGUF | https://huggingface.co/TheBloke/Panda-7B-v0.1-GGUF | | 2024/01/12 | Roleplay is okay; creative writing is good; partially censored |
LLaMA-Pro-8B-GGUF | https://huggingface.co/TheBloke/LLaMA-Pro-8B-GGUF | | 2024/01/16 | Claims to be good at math/coding; mediocre quality |
dolphin-2.6-mistral-7B-dpo-laser-GGUF | https://huggingface.co/TheBloke/dolphin-2.6-mistral-7B-dpo-laser-GGUF | | 2024/01/16 | Uncensored; good quality; roleplay is good |
laserxtral-GGUF | https://huggingface.co/dagbs/laserxtral-GGUF | | 2024/01/17 | 4x7B MoE model; needs half the memory of Mixtral; uncensored; roleplay is good |
Mixtral_7Bx2_MoE-GGUF | https://huggingface.co/TheBloke/Mixtral_7Bx2_MoE-GGUF | | 2024/01/23 | 2x7B MoE model; uncensored; roleplay is okay |
CapybaraHermes-2.5-Mistral-7B-GGUF | https://huggingface.co/TheBloke/CapybaraHermes-2.5-Mistral-7B-GGUF | | 2024/02/06 | Censored |
stable-code-3b-GGUF | https://huggingface.co/TheBloke/stable-code-3b-GGUF/ | | 2024/02/06 | 3B; coding model; mediocre quality, probably because of its small size |
EstopianMaid-13B-GGUF | https://huggingface.co/TheBloke/EstopianMaid-13B-GGUF | | 2024/02/06 | 13B; roleplay model |
gemma-7b-it-GGUF | https://huggingface.co/mlabonne/gemma-7b-it-GGUF/ | | 2024/02/28 | 7B; censored |
StarCoder2-15B-GGUF | https://huggingface.co/second-state/StarCoder2-15B-GGUF | | 2024/03/20 | 15B; coding model; highly upvoted on r/LocalLLaMA |
Qwen1.5-7B-Chat-GGUF | https://huggingface.co/Qwen/Qwen1.5-7B-Chat-GGUF | | 2024/03/20 | 7B; censored; highly upvoted on r/LocalLLaMA |
Qwen1.5-14B-Chat-GGUF | https://huggingface.co/Qwen/Qwen1.5-14B-Chat-GGUF | | 2024/03/20 | 14B; censored; highly upvoted on r/LocalLLaMA |
Hermes-2-Pro-Mistral-7B-GGUF | https://huggingface.co/NousResearch/Hermes-2-Pro-Mistral-7B-GGUF | | 2024/03/22 | 7B; uncensored; roleplay is okay; ranked high on Hugging Face |
Nous-Hermes-2-Mistral-7B-DPO-GGUF | https://huggingface.co/NousResearch/Nous-Hermes-2-Mistral-7B-DPO-GGUF | | 2024/03/22 | 7B; partially censored; roleplay is good; ranked high on Hugging Face |
stable-code-instruct-3b-GGUF | https://huggingface.co/bartowski/stable-code-instruct-3b-GGUF | | 2024/03/27 | 3B; instruction-tuned code generation model |
Qwen1.5-MoE-A2.7B-Chat-GPTQ-Int4 | https://huggingface.co/Qwen/Qwen1.5-MoE-A2.7B-Chat-GPTQ-Int4 | | 2024/04/03 | MoE; small footprint; some logical errors |
Octopus-v2 | https://huggingface.co/NexaAIDev/Octopus-v2/ | | 2024/04/07 | 2B; non-quantized; optimized for on-device Android APIs |
codegemma-7b-it-GGUF | https://huggingface.co/lmstudio-community/codegemma-7b-it-GGUF | | 2024/04/18 | 7B; coding model |
CodeQwen1.5-7B-Chat-GGUF | https://huggingface.co/Qwen/CodeQwen1.5-7B-Chat-GGUF | | 2024/04/18 | 7B; coding model |
WizardLM-2-7B-GGUF | https://huggingface.co/MaziyarPanahi/WizardLM-2-7B-GGUF | | 2024/04/18 | 7B; censored |
Meta-Llama-3-8B-Instruct-GGUF | https://huggingface.co/QuantFactory/Meta-Llama-3-8B-Instruct-GGUF | | 2024/04/19 | 8B; censored |
dolphin-2.9-llama3-8b-gguf | https://huggingface.co/cognitivecomputations/dolphin-2.9-llama3-8b-gguff | | 2024/04/22 | 8B; uncensored; logic seems degraded from Llama-3-8B |
Lexi-Llama-3-8B-Uncensored-GGUF | https://huggingface.co/Orenguteng/Lexi-Llama-3-8B-Uncensored-GGUF | | 2024/04/24 | 8B; uncensored |
Llama3-8B-Chinese-Chat-GGUF | https://huggingface.co/QuantFactory/Llama3-8B-Chinese-Chat-GGUF | | 2024/04/24 | 8B; Chinese-language model |
Phi-3-mini-4k-instruct-gguf | https://huggingface.co/microsoft/Phi-3-mini-4k-instruct-gguf | | 2024/04/24 | 3.8B; censored; fast |
Llama-3-8B-Instruct-32k-v0.1-GGUF | https://huggingface.co/MaziyarPanahi/Llama-3-8B-Instruct-32k-v0.1-GGUF | | 2024/04/25 | 8B; 32K context; good for summarizing long texts |
starcoder2-15b-instruct-v0.1-GGUF | https://huggingface.co/bartowski/starcoder2-15b-instruct-v0.1-GGUF | | 2024/05/06 | 15B; coding model |
Hermes-2-Pro-Llama-3-8B-GGUF | https://huggingface.co/NousResearch/Hermes-2-Pro-Llama-3-8B-GGUF | | 2024/05/06 | 8B; partially censored; JSON, tool use, etc. |
Llama-3-ChatQA-1.5-8B-GGUF | https://huggingface.co/bartowski/Llama-3-ChatQA-1.5-8B-GGUFF | | 2024/05/15 | 8B; uncensored |
Hermes-2-Theta-Llama-3-8B-GGUF | https://huggingface.co/NousResearch/Hermes-2-Theta-Llama-3-8B-GGUF | | 2024/05/17 | 8B; censored; JSON, tool use, etc. |
blossom-v5.1-9b-GGUF | https://huggingface.co/bartowski/blossom-v5.1-9b-GGUF | | 2024/05/17 | 9B; mixed Wizard/Orca/Math/Chinese/English dataset |
falcon-11B-GGUF | https://huggingface.co/bartowski/falcon-11B-GGUF | | 2024/05/17 | 11B; raw pre-trained base model |
llama-3-cat-8b-instruct-v1-GGUF | https://huggingface.co/bartowski/llama-3-cat-8b-instruct-v1-GGUFF | | 2024/05/20 | 8B; partially censored; system-instruction fidelity; roleplay |
Yi-1.5-9B-Chat-GGUF | https://huggingface.co/bartowski/Yi-1.5-9B-Chat-GGUF | | 2024/05/20 | 9B; censored |
SFR-Iterative-DPO-LLaMA-3-8B-R-GGUF | https://huggingface.co/bartowski/SFR-Iterative-DPO-LLaMA-3-8B-R-GGUF | | 2024/05/22 | 8B; partially censored |
Llama-3-Lumimaid-8B-v0.1-OAS-GGUF-IQ-Imatrix | https://huggingface.co/Lewdiculous/Llama-3-Lumimaid-8B-v0.1-OAS-GGUF-IQ-Imatrix | | 2024/05/22 | 8B; roleplay; trained not to refuse requests; appears to be trained on Quora data |
Mistral-7B-Instruct-v0.3-GGUF | https://huggingface.co/MaziyarPanahi/Mistral-7B-Instruct-v0.3-GGUF | | 2024/05/23 | 7B; censored |
L3-8B-Stheno-v3.1-GGUF-IQ-Imatrix | https://huggingface.co/Lewdiculous/L3-8B-Stheno-v3.1-GGUF-IQ-Imatrix | | 2024/05/30 | 8B; trained for one-on-one uncensored roleplay |
aya-23-8B-GGUF | https://huggingface.co/bartowski/aya-23-8B-GGUF | | 2024/05/30 | 8B; censored |
LLaMA3-iterative-DPO-final-GGUF | https://huggingface.co/bartowski/LLaMA3-iterative-DPO-final-GGUF | | 2024/05/30 | 8B; censored |
openchat-3.6-8b-20240522-GGUF | https://huggingface.co/bartowski/openchat-3.6-8b-20240522-GGUF | | 2024/06/04 | 8B; partially censored |
Meta-Llama-3-8B-Instruct-abliterated-v3-GGUF | https://huggingface.co/failspy/Meta-Llama-3-8B-Instruct-abliterated-v3-GGUF | | 2024/06/04 | 8B; uncensored |
NeuralDaredevil-8B-abliterated-GGUF | https://huggingface.co/QuantFactory/NeuralDaredevil-8B-abliterated-GGUF | | 2024/06/19 | 8B; uncensored |
Qwen2-7B-Instruct-GGUF | https://huggingface.co/Qwen/Qwen2-7B-Instruct-GGUF | | 2024/06/24 | 7B; censored |
DeepSeek-Coder-V2-Lite-Instruct-GGUF | https://huggingface.co/lmstudio-community/DeepSeek-Coder-V2-Lite-Instruct-GGUF | | 2024/06/27 | 16B; instruct coding model |
internlm2_5-7b-chat-gguf | https://huggingface.co/internlm/internlm2_5-7b-chat-gguf | | 2024/07/11 | 7B; censored; long context; reasoning and tool use |
gemma-2-9b-it-GGUF | https://huggingface.co/bartowski/gemma-2-9b-it-GGUF | | 2024/07/11 | 9B; censored |
Smegmma-Deluxe-9B-v1-GGUF | https://huggingface.co/bartowski/Smegmma-Deluxe-9B-v1-GGUF | | 2024/07/22 | 9B; a less censored Gemma (still refuses some requests); roleplay |
h2o-danube3-4b-chat-GGUF | https://huggingface.co/h2oai/h2o-danube3-4b-chat-GGUF | | 2024/07/17 | 4B; partially censored |
Tiger-Gemma-9B-v1-GGUF | https://huggingface.co/bartowski/Tiger-Gemma-9B-v1-GGUF | | 2024/07/17 | 9B; uncensored, but seems to have some logical flaws |
Gemmasutra-9B-v1-GGUF | https://huggingface.co/TheDrummer/Gemmasutra-9B-v1-GGUF | | 2024/07/24 | 9B; roleplay with some censorship |
Meta-Llama-3.1-8B-Instruct-GGUF | https://huggingface.co/lmstudio-community/Meta-Llama-3.1-8B-Instruct-GGUF | | 2024/07/25 | 8B; censored |
Mistral-Nemo-Instruct-2407-GGUF | https://huggingface.co/second-state/Mistral-Nemo-Instruct-2407-GGUF | | 2024/07/25 | 12.2B; partially uncensored; recommended by Redditors |
Celeste-12B-V1.6_iMatrix_GGUF | https://huggingface.co/MarsupialAI/Celeste-12B-V1.6_iMatrix_GGUF | | 2024/07/30 | 12B; roleplay and story-writing model; uncensored |
Hermes-3-Llama-3.1-8B-GGUF | https://huggingface.co/NousResearch/Hermes-3-Llama-3.1-8B-GGUF | | 2024/08/19 | 8B; uncensored |
Gemma-2-9B-It-SPPO-Iter3-GGUF | https://huggingface.co/bartowski/Gemma-2-9B-It-SPPO-Iter3-GGUF | | 2024/08/19 | 9B; recommended on r/LocalLLaMA for summarization |
Llama-3.1-Storm-8B-GGUF | https://huggingface.co/bartowski/Llama-3.1-Storm-8B-GGUF | | 2024/08/20 | 8B; censored |
Phi-3.5-mini-instruct-GGUF | https://huggingface.co/lmstudio-community/Phi-3.5-mini-instruct-GGUF | | 2024/08/21 | 3.5B; censored; small and fast |
Phi-3.5-mini-instruct_Uncensored-GGUF | https://huggingface.co/bartowski/Phi-3.5-mini-instruct_Uncensored-GGUF | | 2024/08/26 | 3.5B; uncensored; small and fast, but has logical flaws and inconsistent results |
NemoMix-Unleashed-12B-GGUF | https://huggingface.co/bartowski/NemoMix-Unleashed-12B-GGUF | | 2024/08/26 | 12B; partially censored; roleplay and storytelling |
Mistral-NeMo-Minitron-8B-Base-GGUF | https://huggingface.co/QuantFactory/Mistral-NeMo-Minitron-8B-Base-GGUF | | 2024/08/26 | 8B; base model |
Yi-Coder-9B-Chat-GGUF | https://huggingface.co/bartowski/Yi-Coder-9B-Chat-GGUF | | 2024/09/05 | 9B; instruct coding model |
Llama-3.2-3B-Instruct-Q8_0-GGUF | https://huggingface.co/hugging-quants/Llama-3.2-3B-Instruct-Q8_0-GGUF | | 2024/10/01 | 3B; small censored model |
Qwen2.5-7B-Instruct-GGUF | https://huggingface.co/bartowski/Qwen2.5-7B-Instruct-GGUF | | 2024/10/01 | 7B; censored; recommended on r/LocalLLaMA |
Qwen2.5-Coder-7B-Instruct-GGUF | https://huggingface.co/bartowski/Qwen2.5-Coder-7B-Instruct-GGUF | | 2024/10/01 | 7B; instruct coding model; recommended on r/LocalLLaMA |
Llama-3.1-8B-Lexi-Uncensored-V2-GGUF | https://huggingface.co/Orenguteng/Llama-3.1-8B-Lexi-Uncensored-V2-GGUF | | 2024/10/15 | 8B; partially censored |
SuperNova-Medius-GGUF | https://huggingface.co/arcee-ai/SuperNova-Medius-GGUF | | 2024/10/15 | 14B; censored; recommended on r/LocalLLaMA |
Llama-3.2-3B-Instruct-uncensored-GGUF | https://huggingface.co/mradermacher/Llama-3.2-3B-Instruct-uncensored-GGUF | | 2024/10/15 | 3B; small; uncensored |
Ministral-8B-Instruct-2410-HF-GGUF-TEST | https://huggingface.co/bartowski/Ministral-8B-Instruct-2410-HF-GGUF-TEST | | 2024/10/21 | 8B; partially censored |
granite-3.0-8b-instruct-GGUF | https://huggingface.co/bartowski/granite-3.0-8b-instruct-GGUF | | 2024/10/28 | 8B; censored |
aya-expanse-8b-GGUF | https://huggingface.co/bartowski/aya-expanse-8b-GGUF | | 2024/10/28 | 8B; censored |
Qwen2.5-Coder-14B-Instruct-GGUF | https://huggingface.co/Qwen/Qwen2.5-Coder-14B-Instruct-GGUF | | 2024/11/12 | 14B; coding model, good for its size |
These models performed better among the ones I tested on my hardware (i5-12490F, 32GB RAM, RTX 3060 Ti GDDR6X 8GB VRAM). (Note: llama.cpp made some breaking changes to its support for older ggml models, so some of the older ggml versions listed below may not work correctly with current llama.cpp. But there should be GPTQ equivalents or newer ggml/GGUF versions of those models.)
Notes:
Model_Name | Avg_Score | Colab_Link | Date_Added | Link |
---|---|---|---|---|
Mistral-7B-OpenOrca (using oobabooga/text-generation-webui) | 10.00 | | 2023/10/08 | https://huggingface.co/TheBloke/Mistral-7B-OpenOrca-GGUF |
Llama-2-13B-chat (using oobabooga/text-generation-webui) | 9.65 | | 2023/07/20 | https://huggingface.co/TheBloke/Llama-2-13B-chat-GGML |
Wizard-vicuna-13B.ggml.q4_0 (using llama.cpp) | 9.63 | | 2023/05/07 | https://huggingface.co/TheBloke/wizard-vicuna-13B-GGML |
Nous-Capybara-7B (using oobabooga/text-generation-webui) | 9.56 | | 2023/10/08 | https://huggingface.co/TheBloke/Nous-Capybara-7B-GGUF |
vicuna-13B-v1.5 (using oobabooga/text-generation-webui) | 9.53 | | 2023/08/09 | https://huggingface.co/TheBloke/vicuna-13B-v1.5-GGML |
WizardLM-13B-1.0-GPTQ (using oobabooga/text-generation-webui) | 9.53 | | 2023/05/29 | https://huggingface.co/TheBloke/wizardLM-13B-1.0-GPTQ |
airoboros-13B-gpt4-1.4-GPTQ (using oobabooga/text-generation-webui) | 9.50 | | 2023/06/30 | https://huggingface.co/TheBloke/airoboros-13B-gpt4-1.4-GPTQ |
Nous-Hermes-13B-GPTQ (using oobabooga/text-generation-webui) | 9.44 | | 2023/06/03 | https://huggingface.co/TheBloke/Nous-Hermes-13B-GPTQ/tree/main |
Dolphin-Llama-13B (using oobabooga/text-generation-webui) | 9.38 | | 2023/07/24 | https://huggingface.co/TheBloke/Dolphin-Llama-13B-GGML |
Mistral-7B-Instruct-v0.1 (using oobabooga/text-generation-webui) | 9.37 | | 2023/10/08 | https://huggingface.co/TheBloke/Mistral-7B-Instruct-v0.1-GGUF |
OpenOrca-Platypus2-13B (using oobabooga/text-generation-webui) | 9.37 | | 2023/08/15 | https://huggingface.co/TheBloke/OpenOrca-Platypus2-13B-GGML |
airoboros-l2-13b-gpt4-2.0 (using oobabooga/text-generation-webui) | 9.34 | | 2023/08/01 | https://huggingface.co/TheBloke/airoboros-l2-13b-gpt4-2.0-GGML |
Chronos-13B-v2 (using oobabooga/text-generation-webui) | 9.31 | | 2023/08/09 | https://huggingface.co/TheBloke/Chronos-13B-v2-GGML |
vicuna-13b-v1.3.0-GPTQ (using oobabooga/text-generation-webui) | 9.31 | | 2023/06/29 | https://huggingface.co/TheBloke/vicuna-13b-v1.3.0-GPTQ |
MythoLogic-13B (using oobabooga/text-generation-webui) | 9.31 | | 2023/07/20 | https://huggingface.co/TheBloke/MythoLogic-13B-GGML |
Selfee-13B-GPTQ (using oobabooga/text-generation-webui) | 9.28 | | 2023/06/07 | https://huggingface.co/TheBloke/Selfee-13B-GPTQ |
WizardLM-13B-V1.2 (using oobabooga/text-generation-webui) | 9.28 | | 2023/07/26 | https://huggingface.co/TheBloke/WizardLM-13B-V1.2-GGML |
minotaur-13B-GPTQ (using oobabooga/text-generation-webui) | 9.28 | | 2023/06/09 | https://huggingface.co/TheBloke/minotaur-13B-GPTQ |
Pygmalion-2-13B-SuperCOT2 (using oobabooga/text-generation-webui) | 9.20 | | 2023/09/21 | https://huggingface.co/TheBloke/Pygmalion-2-13B-SuperCOT2-GGUF |
Athena-v1 (using oobabooga/text-generation-webui) | 9.19 | | 2023/08/31 | https://huggingface.co/TheBloke/Athena-v1-GGUF |
PuddleJumper-13B (using oobabooga/text-generation-webui) | 9.1875 | | 2023/08/29 | https://huggingface.co/TheBloke/PuddleJumper-13B-GGUF |
Nous-Hermes-Llama2 (using oobabooga/text-generation-webui) | 9.17 | | 2023/07/24 | https://huggingface.co/TheBloke/Nous-Hermes-Llama2-GGML |
Luban-13B (using oobabooga/text-generation-webui) | 9.15 | | 2023/08/31 | https://huggingface.co/TheBloke/Luban-13B-GGUF |
MythoBoros-13B (using oobabooga/text-generation-webui) | 9.15 | | 2023/07/24 | https://huggingface.co/TheBloke/MythoBoros-13B-GGML |
13B-Ouroboros (using oobabooga/text-generation-webui) | 9.11 | | 2023/07/27 | https://huggingface.co/TheBloke/13B-Ouroboros-GGML |
tulu-13B-GPTQ (using oobabooga/text-generation-webui) | 9.06 | | 2023/06/13 | https://huggingface.co/TheBloke/tulu-13B-GPTQ |
AlpacaCielo-13B (using oobabooga/text-generation-webui) | 9.03125 | | 2023/07/27 | https://huggingface.co/TheBloke/AlpacaCielo-13B-GGML |
StableBeluga-13B (using oobabooga/text-generation-webui) | 9 | | 2023/08/01 | https://huggingface.co/TheBloke/StableBeluga-13B-GGML |
Chronos-Hermes-13B-v2 (using oobabooga/text-generation-webui) | 8.97 | | 2023/08/10 | https://huggingface.co/TheBloke/Chronos-Hermes-13B-v2-GGML |
OpenBuddy-Llama2-13B-v11.1 (using oobabooga/text-generation-webui) | 8.97 | | 2023/09/05 | https://huggingface.co/TheBloke/OpenBuddy-Llama2-13B-v11.1-GGUF |
Camel-Platypus2-13B (using oobabooga/text-generation-webui) | 8.94 | | 2023/08/15 | https://huggingface.co/TheBloke/Camel-Platypus2-13B-GGML |
airoboros-l2-13b-gpt4-m2.0 (using oobabooga/text-generation-webui) | 8.94 | | 2023/09/21 | https://huggingface.co/TheBloke/airoboros-l2-13b-gpt4-m2.0-GGUF |
UltraLM-13B-GPTQ (using oobabooga/text-generation-webui) | 8.89 | | 2023/06/30 | https://huggingface.co/TheBloke/UltraLM-13B-GPTQ |
13B-HyperMantis_GPTQ (using oobabooga/text-generation-webui) | 8.88 | | 2023/06/03 | https://huggingface.co/digitous/13B-HyperMantis_GPTQ_4bit-128g/ |
Stable-Platypus2-13B (using oobabooga/text-generation-webui) | 8.875 | | 2023/08/15 | https://huggingface.co/TheBloke/Stable-Platypus2-13B-GGML |
Airoboros-13B-GPTQ-4bit (using oobabooga/text-generation-webui) | 8.84 | | 2023/05/25 | https://huggingface.co/TheBloke/airoboros-13B-GPTQ |
Kuchiki-1.1-L2-7B (using oobabooga/text-generation-webui) | 8.84 | | 2023/09/21 | https://huggingface.co/TheBloke/Kuchiki-1.1-L2-7B-GGUF |
WizardLM-1.0-Uncensored-Llama2-13B (using oobabooga/text-generation-webui) | 8.80625 | | 2023/08/09 | https://huggingface.co/TheBloke/WizardLM-1.0-Uncensored-Llama2-13B-GGML |
Chronos-Beluga-v2-13B (using oobabooga/text-generation-webui) | 8.75 | | 2023/08/10 | https://huggingface.co/TheBloke/Chronos-Beluga-v2-13B-GGML |
Vicuna-13B-CoT-GPTQ (using oobabooga/text-generation-webui) | 8.75 | | 2023/06/09 | https://huggingface.co/TheBloke/Vicuna-13B-CoT-GPTQ |
WizardLM-7B.q4_2 (in GPT4All) | 8.75 | No | 2023/05/07 | https://gpt4all.io/models/ggml-wizardLM-7B.q4_2.bin |
OpenChat_v3.2 (using oobabooga/text-generation-webui) | 8.71875 | | 2023/08/01 | https://huggingface.co/TheBloke/OpenChat_v3.2-GGML |
Huginn-13B (using oobabooga/text-generation-webui) | 8.7125 | | 2023/08/10 | https://huggingface.co/TheBloke/Huginn-13B-GGML |
WizardLM-13B-V1.1 (using oobabooga/text-generation-webui) | 8.66 | | 2023/07/17 | https://huggingface.co/TheBloke/WizardLM-13B-V1.1-GGML |
robin-13B-v2-GPTQ (using oobabooga/text-generation-webui) | 8.66 | | 2023/06/19 | https://huggingface.co/TheBloke/robin-13B-v2-GPTQ |
llama-2-13B-Guanaco-QLoRA (using oobabooga/text-generation-webui) | 8.625 | | 2023/07/21 | https://huggingface.co/TheBloke/llama-2-13B-Guanaco-QLoRA-GGML |
mpt-7b-chat (in GPT4All) | 8.53 | No | 2023/05/11 | https://gpt4all.io/models/ggml-mpt-7b-chat.bin |
chronos-hermes-13B-GPTQ (using oobabooga/text-generation-webui) | 8.48125 | | 2023/06/16 | https://huggingface.co/TheBloke/chronos-hermes-13B-GPTQ |
Luna-AI-Llama2-Uncensored (using oobabooga/text-generation-webui) | 8.46875 | | 2023/07/20 | https://huggingface.co/TheBloke/Luna-AI-Llama2-Uncensored-GGML |
stable-vicuna-13B-GPTQ-4bit-128g (using oobabooga/text-generation-webui) | 8.25 | | 2023/05/12 | https://huggingface.co/TheBloke/stable-vicuna-13B-GPTQ |
manticore_13b_chat_pyg_GPTQ (using oobabooga/text-generation-webui) | 8.21875 | | 2023/05/24 | https://huggingface.co/TheBloke/manticore-13b-chat-pyg-GPTQ |
CAMEL_13B_Combined_Data_GPTQ (using oobabooga/text-generation-webui) | 8.09375 | | 2023/06/10 | https://huggingface.co/TheBloke/CAMEL-13B-Combined-Data-GPTQ |
WizardLM-Uncensored-Falcon-7B-GPTQ (using oobabooga/text-generation-webui) | 8.09375 | | 2023/06/02 | https://huggingface.co/TheBloke/WizardLM-Uncensored-Falcon-7B-GPTQ |
llama-13b-supercot-GGML (using oobabooga/text-generation-webui) | 8.01 | | 2023/07/05 | https://huggingface.co/TheBloke/llama-13b-supercot-GGML |
Project-Baize-v2-13B-GPTQ (using oobabooga/text-generation-webui) | 7.96875 | | 2023/05/24 | https://huggingface.co/TheBloke/Project-Baize-v2-13B-GPTQ |
koala-13B-4bit-128g.GGML (using llama.cpp) | 7.9375 | No | 2023/05/07 | https://huggingface.co/TheBloke/koala-13B-GPTQ-4bit-128g-GGML |
Wizard-lm-uncensored-13b-GPTQ-4bit-128g (using oobabooga/text-generation-webui) | 7.90625 | | 2023/05/19 | https://huggingface.co/4bit/WizardLM-13B-Uncensored-4bit-128g |
vicuna-7B-v1.3-GPTQ (using oobabooga/text-generation-webui) | 7.875 | | 2023/06/29 | https://huggingface.co/TheBloke/vicuna-7B-v1.3-GPTQ |
Manticore-13B-GPTQ (using oobabooga/text-generation-webui) | 7.78125 | | 2023/05/23 | https://huggingface.co/TheBloke/Manticore-13B-GPTQ |
vicuna-13b-1.1-q4_2 (in GPT4All) | 7.75 | No | 2023/05/07 | https://gpt4all.io/models/ggml-vicuna-13b-1.1-q4_2.bin |
falcon-7b-instruct-GPTQ (using oobabooga/text-generation-webui) | 7.625 | | 2023/06/02 | https://huggingface.co/TheBloke/falcon-7b-instruct-GPTQ |
guanaco-13B-GPTQ (using oobabooga/text-generation-webui) | 7.5625 | | 2023/05/26 | https://huggingface.co/TheBloke/guanaco-13B-GPTQ |
Mythical-Destroyer-V2-L2-13B (using oobabooga/text-generation-webui) | 7.31 | | 2023/08/31 | https://huggingface.co/TheBloke/Mythical-Destroyer-V2-L2-13B-GGUF |
Kimiko-v2-13B (using oobabooga/text-generation-webui) | 7.25 | | 2023/08/31 | https://huggingface.co/TheBloke/Kimiko-v2-13B-GGUF |
orca-mini-13b.ggmlv3.q5_K_M (using oobabooga/text-generation-webui) | 7.0875 | | 2023/06/28 | https://huggingface.co/TheBloke/orca_mini_13B-GGML |
Platypus2-13B (using oobabooga/text-generation-webui) | 7.03125 | | 2023/08/15 | https://huggingface.co/TheBloke/Platypus2-13B-GGML |
Redmond-Puffin-13B (using oobabooga/text-generation-webui) | 7.03125 | | 2023/07/20 | https://huggingface.co/TheBloke/Redmond-Puffin-13B-GGML |
13B-BlueMethod (using oobabooga/text-generation-webui) | 7.025 | | 2023/07/24 | https://huggingface.co/TheBloke/13B-BlueMethod-GGML |
mpt-7b-instruct | 6.6875 | No | 2023/05/12 | https://huggingface.co/TheBloke/MPT-7B-Instruct-GGML |
Kimiko-13B (using oobabooga/text-generation-webui) | 6.46875 | | 2023/08/01 | https://huggingface.co/TheBloke/Kimiko-13B-GGML |
gpt4-x-alpaca-13b-ggml-q4_0 (using llama.cpp) | 6.0625 | No | 2023/05/07 | https://huggingface.co/Bradarr/gpt4-x-alpaca-13b-native-ggml-model-q4_0 |
minotaur-15B-GPTQ (using oobabooga/text-generation-webui) | 5.9375 | | 2023/06/26 | https://huggingface.co/TheBloke/minotaur-15B-GPTQ |
baichuan-vicuna-7B-GGML (using oobabooga/text-generation-webui) | 5.90625 | | 2023/07/05 | https://huggingface.co/TheBloke/baichuan-vicuna-7B-GGML |
gpt4all-j-v1.3-groovy (in GPT4All) | 5.6875 | No | 2023/05/07 | https://gpt4all.io/models/ggml-gpt4all-j-v1.3-groovy.bin |
Many thanks to:
❤️ GPT4All: https://github.com/nomic-ai/gpt4all-chat
❤️ llama.cpp: https://github.com/ggerganov/llama.cpp
❤️ oobabooga text generation webui: https://github.com/oobabooga/text-generation-webui
❤️ Colab webui inspired by camenduru: https://github.com/camenduru/text-generation-webui-colab/tree/main
❤️ TheBloke for model quantization: https://huggingface.co/TheBloke
(All scores are from GPT-4-0613.)
Model_Name | Avg_Score | Colab_Link | Date_Added | Link |
---|---|---|---|---|
CodeLlama-13B-oasst-sft-v10 (using oobabooga/text-generation-webui) | 9.8 | | 2023/08/28 | https://huggingface.co/TheBloke/CodeLlama-13B-oasst-sft-v10-GGUF |
WizardCoder-Python-13B-V1.0 (using oobabooga/text-generation-webui) | 9.5 | | 2023/08/28 | https://huggingface.co/TheBloke/WizardCoder-Python-13B-V1.0-GGUF |
Redmond-Hermes-Coder-GPTQ (using oobabooga/text-generation-webui) | 8.4 | | 2023/07/03 | https://huggingface.co/TheBloke/Redmond-Hermes-Coder-GPTQ |
CodeUp-Alpha-13B-HF (using oobabooga/text-generation-webui) | 7.9 | | 2023/08/15 | https://huggingface.co/TheBloke/CodeUp-Alpha-13B-HF-GGML |
starchat-beta-GPTQ (using oobabooga/text-generation-webui) | 7.6 | | 2023/07/04 | https://huggingface.co/TheBloke/starchat-beta-GPTQ |
Wizard-vicuna-13B-GPTQ (using oobabooga/text-generation-webui) | 7.3 | | 2023/07/03 | https://huggingface.co/TheBloke/wizard-vicuna-13B-GPTQ |
WizardCoder-Guanaco-15B-V1.1 (using oobabooga/text-generation-webui) | 7.1 | | 2023/07/21 | https://huggingface.co/TheBloke/WizardCoder-Guanaco-15B-V1.1-GPTQ |
CodeLlama-13B-Instruct (using oobabooga/text-generation-webui) | 7 | | 2023/08/28 | https://huggingface.co/TheBloke/CodeLlama-13B-Instruct-GGUF |
CodeUp-Llama-2-13B-Chat-HF (using oobabooga/text-generation-webui) | 6 | | 2023/08/03 | https://huggingface.co/TheBloke/CodeUp-Llama-2-13B-Chat-HF-GGML |
WizardCoder-15B-1.0-GPTQ (using oobabooga/text-generation-webui) | 5.9 | | 2023/07/03 | https://huggingface.co/TheBloke/WizardCoder-15B-1.0-GPTQ |
WizardLM-13B-1.0-GPTQ (using oobabooga/text-generation-webui) | 5.9 | | 2023/05/29 | https://huggingface.co/TheBloke/wizardLM-13B-1.0-GPTQ |
The original responses can be found at: https://docs.google.com/spreadsheets/d/1ogDXUiaBx3t7EpMo44aaA6U6kLXX0x2tGRgLg8CISGs/edit?usp=sharing
Task domain: Translation
Expected good response: "Le soleil se lève à l'est et se couche à l'ouest."
Explanation: This task tests the model's ability to understand and accurately translate text between languages.
Task domain: Summarization
Expected good response: "The water cycle is the continuous movement of water on Earth, crucial for maintaining the water balance and supporting life."
Explanation: This task evaluates the model's ability to extract the key points from a given text and produce a concise summary.
Task domain: App design
Explanation: This task evaluates the model's ability to follow the user's instructions closely for a complex task.
Task domain: Abstractive question answering
Expected good response: The financial crisis, social inequality, and Enlightenment ideas.
Explanation: This task tests the model's ability to understand context and produce an answer in its own words.
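Scoring worked by showing GPT-4 the task, the expected response, and a model's answer. The exact grading prompt is not reproduced here, so the following is only a hypothetical sketch of how such a judge prompt could be assembled from the fields above; the rubric wording and the 1-10 scale are assumptions, and `build_judge_prompt` is an illustrative helper, not this repo's code:

```python
def build_judge_prompt(task_domain: str, question: str,
                       expected: str, answer: str) -> str:
    """Assemble a prompt asking a judge model (e.g. GPT-4) to score an answer."""
    return (
        f"Task domain: {task_domain}\n"
        f"Question: {question}\n"
        f"Expected good response: {expected}\n"
        f"Model's answer: {answer}\n\n"
        "Rate the model's answer from 1 to 10 for accuracy and completeness "
        "relative to the expected response. Reply with only the number."
    )

prompt = build_judge_prompt(
    "Translation",
    "Translate into French: The sun rises in the east and sets in the west.",
    "Le soleil se lève à l'est et se couche à l'ouest.",
    "Le soleil se lève à l'est.",
)
print(prompt.splitlines()[0])  # Task domain: Translation
```

Averaging such per-question scores over the question set is what produces the Avg_Score column in the tables above.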