---
license: apache-2.0
language:
- zh
- en
library_name: transformers
pipeline_tag: text-generation
---
Original model: https://huggingface.co/Neko-Institute-of-Science/pygmalion-7b  

LoRA: https://huggingface.co/ziqingyang/chinese-llama-plus-lora-7b  
https://huggingface.co/ziqingyang/chinese-alpaca-plus-lora-7b  

pygmalion-7b was merged with chinese-llama-plus-lora-7b and chinese-alpaca-plus-lora-7b to strengthen the model's Chinese ability, ~~though the output can read like translated text~~.
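The merge itself was done with the scripts from the Chinese-LLaMA-Alpaca project linked below; the snippet here is only a rough `peft`-based sketch of the idea, with a hypothetical output directory, and it glosses over details that the project's own merge script handles.

```python
# Rough sketch only; the actual merge used the Chinese-LLaMA-Alpaca scripts.
import torch
from transformers import LlamaForCausalLM, LlamaTokenizer
from peft import PeftModel

base = "Neko-Institute-of-Science/pygmalion-7b"
loras = [
    "ziqingyang/chinese-llama-plus-lora-7b",
    "ziqingyang/chinese-alpaca-plus-lora-7b",
]

model = LlamaForCausalLM.from_pretrained(base, torch_dtype=torch.float16)

for lora in loras:
    # Each "plus" LoRA ships an extended Chinese tokenizer, so the embedding
    # matrix is resized to match before the adapter is applied.
    tokenizer = LlamaTokenizer.from_pretrained(lora)
    model.resize_token_embeddings(len(tokenizer))
    # Apply the adapter and fold its weights into the base model.
    model = PeftModel.from_pretrained(model, lora)
    model = model.merge_and_unload()

model.save_pretrained("pygmalion-7b-chinese-merged")    # hypothetical output dir
tokenizer.save_pretrained("pygmalion-7b-chinese-merged")
```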

Projects used:
https://github.com/ymcui/Chinese-LLaMA-Alpaca

https://github.com/qwopqwop200/GPTQ-for-LLaMa

**Compatible with AutoGPTQ and GPTQ-for-LLaMa**  
**If loading with GPTQ-for-LLaMa, set Wbits=4, groupsize=128, model_type=llama**
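
A minimal AutoGPTQ loading sketch; `model_dir` is a placeholder for wherever this repo's files are downloaded, and `use_safetensors` depends on which checkpoint file you grabbed.

```python
from transformers import AutoTokenizer
from auto_gptq import AutoGPTQForCausalLM

model_dir = "path/to/this/repo"  # placeholder: local directory with this repo's files

tokenizer = AutoTokenizer.from_pretrained(model_dir, use_fast=False)
model = AutoGPTQForCausalLM.from_quantized(
    model_dir,
    device="cuda:0",
    use_safetensors=True,  # set to False if the checkpoint is a .pt/.bin file
)

prompt = "你好，请用中文介绍一下你自己。"
inputs = tokenizer(prompt, return_tensors="pt").to("cuda:0")
output = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```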


Text-generation-webui one-click bundle (Chinese guide):
https://www.bilibili.com/read/cv23495183
