
Continued pretraining on Chinese Wikipedia (wikimedia/wikipedia, 20231101.zh):

from datasets import load_dataset

dataset = load_dataset("wikimedia/wikipedia", "20231101.zh", split="train")
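A minimal sketch of turning the raw Wikipedia articles into plain-text training examples for continued pretraining. Appending an end-of-text marker between articles is a common convention, not something this card specifies; the "text" column comes from the wikimedia/wikipedia schema.

```python
# "<|end_of_text|>" is the end-of-text marker used in this card's template below;
# in practice you would take tokenizer.eos_token from the loaded tokenizer.
EOS_TOKEN = "<|end_of_text|>"

def add_eos(examples):
    # Each wikimedia/wikipedia row has a "text" field; terminate every article
    # with EOS so documents stay separated when examples are packed together.
    return {"text": [t + EOS_TOKEN for t in examples["text"]]}

dataset = dataset.map(add_eos, batched=True)
```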

followed by supervised fine-tuning (SFT) on FreedomIntelligence/alpaca-gpt4-zh:

alpaca_dataset = load_dataset("FreedomIntelligence/alpaca-gpt4-zh", split="train")
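For SFT, each example is rendered into the Chinese Alpaca-style prompt that the template section below uses. A minimal sketch; the exact column names of alpaca-gpt4-zh are not given on this card, so map your own instruction/response columns through this function.

```python
EOS_TOKEN = "<|end_of_text|>"  # matches the end-of-text marker in the template below

PROMPT = """以下是描述一项任务的指令。请输出合适的内容回应指令。

### 指令:
{instruction}

### 回应:
{response}"""

def format_example(instruction, response):
    # Render one instruction/response pair into the Chinese Alpaca-style prompt,
    # ending with EOS so the model learns where a response stops.
    return PROMPT.format(instruction=instruction, response=response) + EOS_TOKEN

# Map the dataset's instruction/response columns through format_example to build
# a "text" column for the trainer (column names depend on the dataset schema).
```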

Unsloth was used to fine-tune the model.
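A minimal sketch of the Unsloth + TRL workflow the card implies. The base model name, LoRA settings, and training hyperparameters below are illustrative assumptions, not values from this card, and the SFTTrainer arguments follow the older TRL signature used in Unsloth's notebooks.

```python
from unsloth import FastLanguageModel
from trl import SFTTrainer
from transformers import TrainingArguments

max_seq_length = 2048

# Load the base model in 4-bit and attach LoRA adapters.
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="unsloth/Llama-3.2-3B",   # assumption: 3B base checkpoint
    max_seq_length=max_seq_length,
    load_in_4bit=True,
)
model = FastLanguageModel.get_peft_model(
    model,
    r=16,
    lora_alpha=16,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj",
                    "gate_proj", "up_proj", "down_proj"],
)

# Train on a dataset that already has a "text" column (the Wikipedia corpus for
# continued pretraining, or the formatted alpaca-gpt4-zh examples for SFT).
trainer = SFTTrainer(
    model=model,
    tokenizer=tokenizer,
    train_dataset=dataset,
    dataset_text_field="text",
    max_seq_length=max_seq_length,
    args=TrainingArguments(
        per_device_train_batch_size=2,
        gradient_accumulation_steps=4,
        learning_rate=2e-4,
        num_train_epochs=1,
        output_dir="outputs",
    ),
)
trainer.train()

# Export GGUF files like the ones mentioned below.
model.save_pretrained_gguf("model", tokenizer, quantization_method="q4_k_m")
```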

Use the model-unsloth.gguf or model-unsloth-Q4_K_M.gguf file with llama.cpp or a UI-based system such as GPT4All.
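For example, with the llama-cpp-python bindings (an assumption; the card only names llama.cpp and GPT4All), loading the quantized file and prompting it in the format above looks roughly like this:

```python
from llama_cpp import Llama

llm = Llama(model_path="model-unsloth-Q4_K_M.gguf", n_ctx=2048)

prompt = (
    "以下是描述一项任务的指令。请输出合适的内容回应指令。\n\n"
    "### 指令:\n鱼为什么能在水里呼吸。\n\n"
    "### 回应:\n"
)
out = llm(prompt, max_tokens=256, stop=["<|end_of_text|>"])
print(out["choices"][0]["text"])
```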

Also, you can build an Ollama model with a Modelfile (a minimal Modelfile sketch follows the template below).

Prompt template, shown with two example exchanges (the model responds in Chinese):

以下是描述一项任务的指令。请输出合适的内容回应指令。
### Input:
大象和猎豹的奔跑速度谁更快,简单说明原因.

### Response:
猎豹的奔跑速度比大象快得多。

猎豹:是世界上奔跑速度最快的陆地动物之一,短距离内可以达到约 100-120 公里/小时(约 60-75 英里/小时)。
大象:虽然大象体型巨大,但它们的速度较慢,奔跑速度最高约为 40 公里/小时(约 25 英里/小时)。
因此,猎豹在速度上远远超过了大象。

### Input:
鱼为什么能在水里呼吸。

### Response:
鱼能够在水中呼吸,主要是因为它们有一种特殊的呼吸器官——鳃。鳃能够从水中提取氧气,并排出二氧化碳,这个过程使鱼能够在水中生存。
{{ if .Prompt }}

### 指令:
{{ .Prompt }}

{{ end }}### 回应:
{{ .Response }}<|end_of_text|>
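A minimal Modelfile sketch that wires the template above to the Q4_K_M file from earlier. The file path, model name, and stop parameter are assumptions; adjust them to your setup.

```
FROM ./model-unsloth-Q4_K_M.gguf

TEMPLATE """以下是描述一项任务的指令。请输出合适的内容回应指令。
{{ if .Prompt }}

### 指令:
{{ .Prompt }}

{{ end }}### 回应:
{{ .Response }}<|end_of_text|>"""

PARAMETER stop "<|end_of_text|>"
```

Then `ollama create llama3.2-chinese -f Modelfile` builds the local model and `ollama run llama3.2-chinese` starts a chat with it.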

Or just run the published model directly: ollama run lastmass/llama3.2-chinese

Community Feedback Request

Dear users who have downloaded this model,

I've noticed a significant increase in downloads recently, which I truly appreciate. If possible, I would be grateful if you could share your experience and feedback in our community. Your insights would be invaluable for future improvements.

Thank you for your support!


GGUF files: 3.61B params, llama architecture, provided in 4-bit and 16-bit quantizations.
