Chinese Mixtral series
This collection hosts the Chinese-Mixtral LLMs in full-weight, LoRA, and GGUF formats.
Chinese Mixtral GitHub repository: https://github.com/ymcui/Chinese-Mixtral
This repository contains Chinese-Mixtral, which is further pre-trained from Mixtral-8x7B-v0.1.
Note: this is a foundation model, which is not suitable for conversation, QA, or other instruction-following use.
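As a reference for base-model usage, here is a minimal text-completion sketch with Hugging Face Transformers. The repo id hfl/chinese-mixtral, the dtype, and the generation settings are illustrative assumptions; check this repository's documentation for the recommended setup.

```python
# Minimal sketch: plain text completion with Chinese-Mixtral (a foundation
# model, so we continue text rather than send chat turns).
# Assumptions: repo id "hfl/chinese-mixtral", bfloat16 weights, GPU sharding.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "hfl/chinese-mixtral"  # assumed full-weight repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # MoE weights are large; use a reduced dtype
    device_map="auto",           # shard across available GPUs
)

prompt = "人工智能的发展历史可以追溯到"  # "The history of AI can be traced back to"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```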
For the LoRA-only model, please see: https://huggingface.co/hfl/chinese-mixtral-lora (see the loading sketch below).
For the GGUF models (llama.cpp compatible), please see: https://huggingface.co/hfl/chinese-mixtral-gguf (also sketched below).
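If you download only the LoRA weights, they can in principle be applied on top of the original Mixtral-8x7B-v0.1 with PEFT. This is a sketch assuming a standard PEFT adapter layout; verify against the merging instructions in the chinese-mixtral-lora repository before relying on it.

```python
# Sketch: attaching the Chinese-Mixtral LoRA adapter to the original base model
# and optionally merging it into the base weights. Assumes a standard PEFT
# adapter layout; follow the repository's own merging instructions if they differ.
import torch
from peft import PeftModel
from transformers import AutoModelForCausalLM

base = AutoModelForCausalLM.from_pretrained(
    "mistralai/Mixtral-8x7B-v0.1",  # original base model
    torch_dtype=torch.bfloat16,
    device_map="auto",
)
model = PeftModel.from_pretrained(base, "hfl/chinese-mixtral-lora")
model = model.merge_and_unload()  # optional: fold LoRA deltas into base weights
```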
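For the GGUF files, any llama.cpp-compatible runtime works; the sketch below uses the llama-cpp-python bindings. The quantization filename is a placeholder; pick an actual file from the chinese-mixtral-gguf repository.

```python
# Sketch: running a GGUF quantization of Chinese-Mixtral via llama-cpp-python.
from llama_cpp import Llama

llm = Llama(
    model_path="chinese-mixtral.Q4_K_M.gguf",  # placeholder filename
    n_ctx=4096,       # context window size
    n_gpu_layers=-1,  # offload all layers to GPU if one is available
)
out = llm("人工智能的发展历史可以追溯到", max_tokens=128)
print(out["choices"][0]["text"])
```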
If you have questions or issues regarding this model, please open an issue at https://github.com/ymcui/Chinese-Mixtral/.
Please consider citing our paper if you use the resources from this repository. Paper link: https://arxiv.org/abs/2403.01851
@article{chinese-mixtral,
  title={Rethinking LLM Language Adaptation: A Case Study on Chinese Mixtral},
  author={Cui, Yiming and Yao, Xin},
  journal={arXiv preprint arXiv:2403.01851},
  url={https://arxiv.org/abs/2403.01851},
  year={2024}
}