Chinese-Mixtral

Chinese Mixtral GitHub repository: https://github.com/ymcui/Chinese-Mixtral

This repository contains Chinese-Mixtral, a model obtained by further pre-training Mixtral-8x7B-v0.1 on Chinese data.

Note: this is a foundation model, which is not suitable for conversation, QA, or other instruction-following tasks.
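
Because it is a base (text-completion) model, it should be given a prefix to continue rather than an instruction. Below is a minimal usage sketch with the Hugging Face transformers library; the model ID hfl/chinese-mixtral is taken from this card, while the dtype and device settings are assumptions you may need to adjust for your hardware.

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "hfl/chinese-mixtral"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # weights are published in BF16/FP16
    device_map="auto",           # shard the MoE layers across available GPUs
)

# A foundation model continues text, so supply a prompt to complete,
# not a question or an instruction.
prompt = "人工智能的发展历史可以追溯到"  # "The history of AI can be traced back to"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=True, top_p=0.9)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))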

Others

Citation

Please consider citing our paper if you use the resources of this repository. Paper link: https://arxiv.org/abs/2403.01851

@article{chinese-mixtral,
  title={Rethinking LLM Language Adaptation: A Case Study on Chinese Mixtral},
  author={Cui, Yiming and Yao, Xin},
  journal={arXiv preprint arXiv:2403.01851},
  url={https://arxiv.org/abs/2403.01851},
  year={2024}
}
Model information

Model ID: hfl/chinese-mixtral
Model size: 46.7B parameters
Tensor types: BF16, FP16 (Safetensors)
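
At 46.7B parameters, the full BF16 weights require roughly 90 GB of memory. As a rough sketch (assuming the bitsandbytes and accelerate packages are installed), the model can instead be loaded with 4-bit quantization to fit on a single large GPU:

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "hfl/chinese-mixtral"

# Assumption: 4-bit NF4 quantization via bitsandbytes; adjust for your hardware.
quant_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,  # store weights in 4-bit, compute in BF16
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=quant_config,
    device_map="auto",
)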