A repo with scripts to create your own MoE models using Apple MLX, constantly updated by @mzbac, is here: https://github.com/mzbac/mlx-moe
It's an amazing resource for learning the inner workings of LoRA on MoE with MLX.
It uses https://huggingface.co/datasets/WizardLM/WizardLM_evol_instruct_70k as default dataset, but can be easily tweak to use any model or dataset fro HF
Have fun with it!