Update README.md
README.md
CHANGED
Mixnueza-6x32M-MoE is a Mixture of Experts (MoE) made with the following models using [LazyMergekit](https://colab.research.google.com/drive/1obulZ1ROXHjYLn6PPZJwRR6GzgQogxxb?usp=sharing):

* 6 x [Felladrin/Minueza-32M-Chat](https://huggingface.co/Felladrin/Minueza-32M-Chat)
* Num Experts Per Token: 3 (see the configuration sketch after this list)
* [Evaluation Results](https://huggingface.co/datasets/open-llm-leaderboard/details_Isotonic__Mixnueza-Chat-6x32M-MoE)
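
LazyMergekit is a Colab wrapper around mergekit's `mergekit-moe` command, which builds the MoE from a YAML config. The actual config for this merge is not included in this diff, so the block below is only a hypothetical reconstruction from the facts above (six copies of Minueza-32M-Chat, three experts per token); the `positive_prompts` routing hints in particular are placeholders.

```yaml
# Hypothetical mergekit-moe config reconstructed from the bullets above;
# the actual config used for Mixnueza-6x32M-MoE may differ.
base_model: Felladrin/Minueza-32M-Chat
gate_mode: hidden        # initialize the router from hidden-state representations of the prompts
experts_per_token: 3     # "Num Experts Per Token: 3" from the model card
dtype: float16
experts:
  - source_model: Felladrin/Minueza-32M-Chat
    positive_prompts: ["chat"]        # placeholder routing prompt
  - source_model: Felladrin/Minueza-32M-Chat
    positive_prompts: ["reasoning"]   # placeholder
  - source_model: Felladrin/Minueza-32M-Chat
    positive_prompts: ["math"]        # placeholder
  - source_model: Felladrin/Minueza-32M-Chat
    positive_prompts: ["code"]        # placeholder
  - source_model: Felladrin/Minueza-32M-Chat
    positive_prompts: ["writing"]     # placeholder
  - source_model: Felladrin/Minueza-32M-Chat
    positive_prompts: ["general"]     # placeholder
```

With `gate_mode: hidden`, mergekit initializes the router weights from hidden-state representations of each expert's positive prompts, which is how six identical source models can still receive distinct routing directions.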
## 💻 Usage
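
The body of the Usage section falls outside this diff hunk. The snippet below is a minimal sketch of the 🤗 Transformers loading code these LazyMergekit cards usually ship with; the repo id `Isotonic/Mixnueza-Chat-6x32M-MoE` is inferred from the evaluation-results link above, and the sampling parameters are illustrative.

```python
# Minimal usage sketch (assumed, not taken from the original card):
# load the merged MoE and chat with it via a text-generation pipeline.
import torch
from transformers import AutoTokenizer, pipeline

model_id = "Isotonic/Mixnueza-Chat-6x32M-MoE"  # inferred from the leaderboard link

tokenizer = AutoTokenizer.from_pretrained(model_id)
chat = pipeline(
    "text-generation",
    model=model_id,
    tokenizer=tokenizer,
    torch_dtype=torch.float16,
)

# Format the conversation with the model's chat template before generating.
messages = [
    {"role": "user", "content": "Explain what a Mixture of Experts model is."},
]
prompt = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)
outputs = chat(
    prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_p=0.95
)
print(outputs[0]["generated_text"])
```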