Kquant03 committed
Commit 1e192fe
1 Parent(s): fd413a7

Update README.md

Files changed (1)
  1. README.md +2 -1
README.md CHANGED
@@ -10,6 +10,8 @@ tags:
 # "[We] are joined by the bonds of love. And you cannot track that, not with a thousand bloodhounds, and you cannot break it, not with a thousand swords."
 [GGUF FILES HERE](https://huggingface.co/Kquant03/Buttercup-4x7B-GGUF)
 
+[Join our Discord!](https://discord.gg/CAfWPV82)
+
 A frankenMoE not only using far better methodology and fundamental understanding of SMoE, but completely focused around intellectual roleplay. It may have a bit of a redundancy issue (I have actually been playing with it while GGUF uploads on q8_k and it has nice variety). However, just in case, to battle this, try to keep things fresh with the model by either introducing new concepts often, or through [drμgs](https://github.com/EGjoni/DRUGS). (no not that kind)
 
 The config looks like this...(detailed version is in the files and versions):
@@ -19,7 +21,6 @@ The config looks like this...(detailed version is in the files and versions):
 - [mlabonne/Beagle14-7B](https://huggingface.co/mlabonne/Beagle14-7B) - expert #3
 - [mlabonne/Beagle14-7B](https://huggingface.co/mlabonne/Beagle14-7B) - expert #4
 
-[Join our Discord!](https://discord.gg/CAfWPV82)
 # "[What is a Mixture of Experts (MoE)?](https://huggingface.co/blog/moe)"
 ### (from the MistralAI papers...click the quoted question above to navigate to it directly.)
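For anyone grabbing the linked GGUF files, here is a minimal loading sketch using llama-cpp-python and huggingface_hub. The quant filename, context size, prompt format, and sampling settings below are illustrative assumptions, not values taken from this commit; check the GGUF repo for the actual file names.

```python
# Minimal sketch: run a Buttercup-4x7B GGUF quant locally with llama-cpp-python.
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

# Hypothetical filename -- list the files in Kquant03/Buttercup-4x7B-GGUF
# and substitute the quant you actually want.
model_path = hf_hub_download(
    repo_id="Kquant03/Buttercup-4x7B-GGUF",
    filename="buttercup-4x7b.Q8_0.gguf",
)

# n_gpu_layers=-1 offloads every layer to the GPU; use 0 for CPU-only inference.
llm = Llama(model_path=model_path, n_ctx=4096, n_gpu_layers=-1)

# A bit of sampling variety (temperature / repeat penalty) is one way to keep
# outputs fresh, in line with the redundancy note in the README text above.
out = llm(
    "You are a thoughtful roleplay partner.\nUser: Introduce yourself.\nAssistant:",
    max_tokens=256,
    temperature=0.9,
    repeat_penalty=1.1,
)
print(out["choices"][0]["text"])
```

The prompt here is a plain completion-style string; if the model expects a specific chat template, swap that in instead.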