Update README.md
README.md CHANGED

@@ -10,7 +10,7 @@ tags:
 
 
 # lobotollama-368b prune of [Meta-Llama-3.1-405B-Base](https://huggingface.co/meta-llama/Meta-Llama-3.1-405B).
-This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
+This is a negative-merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
 
 # Just so you meow, this did not turn out all that great in the perplexity benchmarks. Needs healing; you'll probably need 32xH100 to do a full finetune.
 # Model was designed to fit in an M2 Mac Studio (192 GB) in 4-bit.
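A layer prune like this is usually expressed in mergekit as a `passthrough` merge that keeps only selected layer ranges of the base model. A minimal sketch, assuming the standard mergekit YAML schema; the layer indices below are illustrative, not the ranges actually used for this model:

```yaml
# Hypothetical passthrough-prune config for mergekit.
# Dropping a contiguous block of middle layers shrinks the model;
# the resulting network typically needs "healing" (further finetuning).
slices:
  - sources:
      - model: meta-llama/Meta-Llama-3.1-405B
        layer_range: [0, 60]      # keep early layers (example range)
  - sources:
      - model: meta-llama/Meta-Llama-3.1-405B
        layer_range: [72, 126]    # keep late layers, skipping a middle block
merge_method: passthrough
dtype: bfloat16
```

Such a config would be run with `mergekit-yaml config.yml ./output-dir`; the pruned checkpoint is what then gets quantized (e.g. to 4-bit) to fit the memory budget mentioned above.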