szymonrucinski committed
Commit 4e19440 · Parent(s): 7c8c7e2
Update README.md
README.md
CHANGED
@@ -5,7 +5,7 @@ language:
 ---
 # Model Card for Krakowiak-v2-7b
 
-Krakowiak-v2-7b is a state of the art 7.3 billion parameters LLM based on Mistral-7B. It was finetuned for Polish text generation using custom created large corpus of 100K Polish instructions. It uses novel techniques e.g. LORA, adding noise to the embeddings for
+Krakowiak-v2-7b is a state-of-the-art LLM with 7.3 billion parameters, based on Mistral-7B. It was fine-tuned for Polish text generation on a custom-built corpus of 100K Polish instructions, using techniques such as LoRA and adding noise to the embeddings for greater performance. For full details of this model, please read our [paper to be released soon](www.example.com).
 
 ## Model Architecture
 
@@ -46,6 +46,8 @@ pipe = pipeline("text-generation", model="szymonrucinski/krakowiak-v2-7b")
 pipe("<s>[INST] Też lubisz jeździć na rowerze? [/INST]")
 ```
 
+## Demo
+You can play with Krakowiak-v2-7b [here](https://huggingface.co/spaces/szymonrucinski/krakowiak). The demo runs 4-bit quantized inference on CPU, which reduces output quality but is more cost-effective. You can run Krakowiak on your own CPU using the quantized version available [here](https://huggingface.co/szymonrucinski/krakowiak-v2-7b-gguf).
 
 ## Krakowiak team
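The `pipe(...)` usage shown in the diff relies on Mistral's `[INST]` instruction template. As a minimal sketch (the helper function below is hypothetical, not part of the model repo), the prompt string can be built like this:

```python
def build_prompt(instruction: str) -> str:
    """Wrap a user instruction in the Mistral-style [INST] template
    shown in the README's usage example (hypothetical helper)."""
    return f"<s>[INST] {instruction} [/INST]"

# Reproduces the prompt from the usage example above.
prompt = build_prompt("Też lubisz jeździć na rowerze?")
print(prompt)  # <s>[INST] Też lubisz jeździć na rowerze? [/INST]
```

The same string would be passed to the quantized GGUF build when running it with a local CPU inference runtime such as llama.cpp.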