Ruciński committed
Commit
111adf5
1 Parent(s): a3c29b2

Update README.md

Files changed (1)
  1. README.md +7 -1
README.md CHANGED
@@ -1,8 +1,14 @@
  ---
  library_name: peft
  ---
- ## Training procedure
+ ## Introduction
+ Krakowiak-7B is a fine-tuned version of Meta's [Llama2](https://huggingface.co/meta-llama/Llama-2-7b-chat-hf). It was trained on a modified and updated version of the dataset originally created by [Chris Ociepa](https://huggingface.co/datasets/szymonindy/ociepa-raw-self-generated-instructions-pl),
+ containing ~50K Polish instructions, which makes it one of the best and largest LLMs available for the Polish language.
+ The name [krakowiak](https://www.youtube.com/watch?v=OeQ6jYzt6cM) refers to one of the most popular and characteristic Polish folk dances, with its very lively, even wild, tempo and its long, easy strides, showing spirited abandon and elegance at the same time.

+ ## How to test it?
+ The model can be run using the Hugging Face libraries or in the browser via this [Google Colab](https://colab.research.google.com/drive/1IM7j57g9ZHj-Pw2EXGyacNuKHjvK3pIc?usp=sharing).
+ ## Training procedure

  The following `bitsandbytes` quantization config was used during training:
  - load_in_8bit: False
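
For reference, a minimal sketch of what "running the model with the Hugging Face libraries" could look like, assuming the weights are published as a PEFT adapter on top of the Llama2 base model named in the README. This is not part of the commit; the adapter repo id below is a placeholder, and the linked Colab remains the documented way to try the model.

```python
# Hypothetical usage sketch: load the Llama2 base model and attach the PEFT adapter.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "meta-llama/Llama-2-7b-chat-hf"    # base model referenced in the README
adapter_id = "<krakowiak-7b-adapter-repo>"   # placeholder: actual adapter repo id not given here

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForCausalLM.from_pretrained(base_id, device_map="auto")
model = PeftModel.from_pretrained(base_model, adapter_id)

# Example Polish instruction, in line with the ~50K-instruction training data.
prompt = "Napisz krótki wiersz o Krakowie."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=200)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```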