jisx committed
Commit 35b80f1 (1 parent: 9443028)

Upload README.md with huggingface_hub

Files changed (1):
  1. README.md +44 -0
README.md ADDED

---
library_name: transformers
pipeline_tag: text-generation
language:
- multilingual
tags:
- generation
- question answering
- instruction tuning
datasets:
- MBZUAI/Bactrian-X
license: cc-by-nc-4.0
---

### Model Description

This HF repository hosts an instruction fine-tuned multilingual BLOOM model trained on the parallel instruction dataset Bactrian-X, which covers 52 languages.
We progressively add one language at a time during instruction fine-tuning and train 52 models in total. We then evaluate those models on three multilingual benchmarks.

Please refer to our paper for more details.

#### Instruction tuning details
* Base model: [BLOOM 7B1](https://huggingface.co/bigscience/bloom-7b1)
* Instruction languages: English, Chinese, Afrikaans, Arabic, Azerbaijani, Bengali, Czech, German, Spanish, Estonian, Farsi, Finnish, French, Galician, Gujarati, Hebrew, Hindi, Croatian, Indonesian, Italian, Japanese, Georgian, Kazakh, Khmer, Korean, Lithuanian, Latvian, Macedonian, Malayalam, Mongolian, Marathi, Burmese, Nepali, Dutch, Polish, Pashto, Portuguese, Romanian, Russian, Sinhala, Slovenian, Swedish, Swahili, Tamil, Telugu
* Instruction language codes: en, zh, af, ar, az, bn, cs, de, es, et, fa, fi, fr, gl, gu, he, hi, hr, id, it, ja, ka, kk, km, ko, lt, lv, mk, ml, mn, mr, my, ne, nl, pl, ps, pt, ro, ru, si, sl, sv, sw, ta, te
* Training method: full-parameter fine-tuning.
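
As an illustration of the setup above, the sketch below shows one way the per-language Bactrian-X instruction data could be assembled before full-parameter fine-tuning. It is a minimal sketch rather than the authors' training code: the per-language Hub configurations, the Alpaca-style `instruction`/`input`/`output` fields, and the prompt template are assumptions.

```python
# Illustrative data-preparation sketch; not the training pipeline used for this model.
# Assumptions: MBZUAI/Bactrian-X exposes one configuration per language code, and each
# example carries Alpaca-style "instruction", "input", and "output" fields.
from datasets import load_dataset

language_codes = ["en", "zh", "af"]  # extend with the 45 codes listed above

def to_prompt(example):
    # Alpaca-style prompt template (assumed, not taken from this model card)
    prompt = f"### Instruction:\n{example['instruction']}\n\n"
    if example.get("input"):
        prompt += f"### Input:\n{example['input']}\n\n"
    prompt += f"### Response:\n{example['output']}"
    return {"text": prompt}

train_sets = [
    load_dataset("MBZUAI/Bactrian-X", code, split="train").map(to_prompt)
    for code in language_codes
]
```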

#### Usage
The model checkpoint should be loaded with the `transformers` library.

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("MaLA-LM/lucky52-bloom-7b1-no-45")
model = AutoModelForCausalLM.from_pretrained("MaLA-LM/lucky52-bloom-7b1-no-45")
```
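
After loading, the snippet below shows a minimal way to prompt the model and generate text. The instruction-style prompt and the decoding settings are illustrative assumptions, not taken from this model card.

```python
# Minimal generation sketch; prompt format and decoding settings are illustrative.
prompt = "### Instruction:\nTranslate 'good morning' into Finnish.\n\n### Response:\n"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```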

#### Citation
```
@article{
}
```
