Update README.md
README.md CHANGED
@@ -12,11 +12,11 @@ inference: false
 [<img src="dicta-logo.jpg" width="300px"/>](https://dicta.org.il)
 
 
-#
+# Adapting LLMs to Hebrew: Unveiling DictaLM 2.0 with Enhanced Vocabulary and Instruction Capabilities
 
 The DictaLM-2.0 Large Language Model (LLM) is a pretrained generative text model with 7 billion parameters trained to specialize in Hebrew text.
 
-For full details of this model please read our [release blog post](https://dicta.org.il/dicta-lm).
+For full details of this model please read our [release blog post](https://dicta.org.il/dicta-lm) or the [technical report](https://arxiv.org/abs/2407.07080).
 
 This model contains the GPTQ 4-bit quantized version of the base model [DictaLM-2.0](https://huggingface.co/dicta-il/dictalm2.0).
 
@@ -66,5 +66,13 @@ DictaLM 2.0 is a pretrained base model and therefore does not have any moderatio
 If you use this model, please cite:
 
 ```bibtex
-
+@misc{shmidman2024adaptingllmshebrewunveiling,
+      title={Adapting LLMs to Hebrew: Unveiling DictaLM 2.0 with Enhanced Vocabulary and Instruction Capabilities},
+      author={Shaltiel Shmidman and Avi Shmidman and Amir DN Cohen and Moshe Koppel},
+      year={2024},
+      eprint={2407.07080},
+      archivePrefix={arXiv},
+      primaryClass={cs.CL},
+      url={https://arxiv.org/abs/2407.07080},
+}
 ```
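The README line kept as context above states that this repository holds the GPTQ 4-bit quantized weights of DictaLM-2.0. For orientation, here is a minimal usage sketch, assuming this card's repo id is `dicta-il/dictalm2.0-GPTQ` and that the `optimum` and `auto-gptq` packages are installed alongside `transformers` (which loads GPTQ checkpoints through them); the prompt and generation parameters are illustrative, not taken from the card.

```python
# Minimal sketch: loading a GPTQ 4-bit checkpoint with Hugging Face
# transformers. The repo id and generation settings are assumptions for
# illustration; GPTQ loading requires `optimum` and `auto-gptq` in
# addition to `transformers`.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "dicta-il/dictalm2.0-GPTQ"  # assumed repo id of this card

tokenizer = AutoTokenizer.from_pretrained(model_id)
# transformers detects the GPTQ quantization config stored in the
# checkpoint and loads the 4-bit weights; GPTQ kernels need a CUDA device.
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="cuda")

# DictaLM-2.0 is a pretrained base model (no chat template), so plain
# text completion is the appropriate usage pattern.
prompt = "שלום, קוראים לי"  # "Hello, my name is"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=50, do_sample=True, top_p=0.9)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The point of the 4-bit quantization is memory: a 7-billion-parameter model at roughly half a byte per weight fits in about 4-5 GB of GPU memory, versus roughly 14 GB for the fp16 base checkpoint.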