Jais and Jais-chat: Arabic-Centric Foundation and Instruction-Tuned Open Generative Large Language Models
Abstract
We introduce Jais and Jais-chat, new state-of-the-art Arabic-centric foundation and instruction-tuned open generative large language models (LLMs). The models are based on the GPT-3 decoder-only architecture and are pretrained on a mixture of Arabic and English texts, including source code in various programming languages. With 13 billion parameters, they demonstrate better knowledge and reasoning capabilities in Arabic than any existing open Arabic and multilingual models by a sizable margin, based on extensive evaluation. Moreover, the models are competitive in English compared to English-centric open models of similar size, despite being trained on much less English data. We provide a detailed description of the training, the tuning, the safety alignment, and the evaluation of the models. We release two open versions of the model -- the foundation Jais model, and an instruction-tuned Jais-chat variant -- with the aim of promoting research on Arabic LLMs. Available at https://huggingface.co/inception-mbzuai/jais-13b-chat
Community
Super cool to see multi-lingual LLMs, congrats to all who worked on this!
It's truly impressive to witness the development of multilingual LLMs, particularly for Arabic. Congratulations to everyone involved in this achievement!
This is an automated message from the Librarian Bot. I found the following papers similar to this paper.
The following papers were recommended by the Semantic Scholar API
- PersianMind: A Cross-Lingual Persian-English Large Language Model (2024)
- Orion-14B: Open-source Multilingual Large Language Models (2024)
- TURNA: A Turkish Encoder-Decoder Language Model for Enhanced Understanding and Generation (2024)
- On the importance of Data Scale in Pretraining Arabic Language Models (2024)
- Turning English-centric LLMs Into Polyglots: How Much Multilinguality Is Needed? (2023)