---
license: apache-2.0
language:
- ar
tags:
- ArabianGPT
widget:
- text: أعلنت وزارة الحج في المملكة العربية السعودية
  example_title: مثال ١
- text: يبدو اليوم جميلا، سأقوم بتحضير
  example_title: مثال ٢
- text: إن التقنيات الحديثة
  example_title: مثال ٣
library_name: transformers
pipeline_tag: text-generation
---

# ArabianGPT 1.5B Model Overview

## Disclaimer for the Use of Large Language Models (LLMs) for Text Generation
We disclaim all responsibility for any harm, inaccuracies, or inappropriate content generated by ArabianGPT-1.5B; users engage with and apply the model's outputs at their own risk.
> **Important Note:** Currently, we offer a raw pre-trained model. Our team is actively working on releasing instruction-based LLMs that are fine-tuned and aligned with RLHF. The first set of pre-trained models has been made available for community exploration. While we do have models fine-tuned for specific tasks such as summarization and sentiment analysis, they are still in the development phase.

## How Can You Use This Pre-Trained Model?

You are invited to use this pre-trained, native Arabic language model as an experimental tool: to assess its capabilities, aid in its fine-tuning, and evaluate its performance across a variety of downstream tasks. We encourage you to review our technical report for a comprehensive understanding of the model's performance metrics and the specific downstream tasks it has been tested on. This will provide valuable insight into its applicability and effectiveness across diverse applications.

## Introduction

ArabianGPT-1.5B, part of the ArabianLLM initiatives, is a specialized GPT model optimized for the Arabic language. Developed at Prince Sultan University's Robotics and Internet of Things Lab, this model is a significant advancement in natural language modeling and generation for Arabic, addressing the language's unique challenges.

## Key Features

- **Architecture**: GPT-2
- **Model Size**: 1.558 billion parameters
- **Layers**: 48
- **Model Attention Layers (MAL)**: 25
- **Context Window Size**: 1024 tokens

## Training

- **Dataset**: over 30 billion tokens from a web-scraped Arabic corpus
- **Tokenizer**: Aranizer 64K
- **Hardware**: 6 NVIDIA A100 GPUs

## Role in ArabianLLM Initiatives

ArabianGPT-1.5B is crucial for advancing Arabic language processing, addressing challenges unique to Arabic morphology and dialects.

## Usage

Suitable for Arabic text-generation tasks. Example usage with the Transformers pipeline (a lower-level loading sketch appears at the end of this card):

```python
from transformers import pipeline

# Cap max_new_tokens below the model's 1024-token context window
# so that prompt + generated tokens stay within it.
pipe = pipeline("text-generation", model="riotu-lab/ArabianGPT-1.5B", max_new_tokens=512)

# Prompt taken from the widget examples above.
text = "أعلنت وزارة الحج في المملكة العربية السعودية"
print(pipe(text))
```

## Limitations and Ethical Considerations

- The model may have limitations in context understanding or text generation in certain scenarios.
- We emphasize ethical use to prevent the propagation of misinformation or harmful content.

## Acknowledgments

Special thanks to Prince Sultan University, particularly the Robotics and Internet of Things Lab.

## Contact Information

For inquiries: [riotu@psu.edu.sa](mailto:riotu@psu.edu.sa).
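## Example: Loading the Model Directly

As a complement to the pipeline example in the Usage section, the minimal sketch below loads the model with `AutoTokenizer` and `AutoModelForCausalLM` for finer control over decoding. This is a sketch under stated assumptions rather than an official recipe: the sampling settings (`temperature`, `top_p`, `max_new_tokens`) are illustrative choices, and `pad_token_id=tokenizer.eos_token_id` follows the usual GPT-2 convention, which the Aranizer 64K tokenizer may or may not share.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "riotu-lab/ArabianGPT-1.5B"

# Load the Aranizer-based tokenizer and the GPT-2-style weights.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)
model.eval()

# Prompt reused from the widget examples above.
prompt = "إن التقنيات الحديثة"
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    output_ids = model.generate(
        **inputs,
        max_new_tokens=128,   # keep prompt + output within the 1024-token context
        do_sample=True,       # sampling settings below are illustrative assumptions
        temperature=0.7,
        top_p=0.9,
        pad_token_id=tokenizer.eos_token_id,  # GPT-2 convention; may differ here
    )

print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

Because this is a raw pre-trained model rather than an instruction-tuned one, expect the output to be a plausible continuation of the prompt, not an answer to it.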