Jacaranda committed on
Commit
8bcd68c
1 Parent(s): 4bea6af

Update README.md

Files changed (1)
  1. README.md +8 -8
README.md CHANGED
@@ -14,7 +14,7 @@ pipeline_tag: question-answering
 
 
 ## Model Details
-UlizaLlama7b-1 is a language model that builds upon the foundation of [Jacaranda/kiswallama-pretrained7B](https://huggingface.co/Jacaranda/kiswallama-pretrained). Jacaranda/kiswallama-pretrained is a large language model continually pretrained on 321,530,045 Swahili tokens with a customized tokenizer whose Swahili vocabulary of 20,000 tokens extends the capabilities of [Meta/Llama2](https://huggingface.co/meta-llama/Llama-2-7b). It offers significant improvements in both encoding and decoding Swahili text, surpassing the Swahili performance of Meta/Llama2. Moreover, Jacaranda/kiswallama-pretrained excels at providing accurate next-word completions in Swahili, a capability Meta/Llama2 falls short of.
+UlizaLlama is a 7B-parameter language model that builds upon the foundation of [Jacaranda/kiswallama-pretrained7B](https://huggingface.co/Jacaranda/kiswallama-pretrained). Jacaranda/kiswallama-pretrained is a large language model continually pretrained on 321,530,045 Swahili tokens with a customized tokenizer whose Swahili vocabulary of 20,000 tokens extends the capabilities of [Meta/Llama2](https://huggingface.co/meta-llama/Llama-2-7b). It offers significant improvements in both encoding and decoding Swahili text, surpassing the Swahili performance of Meta/Llama2. Moreover, Jacaranda/kiswallama-pretrained excels at providing accurate next-word completions in Swahili, a capability Meta/Llama2 falls short of.
 ### Model Description
 - Origin: Adaptation of the Jacaranda/kiswallama-pretrained model.
 - Data: Instructional dataset in Swahili and English consisting of prompt-response pairs.
@@ -25,14 +25,14 @@ pipeline_tag: question-answering
 
 
 - **Developed by:** [Jacaranda Health](https://www.jacarandahealth.org/)
-- **Funded by [optional]:** [Google AI For Social Good Grant]
-- **Model type:** [LlamaModelForCausalLm]
-- **Language(s) (NLP):** [English and Swahili]
-- **License:** [to include]
-- **Model Developers:** [Stanslaus Mwongela, Jay Patel, Sathy Rajasekharan]
-- **Finetuned from model:** [Jacaranda/kiswallama-pretrained model which builds upon Meta/Llama2]
+- **Funded by [optional]:** [Google.org](https://www.google.org/)
+- **Model type:** [LlamaModel](https://huggingface.co/models?other=llama)
+- **Language(s) (NLP):** Swahili and English
+- **License:** [CC BY-NC-SA 4.0 DEED](http://creativecommons.org/licenses/by-nc-sa/4.0/)
+- **Model Developers:** Stanslaus Mwongela, Jay Patel, Sathy Rajasekharan
+- **Finetuned from model:** [Jacaranda/kiswallama-pretrained model](https://huggingface.co/Jacaranda/kiswallama-pretrained) which builds upon [Meta/Llama2](https://huggingface.co/meta-llama/Llama-2-7b)
 ## Uses
-UlizaLlama7b-1 is optimized for downstream tasks, notably those demanding instructional datasets in Swahili, English, or both. Organizations can further fine-tune it for their specific domains. Potential areas include:
+UlizaLlama is optimized for downstream tasks, notably those demanding instructional datasets in Swahili, English, or both. Organizations can further fine-tune it for their specific domains. Potential areas include:
 - Question-answering within specific domains.
 - Assistant-driven chat capabilities: healthcare, agriculture, legal, education, tourism and hospitality, public services, financial sectors, communication, customer assistance, commerce, etc.
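The model card says UlizaLlama was tuned on Swahili/English prompt-response pairs, but this commit does not document the prompt template. A minimal sketch of how one such pair might be rendered into a single text string for fine-tuning or inference — the `### Maagizo:` / `### Jibu:` Alpaca-style headings here are a hypothetical choice, not the model's documented format:

```python
# Hypothetical sketch: the template below is an assumption (Alpaca-style
# headings, with Swahili labels), since the commit does not specify the
# actual prompt format used to train UlizaLlama.

def format_example(prompt: str, response: str = "") -> str:
    """Render one instruction pair; leave `response` empty at inference time."""
    return (
        "### Maagizo:\n"   # "Instruction" in Swahili
        f"{prompt}\n\n"
        "### Jibu:\n"      # "Response" in Swahili
        f"{response}"
    )

# Training-time usage: pair a Swahili question with its reference answer.
text = format_example("Dalili za malaria ni zipi?", "Homa na kichwa kuuma.")
print(text)
```

At inference time, the same function with `response` left empty yields a prompt the model can complete; organizations fine-tuning for their own domain would substitute whatever template their data pipeline actually uses.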