FalconLLM and Phương committed
Commit
ccbff8f
1 Parent(s): 308ced2

Update README.md (#1)


- Update README.md (dce8ecfb89a6e047b999081b0b727d36fb8ae3dc)


Co-authored-by: Phương <[email protected]>

Files changed (1)
  1. README.md +2 -2
README.md CHANGED
@@ -10,7 +10,7 @@ inference: false
 
  **Falcon-7B is a 7B parameters causal decoder-only model built by [TII](https://www.tii.ae) and trained on 1,500B tokens of [RefinedWeb](https://huggingface.co/datasets/tiiuae/falcon-refinedweb) enhanced with curated corpora. It is made available under the [TII Falcon LLM License](https://huggingface.co/tiiuae/falcon-7b/blob/main/LICENSE.txt).**
 
- *Paper coming soon 😊.*
+ *Paper coming soon* 😊.
 
  ## Why use Falcon-7B?
 
@@ -198,7 +198,7 @@ Falcon-7B was trained a custom distributed training codebase, Gigatron. It uses
 
  ## Citation
 
- *Paper coming soon 😊.*
+ *Paper coming soon* 😊.
 
  ## License
 