winglian committed
Commit a27f8df
Parent: b5f6dc2

Upload folder using huggingface_hub

LICENSE ADDED
@@ -0,0 +1,117 @@
+ META LLAMA 3 COMMUNITY LICENSE AGREEMENT
+ Meta Llama 3 Version Release Date: April 18, 2024
+
+ “Agreement” means the terms and conditions for use, reproduction, distribution and modification of the
+ Llama Materials set forth herein.
+
+ “Documentation” means the specifications, manuals and documentation accompanying Meta Llama 3
+ distributed by Meta at https://llama.meta.com/get-started/.
+
+ “Licensee” or “you” means you, or your employer or any other person or entity (if you are entering into
+ this Agreement on such person or entity’s behalf), of the age required under applicable laws, rules or
+ regulations to provide legal consent and that has legal authority to bind your employer or such other
+ person or entity if you are entering in this Agreement on their behalf.
+
+ “Meta Llama 3” means the foundational large language models and software and algorithms, including
+ machine-learning model code, trained model weights, inference-enabling code, training-enabling code,
+ fine-tuning enabling code and other elements of the foregoing distributed by Meta at
+ https://llama.meta.com/llama-downloads.
+
+ “Llama Materials” means, collectively, Meta’s proprietary Meta Llama 3 and Documentation (and any
+ portion thereof) made available under this Agreement.
+
+ “Meta” or “we” means Meta Platforms Ireland Limited (if you are located in or, if you are an entity, your
+ principal place of business is in the EEA or Switzerland) and Meta Platforms, Inc. (if you are located
+ outside of the EEA or Switzerland).
+
+ By clicking “I Accept” below or by using or distributing any portion or element of the Llama Materials,
+ you agree to be bound by this Agreement.
+
+ 1. License Rights and Redistribution.
+
+ a. Grant of Rights. You are granted a non-exclusive, worldwide, non-transferable and royalty-free
+ limited license under Meta’s intellectual property or other rights owned by Meta embodied in the Llama
+ Materials to use, reproduce, distribute, copy, create derivative works of, and make modifications to the
+ Llama Materials.
+
+ b. Redistribution and Use.
+
+ i. If you distribute or make available the Llama Materials (or any derivative works
+ thereof), or a product or service that uses any of them, including another AI model, you shall (A) provide
+ a copy of this Agreement with any such Llama Materials; and (B) prominently display “Built with Meta
+ Llama 3” on a related website, user interface, blogpost, about page, or product documentation. If you
+ use the Llama Materials to create, train, fine tune, or otherwise improve an AI model, which is
+ distributed or made available, you shall also include “Llama 3” at the beginning of any such AI model
+ name.
+
+ ii. If you receive Llama Materials, or any derivative works thereof, from a Licensee as part
+ of an integrated end user product, then Section 2 of this Agreement will not apply to you.
+
+ iii. You must retain in all copies of the Llama Materials that you distribute the following
+ attribution notice within a “Notice” text file distributed as a part of such copies: “Meta Llama 3 is
+ licensed under the Meta Llama 3 Community License, Copyright © Meta Platforms, Inc. All Rights
+ Reserved.”
+
+ iv. Your use of the Llama Materials must comply with applicable laws and regulations
+ (including trade compliance laws and regulations) and adhere to the Acceptable Use Policy for the Llama
+ Materials (available at https://llama.meta.com/llama3/use-policy), which is hereby incorporated by
+ reference into this Agreement.
+
+ v. You will not use the Llama Materials or any output or results of the Llama Materials to
+ improve any other large language model (excluding Meta Llama 3 or derivative works thereof).
+
+ 2. Additional Commercial Terms. If, on the Meta Llama 3 version release date, the monthly active users
+ of the products or services made available by or for Licensee, or Licensee’s affiliates, is greater than 700
+ million monthly active users in the preceding calendar month, you must request a license from Meta,
+ which Meta may grant to you in its sole discretion, and you are not authorized to exercise any of the
+ rights under this Agreement unless or until Meta otherwise expressly grants you such rights.
+
+ 3. Disclaimer of Warranty. UNLESS REQUIRED BY APPLICABLE LAW, THE LLAMA MATERIALS AND ANY
+ OUTPUT AND RESULTS THEREFROM ARE PROVIDED ON AN “AS IS” BASIS, WITHOUT WARRANTIES OF
+ ANY KIND, AND META DISCLAIMS ALL WARRANTIES OF ANY KIND, BOTH EXPRESS AND IMPLIED,
+ INCLUDING, WITHOUT LIMITATION, ANY WARRANTIES OF TITLE, NON-INFRINGEMENT,
+ MERCHANTABILITY, OR FITNESS FOR A PARTICULAR PURPOSE. YOU ARE SOLELY RESPONSIBLE FOR
+ DETERMINING THE APPROPRIATENESS OF USING OR REDISTRIBUTING THE LLAMA MATERIALS AND
+ ASSUME ANY RISKS ASSOCIATED WITH YOUR USE OF THE LLAMA MATERIALS AND ANY OUTPUT AND
+ RESULTS.
+
+ 4. Limitation of Liability. IN NO EVENT WILL META OR ITS AFFILIATES BE LIABLE UNDER ANY THEORY OF
+ LIABILITY, WHETHER IN CONTRACT, TORT, NEGLIGENCE, PRODUCTS LIABILITY, OR OTHERWISE, ARISING
+ OUT OF THIS AGREEMENT, FOR ANY LOST PROFITS OR ANY INDIRECT, SPECIAL, CONSEQUENTIAL,
+ INCIDENTAL, EXEMPLARY OR PUNITIVE DAMAGES, EVEN IF META OR ITS AFFILIATES HAVE BEEN ADVISED
+ OF THE POSSIBILITY OF ANY OF THE FOREGOING.
+
+ 5. Intellectual Property.
+
+ a. No trademark licenses are granted under this Agreement, and in connection with the Llama
+ Materials, neither Meta nor Licensee may use any name or mark owned by or associated with the other
+ or any of its affiliates, except as required for reasonable and customary use in describing and
+ redistributing the Llama Materials or as set forth in this Section 5(a). Meta hereby grants you a license to
+ use “Llama 3” (the “Mark”) solely as required to comply with the last sentence of Section 1.b.i. You will
+ comply with Meta’s brand guidelines (currently accessible at
+ https://about.meta.com/brand/resources/meta/company-brand/ ). All goodwill arising out of your use
+ of the Mark will inure to the benefit of Meta.
+
+ b. Subject to Meta’s ownership of Llama Materials and derivatives made by or for Meta, with
+ respect to any derivative works and modifications of the Llama Materials that are made by you, as
+ between you and Meta, you are and will be the owner of such derivative works and modifications.
+
+ c. If you institute litigation or other proceedings against Meta or any entity (including a
+ cross-claim or counterclaim in a lawsuit) alleging that the Llama Materials or Meta Llama 3 outputs or
+ results, or any portion of any of the foregoing, constitutes infringement of intellectual property or other
+ rights owned or licensable by you, then any licenses granted to you under this Agreement shall
+ terminate as of the date such litigation or claim is filed or instituted. You will indemnify and hold
+ harmless Meta from and against any claim by any third party arising out of or related to your use or
+ distribution of the Llama Materials.
+
+ 6. Term and Termination. The term of this Agreement will commence upon your acceptance of this
+ Agreement or access to the Llama Materials and will continue in full force and effect until terminated in
+ accordance with the terms and conditions herein. Meta may terminate this Agreement if you are in
+ breach of any term or condition of this Agreement. Upon termination of this Agreement, you shall delete
+ and cease use of the Llama Materials. Sections 3, 4 and 7 shall survive the termination of this
+ Agreement.
+
+ 7. Governing Law and Jurisdiction. This Agreement will be governed and construed under the laws of
+ the State of California without regard to choice of law principles, and the UN Convention on Contracts
+ for the International Sale of Goods does not apply to this Agreement. The courts of California shall have
+ exclusive jurisdiction of any dispute arising out of this Agreement.
USE_POLICY.md ADDED
@@ -0,0 +1,53 @@
+ # Meta Llama 3 Acceptable Use Policy
+
+ Meta is committed to promoting safe and fair use of its tools and features, including Meta Llama 3. If you
+ access or use Meta Llama 3, you agree to this Acceptable Use Policy (“Policy”). The most recent copy of
+ this policy can be found at [https://llama.meta.com/llama3/use-policy](https://llama.meta.com/llama3/use-policy)
+
+ ## Prohibited Uses
+
+ We want everyone to use Meta Llama 3 safely and responsibly. You agree you will not use, or allow
+ others to use, Meta Llama 3 to:
+
+ 1. Violate the law or others’ rights, including to:
+     1. Engage in, promote, generate, contribute to, encourage, plan, incite, or further illegal or unlawful activity or content, such as:
+         1. Violence or terrorism
+         2. Exploitation or harm to children, including the solicitation, creation, acquisition, or dissemination of child exploitative content or failure to report Child Sexual Abuse Material
+         3. Human trafficking, exploitation, and sexual violence
+         4. The illegal distribution of information or materials to minors, including obscene materials, or failure to employ legally required age-gating in connection with such information or materials.
+         5. Sexual solicitation
+         6. Any other criminal activity
+     2. Engage in, promote, incite, or facilitate the harassment, abuse, threatening, or bullying of individuals or groups of individuals
+     3. Engage in, promote, incite, or facilitate discrimination or other unlawful or harmful conduct in the provision of employment, employment benefits, credit, housing, other economic benefits, or other essential goods and services
+     4. Engage in the unauthorized or unlicensed practice of any profession including, but not limited to, financial, legal, medical/health, or related professional practices
+     5. Collect, process, disclose, generate, or infer health, demographic, or other sensitive personal or private information about individuals without rights and consents required by applicable laws
+     6. Engage in or facilitate any action or generate any content that infringes, misappropriates, or otherwise violates any third-party rights, including the outputs or results of any products or services using the Llama Materials
+     7. Create, generate, or facilitate the creation of malicious code, malware, computer viruses or do anything else that could disable, overburden, interfere with or impair the proper working, integrity, operation or appearance of a website or computer system
+
+ 2. Engage in, promote, incite, facilitate, or assist in the planning or development of activities that present a risk of death or bodily harm to individuals, including use of Meta Llama 3 related to the following:
+     1. Military, warfare, nuclear industries or applications, espionage, use for materials or activities that are subject to the International Traffic Arms Regulations (ITAR) maintained by the United States Department of State
+     2. Guns and illegal weapons (including weapon development)
+     3. Illegal drugs and regulated/controlled substances
+     4. Operation of critical infrastructure, transportation technologies, or heavy machinery
+     5. Self-harm or harm to others, including suicide, cutting, and eating disorders
+     6. Any content intended to incite or promote violence, abuse, or any infliction of bodily harm to an individual
+
+ 3. Intentionally deceive or mislead others, including use of Meta Llama 3 related to the following:
+     1. Generating, promoting, or furthering fraud or the creation or promotion of disinformation
+     2. Generating, promoting, or furthering defamatory content, including the creation of defamatory statements, images, or other content
+     3. Generating, promoting, or further distributing spam
+     4. Impersonating another individual without consent, authorization, or legal right
+     5. Representing that the use of Meta Llama 3 or outputs are human-generated
+     6. Generating or facilitating false online engagement, including fake reviews and other means of fake online engagement
+
+ 4. Fail to appropriately disclose to end users any known dangers of your AI system
+
+ Please report any violation of this Policy, software “bug,” or other problems that could lead to a violation
+ of this Policy through one of the following means:
+
+ ● Reporting issues with the model: [https://github.com/meta-llama/llama3](https://github.com/meta-llama/llama3)
+ ● Reporting risky content generated by the model:
+ developers.facebook.com/llama_output_feedback
+ ● Reporting bugs and security concerns: facebook.com/whitehat/info
+ ● Reporting violations of the Acceptable Use Policy or unlicensed uses of Meta Llama 3:
adapters/lora-kv/adapter_config.json ADDED
@@ -0,0 +1,29 @@
+ {
+ "alpha_pattern": {},
+ "auto_mapping": null,
+ "base_model_name_or_path": null,
+ "bias": "none",
+ "fan_in_fan_out": false,
+ "inference_mode": false,
+ "init_lora_weights": true,
+ "layer_replication": null,
+ "layers_pattern": null,
+ "layers_to_transform": null,
+ "loftq_config": {},
+ "lora_alpha": 1024,
+ "lora_dropout": 0,
+ "megatron_config": null,
+ "megatron_core": "megatron.core",
+ "modules_to_save": null,
+ "peft_type": "LORA",
+ "r": 1024,
+ "rank_pattern": {},
+ "revision": null,
+ "target_modules": [
+ "v_proj",
+ "k_proj"
+ ],
+ "task_type": "CAUSAL_LM",
+ "use_dora": false,
+ "use_rslora": false
+ }
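
For reference, the JSON above maps one-to-one onto a `peft.LoraConfig`; this is the file `peft` serializes when an adapter is saved. A minimal sketch reconstructing it, with all values copied from the file above:

```python
# Sketch: the LoraConfig equivalent of adapters/lora-kv/adapter_config.json.
from peft import LoraConfig

lora_kv = LoraConfig(
    r=1024,                               # rank of the low-rank update
    lora_alpha=1024,                      # scaling numerator (alpha / r == 1.0)
    lora_dropout=0.0,
    bias="none",
    target_modules=["v_proj", "k_proj"],  # attention key/value projections only
    task_type="CAUSAL_LM",
)
print(lora_kv.to_dict())                  # mirrors the file on disk
```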
adapters/lora-kv/adapter_model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:1cae881cc37c4c4b5d4dd7a02eeed3f0e77590eda46e249bc12225851aff488f
+ size 1342195088
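
The `.safetensors` entries in this commit are Git LFS pointers rather than the weights themselves: three-line stubs recording the spec version, the SHA-256 of the real blob, and its byte size. As a sanity check, the 1,342,195,088 bytes above are roughly what a rank-1024 adapter over the k/v projections of a 32-layer Llama-3-8B occupies in fp32 (the fp32 assumption and the shape arithmetic below are mine, not stated in the repo):

```python
# Sketch: parse the LFS pointer above and sanity-check its size.
pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:1cae881cc37c4c4b5d4dd7a02eeed3f0e77590eda46e249bc12225851aff488f
size 1342195088"""
fields = dict(line.split(" ", 1) for line in pointer.splitlines())
print(fields["oid"], fields["size"])

# Assumed shapes: k_proj/v_proj map 4096 -> 1024 (8 KV heads x head_dim 128).
per_proj = 1024 * 4096 + 1024 * 1024  # lora_A + lora_B parameters at r=1024
total_params = per_proj * 2 * 32      # two projections, 32 layers
print(total_params * 4)               # 1342177280 bytes in fp32, ~ pointer size
# (the remaining ~18 kB would be the safetensors header)
```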
adapters/lora-mlp/adapter_config.json ADDED
@@ -0,0 +1,30 @@
+ {
+ "alpha_pattern": {},
+ "auto_mapping": null,
+ "base_model_name_or_path": null,
+ "bias": "none",
+ "fan_in_fan_out": false,
+ "inference_mode": false,
+ "init_lora_weights": true,
+ "layer_replication": null,
+ "layers_pattern": null,
+ "layers_to_transform": null,
+ "loftq_config": {},
+ "lora_alpha": 4096,
+ "lora_dropout": 0,
+ "megatron_config": null,
+ "megatron_core": "megatron.core",
+ "modules_to_save": null,
+ "peft_type": "LORA",
+ "r": 4096,
+ "rank_pattern": {},
+ "revision": null,
+ "target_modules": [
+ "down_proj",
+ "up_proj",
+ "gate_proj"
+ ],
+ "task_type": "CAUSAL_LM",
+ "use_dora": false,
+ "use_rslora": false
+ }
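
All the adapters here set `lora_alpha` equal to `r` (1024/1024 for kv, 4096/4096 for mlp, 3072/3072 for qo), so the standard LoRA scale `lora_alpha / r` is 1.0 and the low-rank update is added unscaled. A toy sketch of that forward rule, with shapes shrunk for illustration:

```python
# Sketch: LoRA forward rule y = W x + (alpha / r) * B (A x), with alpha == r
# as in the adapter configs above, so the scale is exactly 1.0.
import torch

d_in, d_out, r, alpha = 16, 16, 4, 4
W = torch.randn(d_out, d_in)     # frozen base weight
A = torch.randn(r, d_in)         # lora_A (random init under init_lora_weights)
B = torch.zeros(d_out, r)        # lora_B (zero init: adapter starts as a no-op)
x = torch.randn(d_in)

scale = alpha / r                # 1.0 here
y = W @ x + scale * (B @ (A @ x))
assert torch.allclose(y, W @ x)  # holds only while B is still zero
```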
adapters/lora-mlp/adapter_model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:de571fcb4bd940b0f3db4cb3ca8f295eb74effb5faaba37098f70a7e78fab25a
+ size 28991055896
adapters/lora-qo/adapter_config.json ADDED
@@ -0,0 +1,29 @@
+ {
+ "alpha_pattern": {},
+ "auto_mapping": null,
+ "base_model_name_or_path": null,
+ "bias": "none",
+ "fan_in_fan_out": false,
+ "inference_mode": false,
+ "init_lora_weights": true,
+ "layer_replication": null,
+ "layers_pattern": null,
+ "layers_to_transform": null,
+ "loftq_config": {},
+ "lora_alpha": 3072,
+ "lora_dropout": 0,
+ "megatron_config": null,
+ "megatron_core": "megatron.core",
+ "modules_to_save": null,
+ "peft_type": "LORA",
+ "r": 3072,
+ "rank_pattern": {},
+ "revision": null,
+ "target_modules": [
+ "q_proj",
+ "o_proj"
+ ],
+ "task_type": "CAUSAL_LM",
+ "use_dora": false,
+ "use_rslora": false
+ }
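
Taken together, the three adapter folders cover every linear projection in the transformer block: `k_proj`/`v_proj` at r=1024, `q_proj`/`o_proj` at r=3072, and the MLP's `gate_proj`/`up_proj`/`down_proj` at r=4096. One way to apply them is to fold each into the base weights in turn; a sketch, where the base checkpoint is an assumption based on `_name_or_path` in config.json below, and the repo does not prescribe a loading order:

```python
# Sketch: merge the three adapters into the base model sequentially.
import torch
from peft import PeftModel
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "meta-llama/Meta-Llama-3-8B", torch_dtype=torch.bfloat16
)
for adapter_dir in ("adapters/lora-kv", "adapters/lora-qo", "adapters/lora-mlp"):
    model = PeftModel.from_pretrained(model, adapter_dir)
    model = model.merge_and_unload()  # bake this adapter into the weights
```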
adapters/lora-qo/adapter_model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:87e76d03ca4ff981b5682045d98df83512fbf86393bc14c4ce2a7321c7f4f09e
+ size 6442468912
config.json ADDED
@@ -0,0 +1,28 @@
+ {
+ "_name_or_path": "meta-llama/Meta-Llama-3-8B",
+ "architectures": [
+ "LlamaForCausalLM"
+ ],
+ "attention_bias": false,
+ "attention_dropout": 0.0,
+ "bos_token_id": 128000,
+ "eos_token_id": 128001,
+ "hidden_act": "silu",
+ "hidden_size": 4096,
+ "initializer_range": 0.02,
+ "intermediate_size": 14336,
+ "max_position_embeddings": 1048576,
+ "model_type": "llama",
+ "num_attention_heads": 32,
+ "num_hidden_layers": 32,
+ "num_key_value_heads": 8,
+ "pretraining_tp": 1,
+ "rms_norm_eps": 1e-05,
+ "rope_scaling": null,
+ "rope_theta": 2804339835.0,
+ "tie_word_embeddings": false,
+ "torch_dtype": "bfloat16",
+ "transformers_version": "4.40.1",
+ "use_cache": true,
+ "vocab_size": 128256
+ }
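
Two values distinguish this config from stock Meta-Llama-3-8B: `max_position_embeddings` is 1,048,576 (a 1M-token context window) and `rope_theta` is raised to roughly 2.8e9 from the stock 500,000, which stretches the RoPE wavelengths to cover the longer range. The settings can be checked without downloading any weights; a sketch, where `REPO_ID` is a placeholder for wherever this folder is hosted:

```python
# Sketch: inspect the long-context settings from the config alone.
# REPO_ID is hypothetical -- substitute the actual Hub repo for this upload.
from transformers import AutoConfig

cfg = AutoConfig.from_pretrained("REPO_ID")
print(cfg.max_position_embeddings)  # 1048576
print(cfg.rope_theta)               # 2804339835.0
```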
generation_config.json ADDED
@@ -0,0 +1,9 @@
+ {
+ "bos_token_id": 128000,
+ "do_sample": true,
+ "eos_token_id": 128001,
+ "max_length": 4096,
+ "temperature": 0.6,
+ "top_p": 0.9,
+ "transformers_version": "4.40.1"
+ }
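
`model.generate()` picks these defaults up automatically; spelled out, the equivalent explicit call looks like the sketch below (`REPO_ID` and the prompt are placeholders):

```python
# Sketch: sampling with the defaults from generation_config.json.
# REPO_ID is hypothetical -- substitute the actual Hub repo for this upload.
from transformers import AutoModelForCausalLM, AutoTokenizer

tok = AutoTokenizer.from_pretrained("REPO_ID")
model = AutoModelForCausalLM.from_pretrained("REPO_ID", device_map="auto")

inputs = tok("The capital of France is", return_tensors="pt").to(model.device)
out = model.generate(**inputs, do_sample=True, temperature=0.6, top_p=0.9,
                     max_length=4096)  # values mirror the JSON above
print(tok.decode(out[0], skip_special_tokens=True))
```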
model-00001-of-00007.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:3e45bb81cc3470535e404f70de2ce57b9546226f297c74fc2e756d8c7d0d6060
+ size 4959838400
model-00002-of-00007.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:72106403f5fe175abcda6328a2018ee0f337ad01a4e41dd5111eccbea4f2fdca
+ size 4982941688
model-00003-of-00007.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:1aebfb384603cbbb99a16ef74df85766d36b71a13370225123a7b76dca4915f9
+ size 4949371064
model-00004-of-00007.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:98b91a9eb0b942de4b114f0bd13e149ed43af9271665e436c78885c2416f9117
+ size 4915833128
model-00005-of-00007.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:f217ab2434e9eeaf14035bafb1b0384cc69e8e1d6912e8eb69fe8db26e7f9e75
+ size 4982925152
model-00006-of-00007.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:a46870467cb854e200fe225872517cd19d8709aa47934e38695354bd2d5163c4
+ size 4714514216
model-00007-of-00007.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:7c5394fe848f6eafef2a44f085e7b72e5ef0d315f98aa87b3827c2b1e6983b89
+ size 1050673280
model.safetensors.index.json ADDED
@@ -0,0 +1,490 @@
1
+ {
2
+ "metadata": {
3
+ "total_size": 30556037120
4
+ },
5
+ "weight_map": {
6
+ "lm_head.weight": "model-00007-of-00007.safetensors",
7
+ "model.embed_tokens.weight": "model-00001-of-00007.safetensors",
8
+ "model.layers.0.input_layernorm.weight": "model-00001-of-00007.safetensors",
9
+ "model.layers.0.mlp.down_proj.base_layer.weight": "model-00001-of-00007.safetensors",
10
+ "model.layers.0.mlp.down_proj.lora_A.default.weight": "model-00001-of-00007.safetensors",
11
+ "model.layers.0.mlp.down_proj.lora_B.default.weight": "model-00001-of-00007.safetensors",
12
+ "model.layers.0.mlp.gate_proj.base_layer.weight": "model-00001-of-00007.safetensors",
13
+ "model.layers.0.mlp.gate_proj.lora_A.default.weight": "model-00001-of-00007.safetensors",
14
+ "model.layers.0.mlp.gate_proj.lora_B.default.weight": "model-00001-of-00007.safetensors",
15
+ "model.layers.0.mlp.up_proj.base_layer.weight": "model-00001-of-00007.safetensors",
16
+ "model.layers.0.mlp.up_proj.lora_A.default.weight": "model-00001-of-00007.safetensors",
17
+ "model.layers.0.mlp.up_proj.lora_B.default.weight": "model-00001-of-00007.safetensors",
18
+ "model.layers.0.post_attention_layernorm.weight": "model-00001-of-00007.safetensors",
19
+ "model.layers.0.self_attn.k_proj.weight": "model-00001-of-00007.safetensors",
20
+ "model.layers.0.self_attn.o_proj.weight": "model-00001-of-00007.safetensors",
21
+ "model.layers.0.self_attn.q_proj.weight": "model-00001-of-00007.safetensors",
22
+ "model.layers.0.self_attn.v_proj.weight": "model-00001-of-00007.safetensors",
23
+ "model.layers.1.input_layernorm.weight": "model-00001-of-00007.safetensors",
24
+ "model.layers.1.mlp.down_proj.base_layer.weight": "model-00001-of-00007.safetensors",
25
+ "model.layers.1.mlp.down_proj.lora_A.default.weight": "model-00001-of-00007.safetensors",
26
+ "model.layers.1.mlp.down_proj.lora_B.default.weight": "model-00001-of-00007.safetensors",
27
+ "model.layers.1.mlp.gate_proj.base_layer.weight": "model-00001-of-00007.safetensors",
28
+ "model.layers.1.mlp.gate_proj.lora_A.default.weight": "model-00001-of-00007.safetensors",
29
+ "model.layers.1.mlp.gate_proj.lora_B.default.weight": "model-00001-of-00007.safetensors",
30
+ "model.layers.1.mlp.up_proj.base_layer.weight": "model-00001-of-00007.safetensors",
31
+ "model.layers.1.mlp.up_proj.lora_A.default.weight": "model-00001-of-00007.safetensors",
32
+ "model.layers.1.mlp.up_proj.lora_B.default.weight": "model-00001-of-00007.safetensors",
33
+ "model.layers.1.post_attention_layernorm.weight": "model-00001-of-00007.safetensors",
34
+ "model.layers.1.self_attn.k_proj.weight": "model-00001-of-00007.safetensors",
35
+ "model.layers.1.self_attn.o_proj.weight": "model-00001-of-00007.safetensors",
36
+ "model.layers.1.self_attn.q_proj.weight": "model-00001-of-00007.safetensors",
37
+ "model.layers.1.self_attn.v_proj.weight": "model-00001-of-00007.safetensors",
38
+ "model.layers.10.input_layernorm.weight": "model-00003-of-00007.safetensors",
39
+ "model.layers.10.mlp.down_proj.base_layer.weight": "model-00003-of-00007.safetensors",
40
+ "model.layers.10.mlp.down_proj.lora_A.default.weight": "model-00003-of-00007.safetensors",
41
+ "model.layers.10.mlp.down_proj.lora_B.default.weight": "model-00003-of-00007.safetensors",
42
+ "model.layers.10.mlp.gate_proj.base_layer.weight": "model-00003-of-00007.safetensors",
43
+ "model.layers.10.mlp.gate_proj.lora_A.default.weight": "model-00003-of-00007.safetensors",
44
+ "model.layers.10.mlp.gate_proj.lora_B.default.weight": "model-00003-of-00007.safetensors",
45
+ "model.layers.10.mlp.up_proj.base_layer.weight": "model-00003-of-00007.safetensors",
46
+ "model.layers.10.mlp.up_proj.lora_A.default.weight": "model-00003-of-00007.safetensors",
47
+ "model.layers.10.mlp.up_proj.lora_B.default.weight": "model-00003-of-00007.safetensors",
48
+ "model.layers.10.post_attention_layernorm.weight": "model-00003-of-00007.safetensors",
49
+ "model.layers.10.self_attn.k_proj.weight": "model-00003-of-00007.safetensors",
50
+ "model.layers.10.self_attn.o_proj.weight": "model-00003-of-00007.safetensors",
51
+ "model.layers.10.self_attn.q_proj.weight": "model-00003-of-00007.safetensors",
52
+ "model.layers.10.self_attn.v_proj.weight": "model-00003-of-00007.safetensors",
53
+ "model.layers.11.input_layernorm.weight": "model-00003-of-00007.safetensors",
54
+ "model.layers.11.mlp.down_proj.base_layer.weight": "model-00003-of-00007.safetensors",
55
+ "model.layers.11.mlp.down_proj.lora_A.default.weight": "model-00003-of-00007.safetensors",
56
+ "model.layers.11.mlp.down_proj.lora_B.default.weight": "model-00003-of-00007.safetensors",
57
+ "model.layers.11.mlp.gate_proj.base_layer.weight": "model-00003-of-00007.safetensors",
58
+ "model.layers.11.mlp.gate_proj.lora_A.default.weight": "model-00003-of-00007.safetensors",
59
+ "model.layers.11.mlp.gate_proj.lora_B.default.weight": "model-00003-of-00007.safetensors",
60
+ "model.layers.11.mlp.up_proj.base_layer.weight": "model-00003-of-00007.safetensors",
61
+ "model.layers.11.mlp.up_proj.lora_A.default.weight": "model-00003-of-00007.safetensors",
62
+ "model.layers.11.mlp.up_proj.lora_B.default.weight": "model-00003-of-00007.safetensors",
63
+ "model.layers.11.post_attention_layernorm.weight": "model-00003-of-00007.safetensors",
64
+ "model.layers.11.self_attn.k_proj.weight": "model-00003-of-00007.safetensors",
65
+ "model.layers.11.self_attn.o_proj.weight": "model-00003-of-00007.safetensors",
66
+ "model.layers.11.self_attn.q_proj.weight": "model-00003-of-00007.safetensors",
67
+ "model.layers.11.self_attn.v_proj.weight": "model-00003-of-00007.safetensors",
68
+ "model.layers.12.input_layernorm.weight": "model-00003-of-00007.safetensors",
69
+ "model.layers.12.mlp.down_proj.base_layer.weight": "model-00003-of-00007.safetensors",
70
+ "model.layers.12.mlp.down_proj.lora_A.default.weight": "model-00003-of-00007.safetensors",
71
+ "model.layers.12.mlp.down_proj.lora_B.default.weight": "model-00003-of-00007.safetensors",
72
+ "model.layers.12.mlp.gate_proj.base_layer.weight": "model-00003-of-00007.safetensors",
73
+ "model.layers.12.mlp.gate_proj.lora_A.default.weight": "model-00003-of-00007.safetensors",
74
+ "model.layers.12.mlp.gate_proj.lora_B.default.weight": "model-00003-of-00007.safetensors",
75
+ "model.layers.12.mlp.up_proj.base_layer.weight": "model-00003-of-00007.safetensors",
76
+ "model.layers.12.mlp.up_proj.lora_A.default.weight": "model-00003-of-00007.safetensors",
77
+ "model.layers.12.mlp.up_proj.lora_B.default.weight": "model-00003-of-00007.safetensors",
78
+ "model.layers.12.post_attention_layernorm.weight": "model-00003-of-00007.safetensors",
79
+ "model.layers.12.self_attn.k_proj.weight": "model-00003-of-00007.safetensors",
80
+ "model.layers.12.self_attn.o_proj.weight": "model-00003-of-00007.safetensors",
81
+ "model.layers.12.self_attn.q_proj.weight": "model-00003-of-00007.safetensors",
82
+ "model.layers.12.self_attn.v_proj.weight": "model-00003-of-00007.safetensors",
83
+ "model.layers.13.input_layernorm.weight": "model-00003-of-00007.safetensors",
84
+ "model.layers.13.mlp.down_proj.base_layer.weight": "model-00003-of-00007.safetensors",
85
+ "model.layers.13.mlp.down_proj.lora_A.default.weight": "model-00003-of-00007.safetensors",
86
+ "model.layers.13.mlp.down_proj.lora_B.default.weight": "model-00003-of-00007.safetensors",
87
+ "model.layers.13.mlp.gate_proj.base_layer.weight": "model-00003-of-00007.safetensors",
88
+ "model.layers.13.mlp.gate_proj.lora_A.default.weight": "model-00003-of-00007.safetensors",
89
+ "model.layers.13.mlp.gate_proj.lora_B.default.weight": "model-00003-of-00007.safetensors",
90
+ "model.layers.13.mlp.up_proj.base_layer.weight": "model-00003-of-00007.safetensors",
91
+ "model.layers.13.mlp.up_proj.lora_A.default.weight": "model-00003-of-00007.safetensors",
92
+ "model.layers.13.mlp.up_proj.lora_B.default.weight": "model-00003-of-00007.safetensors",
93
+ "model.layers.13.post_attention_layernorm.weight": "model-00003-of-00007.safetensors",
94
+ "model.layers.13.self_attn.k_proj.weight": "model-00003-of-00007.safetensors",
95
+ "model.layers.13.self_attn.o_proj.weight": "model-00003-of-00007.safetensors",
96
+ "model.layers.13.self_attn.q_proj.weight": "model-00003-of-00007.safetensors",
97
+ "model.layers.13.self_attn.v_proj.weight": "model-00003-of-00007.safetensors",
98
+ "model.layers.14.input_layernorm.weight": "model-00003-of-00007.safetensors",
99
+ "model.layers.14.mlp.down_proj.base_layer.weight": "model-00003-of-00007.safetensors",
100
+ "model.layers.14.mlp.down_proj.lora_A.default.weight": "model-00003-of-00007.safetensors",
101
+ "model.layers.14.mlp.down_proj.lora_B.default.weight": "model-00003-of-00007.safetensors",
102
+ "model.layers.14.mlp.gate_proj.base_layer.weight": "model-00003-of-00007.safetensors",
103
+ "model.layers.14.mlp.gate_proj.lora_A.default.weight": "model-00003-of-00007.safetensors",
104
+ "model.layers.14.mlp.gate_proj.lora_B.default.weight": "model-00003-of-00007.safetensors",
105
+ "model.layers.14.mlp.up_proj.base_layer.weight": "model-00003-of-00007.safetensors",
106
+ "model.layers.14.mlp.up_proj.lora_A.default.weight": "model-00003-of-00007.safetensors",
107
+ "model.layers.14.mlp.up_proj.lora_B.default.weight": "model-00003-of-00007.safetensors",
108
+ "model.layers.14.post_attention_layernorm.weight": "model-00003-of-00007.safetensors",
109
+ "model.layers.14.self_attn.k_proj.weight": "model-00003-of-00007.safetensors",
110
+ "model.layers.14.self_attn.o_proj.weight": "model-00003-of-00007.safetensors",
111
+ "model.layers.14.self_attn.q_proj.weight": "model-00003-of-00007.safetensors",
112
+ "model.layers.14.self_attn.v_proj.weight": "model-00003-of-00007.safetensors",
113
+ "model.layers.15.input_layernorm.weight": "model-00004-of-00007.safetensors",
114
+ "model.layers.15.mlp.down_proj.base_layer.weight": "model-00004-of-00007.safetensors",
115
+ "model.layers.15.mlp.down_proj.lora_A.default.weight": "model-00004-of-00007.safetensors",
116
+ "model.layers.15.mlp.down_proj.lora_B.default.weight": "model-00004-of-00007.safetensors",
117
+ "model.layers.15.mlp.gate_proj.base_layer.weight": "model-00003-of-00007.safetensors",
118
+ "model.layers.15.mlp.gate_proj.lora_A.default.weight": "model-00003-of-00007.safetensors",
119
+ "model.layers.15.mlp.gate_proj.lora_B.default.weight": "model-00003-of-00007.safetensors",
120
+ "model.layers.15.mlp.up_proj.base_layer.weight": "model-00003-of-00007.safetensors",
121
+ "model.layers.15.mlp.up_proj.lora_A.default.weight": "model-00003-of-00007.safetensors",
122
+ "model.layers.15.mlp.up_proj.lora_B.default.weight": "model-00004-of-00007.safetensors",
123
+ "model.layers.15.post_attention_layernorm.weight": "model-00004-of-00007.safetensors",
124
+ "model.layers.15.self_attn.k_proj.weight": "model-00003-of-00007.safetensors",
125
+ "model.layers.15.self_attn.o_proj.weight": "model-00003-of-00007.safetensors",
126
+ "model.layers.15.self_attn.q_proj.weight": "model-00003-of-00007.safetensors",
127
+ "model.layers.15.self_attn.v_proj.weight": "model-00003-of-00007.safetensors",
128
+ "model.layers.16.input_layernorm.weight": "model-00004-of-00007.safetensors",
129
+ "model.layers.16.mlp.down_proj.base_layer.weight": "model-00004-of-00007.safetensors",
130
+ "model.layers.16.mlp.down_proj.lora_A.default.weight": "model-00004-of-00007.safetensors",
131
+ "model.layers.16.mlp.down_proj.lora_B.default.weight": "model-00004-of-00007.safetensors",
132
+ "model.layers.16.mlp.gate_proj.base_layer.weight": "model-00004-of-00007.safetensors",
133
+ "model.layers.16.mlp.gate_proj.lora_A.default.weight": "model-00004-of-00007.safetensors",
134
+ "model.layers.16.mlp.gate_proj.lora_B.default.weight": "model-00004-of-00007.safetensors",
135
+ "model.layers.16.mlp.up_proj.base_layer.weight": "model-00004-of-00007.safetensors",
136
+ "model.layers.16.mlp.up_proj.lora_A.default.weight": "model-00004-of-00007.safetensors",
137
+ "model.layers.16.mlp.up_proj.lora_B.default.weight": "model-00004-of-00007.safetensors",
138
+ "model.layers.16.post_attention_layernorm.weight": "model-00004-of-00007.safetensors",
139
+ "model.layers.16.self_attn.k_proj.weight": "model-00004-of-00007.safetensors",
140
+ "model.layers.16.self_attn.o_proj.weight": "model-00004-of-00007.safetensors",
141
+ "model.layers.16.self_attn.q_proj.weight": "model-00004-of-00007.safetensors",
142
+ "model.layers.16.self_attn.v_proj.weight": "model-00004-of-00007.safetensors",
143
+ "model.layers.17.input_layernorm.weight": "model-00004-of-00007.safetensors",
144
+ "model.layers.17.mlp.down_proj.base_layer.weight": "model-00004-of-00007.safetensors",
145
+ "model.layers.17.mlp.down_proj.lora_A.default.weight": "model-00004-of-00007.safetensors",
146
+ "model.layers.17.mlp.down_proj.lora_B.default.weight": "model-00004-of-00007.safetensors",
147
+ "model.layers.17.mlp.gate_proj.base_layer.weight": "model-00004-of-00007.safetensors",
148
+ "model.layers.17.mlp.gate_proj.lora_A.default.weight": "model-00004-of-00007.safetensors",
149
+ "model.layers.17.mlp.gate_proj.lora_B.default.weight": "model-00004-of-00007.safetensors",
150
+ "model.layers.17.mlp.up_proj.base_layer.weight": "model-00004-of-00007.safetensors",
151
+ "model.layers.17.mlp.up_proj.lora_A.default.weight": "model-00004-of-00007.safetensors",
152
+ "model.layers.17.mlp.up_proj.lora_B.default.weight": "model-00004-of-00007.safetensors",
153
+ "model.layers.17.post_attention_layernorm.weight": "model-00004-of-00007.safetensors",
154
+ "model.layers.17.self_attn.k_proj.weight": "model-00004-of-00007.safetensors",
155
+ "model.layers.17.self_attn.o_proj.weight": "model-00004-of-00007.safetensors",
156
+ "model.layers.17.self_attn.q_proj.weight": "model-00004-of-00007.safetensors",
157
+ "model.layers.17.self_attn.v_proj.weight": "model-00004-of-00007.safetensors",
158
+ "model.layers.18.input_layernorm.weight": "model-00004-of-00007.safetensors",
159
+ "model.layers.18.mlp.down_proj.base_layer.weight": "model-00004-of-00007.safetensors",
160
+ "model.layers.18.mlp.down_proj.lora_A.default.weight": "model-00004-of-00007.safetensors",
161
+ "model.layers.18.mlp.down_proj.lora_B.default.weight": "model-00004-of-00007.safetensors",
162
+ "model.layers.18.mlp.gate_proj.base_layer.weight": "model-00004-of-00007.safetensors",
163
+ "model.layers.18.mlp.gate_proj.lora_A.default.weight": "model-00004-of-00007.safetensors",
164
+ "model.layers.18.mlp.gate_proj.lora_B.default.weight": "model-00004-of-00007.safetensors",
165
+ "model.layers.18.mlp.up_proj.base_layer.weight": "model-00004-of-00007.safetensors",
166
+ "model.layers.18.mlp.up_proj.lora_A.default.weight": "model-00004-of-00007.safetensors",
167
+ "model.layers.18.mlp.up_proj.lora_B.default.weight": "model-00004-of-00007.safetensors",
168
+ "model.layers.18.post_attention_layernorm.weight": "model-00004-of-00007.safetensors",
169
+ "model.layers.18.self_attn.k_proj.weight": "model-00004-of-00007.safetensors",
170
+ "model.layers.18.self_attn.o_proj.weight": "model-00004-of-00007.safetensors",
171
+ "model.layers.18.self_attn.q_proj.weight": "model-00004-of-00007.safetensors",
172
+ "model.layers.18.self_attn.v_proj.weight": "model-00004-of-00007.safetensors",
173
+ "model.layers.19.input_layernorm.weight": "model-00004-of-00007.safetensors",
174
+ "model.layers.19.mlp.down_proj.base_layer.weight": "model-00004-of-00007.safetensors",
175
+ "model.layers.19.mlp.down_proj.lora_A.default.weight": "model-00004-of-00007.safetensors",
176
+ "model.layers.19.mlp.down_proj.lora_B.default.weight": "model-00004-of-00007.safetensors",
177
+ "model.layers.19.mlp.gate_proj.base_layer.weight": "model-00004-of-00007.safetensors",
178
+ "model.layers.19.mlp.gate_proj.lora_A.default.weight": "model-00004-of-00007.safetensors",
179
+ "model.layers.19.mlp.gate_proj.lora_B.default.weight": "model-00004-of-00007.safetensors",
180
+ "model.layers.19.mlp.up_proj.base_layer.weight": "model-00004-of-00007.safetensors",
181
+ "model.layers.19.mlp.up_proj.lora_A.default.weight": "model-00004-of-00007.safetensors",
182
+ "model.layers.19.mlp.up_proj.lora_B.default.weight": "model-00004-of-00007.safetensors",
183
+ "model.layers.19.post_attention_layernorm.weight": "model-00004-of-00007.safetensors",
184
+ "model.layers.19.self_attn.k_proj.weight": "model-00004-of-00007.safetensors",
185
+ "model.layers.19.self_attn.o_proj.weight": "model-00004-of-00007.safetensors",
186
+ "model.layers.19.self_attn.q_proj.weight": "model-00004-of-00007.safetensors",
187
+ "model.layers.19.self_attn.v_proj.weight": "model-00004-of-00007.safetensors",
188
+ "model.layers.2.input_layernorm.weight": "model-00001-of-00007.safetensors",
189
+ "model.layers.2.mlp.down_proj.base_layer.weight": "model-00001-of-00007.safetensors",
190
+ "model.layers.2.mlp.down_proj.lora_A.default.weight": "model-00001-of-00007.safetensors",
191
+ "model.layers.2.mlp.down_proj.lora_B.default.weight": "model-00001-of-00007.safetensors",
192
+ "model.layers.2.mlp.gate_proj.base_layer.weight": "model-00001-of-00007.safetensors",
193
+ "model.layers.2.mlp.gate_proj.lora_A.default.weight": "model-00001-of-00007.safetensors",
194
+ "model.layers.2.mlp.gate_proj.lora_B.default.weight": "model-00001-of-00007.safetensors",
195
+ "model.layers.2.mlp.up_proj.base_layer.weight": "model-00001-of-00007.safetensors",
196
+ "model.layers.2.mlp.up_proj.lora_A.default.weight": "model-00001-of-00007.safetensors",
197
+ "model.layers.2.mlp.up_proj.lora_B.default.weight": "model-00001-of-00007.safetensors",
198
+ "model.layers.2.post_attention_layernorm.weight": "model-00001-of-00007.safetensors",
199
+ "model.layers.2.self_attn.k_proj.weight": "model-00001-of-00007.safetensors",
200
+ "model.layers.2.self_attn.o_proj.weight": "model-00001-of-00007.safetensors",
201
+ "model.layers.2.self_attn.q_proj.weight": "model-00001-of-00007.safetensors",
202
+ "model.layers.2.self_attn.v_proj.weight": "model-00001-of-00007.safetensors",
203
+ "model.layers.20.input_layernorm.weight": "model-00004-of-00007.safetensors",
204
+ "model.layers.20.mlp.down_proj.base_layer.weight": "model-00004-of-00007.safetensors",
205
+ "model.layers.20.mlp.down_proj.lora_A.default.weight": "model-00004-of-00007.safetensors",
206
+ "model.layers.20.mlp.down_proj.lora_B.default.weight": "model-00004-of-00007.safetensors",
207
+ "model.layers.20.mlp.gate_proj.base_layer.weight": "model-00004-of-00007.safetensors",
208
+ "model.layers.20.mlp.gate_proj.lora_A.default.weight": "model-00004-of-00007.safetensors",
209
+ "model.layers.20.mlp.gate_proj.lora_B.default.weight": "model-00004-of-00007.safetensors",
210
+ "model.layers.20.mlp.up_proj.base_layer.weight": "model-00004-of-00007.safetensors",
211
+ "model.layers.20.mlp.up_proj.lora_A.default.weight": "model-00004-of-00007.safetensors",
212
+ "model.layers.20.mlp.up_proj.lora_B.default.weight": "model-00004-of-00007.safetensors",
213
+ "model.layers.20.post_attention_layernorm.weight": "model-00004-of-00007.safetensors",
214
+ "model.layers.20.self_attn.k_proj.weight": "model-00004-of-00007.safetensors",
215
+ "model.layers.20.self_attn.o_proj.weight": "model-00004-of-00007.safetensors",
216
+ "model.layers.20.self_attn.q_proj.weight": "model-00004-of-00007.safetensors",
217
+ "model.layers.20.self_attn.v_proj.weight": "model-00004-of-00007.safetensors",
218
+ "model.layers.21.input_layernorm.weight": "model-00005-of-00007.safetensors",
219
+ "model.layers.21.mlp.down_proj.base_layer.weight": "model-00005-of-00007.safetensors",
220
+ "model.layers.21.mlp.down_proj.lora_A.default.weight": "model-00005-of-00007.safetensors",
221
+ "model.layers.21.mlp.down_proj.lora_B.default.weight": "model-00005-of-00007.safetensors",
222
+ "model.layers.21.mlp.gate_proj.base_layer.weight": "model-00005-of-00007.safetensors",
223
+ "model.layers.21.mlp.gate_proj.lora_A.default.weight": "model-00005-of-00007.safetensors",
224
+ "model.layers.21.mlp.gate_proj.lora_B.default.weight": "model-00005-of-00007.safetensors",
225
+ "model.layers.21.mlp.up_proj.base_layer.weight": "model-00005-of-00007.safetensors",
226
+ "model.layers.21.mlp.up_proj.lora_A.default.weight": "model-00005-of-00007.safetensors",
227
+ "model.layers.21.mlp.up_proj.lora_B.default.weight": "model-00005-of-00007.safetensors",
228
+ "model.layers.21.post_attention_layernorm.weight": "model-00005-of-00007.safetensors",
229
+ "model.layers.21.self_attn.k_proj.weight": "model-00004-of-00007.safetensors",
230
+ "model.layers.21.self_attn.o_proj.weight": "model-00004-of-00007.safetensors",
231
+ "model.layers.21.self_attn.q_proj.weight": "model-00004-of-00007.safetensors",
232
+ "model.layers.21.self_attn.v_proj.weight": "model-00004-of-00007.safetensors",
233
+ "model.layers.22.input_layernorm.weight": "model-00005-of-00007.safetensors",
234
+ "model.layers.22.mlp.down_proj.base_layer.weight": "model-00005-of-00007.safetensors",
235
+ "model.layers.22.mlp.down_proj.lora_A.default.weight": "model-00005-of-00007.safetensors",
236
+ "model.layers.22.mlp.down_proj.lora_B.default.weight": "model-00005-of-00007.safetensors",
237
+ "model.layers.22.mlp.gate_proj.base_layer.weight": "model-00005-of-00007.safetensors",
238
+ "model.layers.22.mlp.gate_proj.lora_A.default.weight": "model-00005-of-00007.safetensors",
239
+ "model.layers.22.mlp.gate_proj.lora_B.default.weight": "model-00005-of-00007.safetensors",
240
+ "model.layers.22.mlp.up_proj.base_layer.weight": "model-00005-of-00007.safetensors",
241
+ "model.layers.22.mlp.up_proj.lora_A.default.weight": "model-00005-of-00007.safetensors",
242
+ "model.layers.22.mlp.up_proj.lora_B.default.weight": "model-00005-of-00007.safetensors",
243
+ "model.layers.22.post_attention_layernorm.weight": "model-00005-of-00007.safetensors",
244
+ "model.layers.22.self_attn.k_proj.weight": "model-00005-of-00007.safetensors",
245
+ "model.layers.22.self_attn.o_proj.weight": "model-00005-of-00007.safetensors",
246
+ "model.layers.22.self_attn.q_proj.weight": "model-00005-of-00007.safetensors",
247
+ "model.layers.22.self_attn.v_proj.weight": "model-00005-of-00007.safetensors",
248
+ "model.layers.23.input_layernorm.weight": "model-00005-of-00007.safetensors",
249
+ "model.layers.23.mlp.down_proj.base_layer.weight": "model-00005-of-00007.safetensors",
250
+ "model.layers.23.mlp.down_proj.lora_A.default.weight": "model-00005-of-00007.safetensors",
251
+ "model.layers.23.mlp.down_proj.lora_B.default.weight": "model-00005-of-00007.safetensors",
252
+ "model.layers.23.mlp.gate_proj.base_layer.weight": "model-00005-of-00007.safetensors",
253
+ "model.layers.23.mlp.gate_proj.lora_A.default.weight": "model-00005-of-00007.safetensors",
254
+ "model.layers.23.mlp.gate_proj.lora_B.default.weight": "model-00005-of-00007.safetensors",
255
+ "model.layers.23.mlp.up_proj.base_layer.weight": "model-00005-of-00007.safetensors",
256
+ "model.layers.23.mlp.up_proj.lora_A.default.weight": "model-00005-of-00007.safetensors",
257
+ "model.layers.23.mlp.up_proj.lora_B.default.weight": "model-00005-of-00007.safetensors",
258
+ "model.layers.23.post_attention_layernorm.weight": "model-00005-of-00007.safetensors",
259
+ "model.layers.23.self_attn.k_proj.weight": "model-00005-of-00007.safetensors",
260
+ "model.layers.23.self_attn.o_proj.weight": "model-00005-of-00007.safetensors",
261
+ "model.layers.23.self_attn.q_proj.weight": "model-00005-of-00007.safetensors",
262
+ "model.layers.23.self_attn.v_proj.weight": "model-00005-of-00007.safetensors",
263
+ "model.layers.24.input_layernorm.weight": "model-00005-of-00007.safetensors",
264
+ "model.layers.24.mlp.down_proj.base_layer.weight": "model-00005-of-00007.safetensors",
265
+ "model.layers.24.mlp.down_proj.lora_A.default.weight": "model-00005-of-00007.safetensors",
266
+ "model.layers.24.mlp.down_proj.lora_B.default.weight": "model-00005-of-00007.safetensors",
267
+ "model.layers.24.mlp.gate_proj.base_layer.weight": "model-00005-of-00007.safetensors",
268
+ "model.layers.24.mlp.gate_proj.lora_A.default.weight": "model-00005-of-00007.safetensors",
269
+ "model.layers.24.mlp.gate_proj.lora_B.default.weight": "model-00005-of-00007.safetensors",
270
+ "model.layers.24.mlp.up_proj.base_layer.weight": "model-00005-of-00007.safetensors",
271
+ "model.layers.24.mlp.up_proj.lora_A.default.weight": "model-00005-of-00007.safetensors",
272
+ "model.layers.24.mlp.up_proj.lora_B.default.weight": "model-00005-of-00007.safetensors",
273
+ "model.layers.24.post_attention_layernorm.weight": "model-00005-of-00007.safetensors",
274
+ "model.layers.24.self_attn.k_proj.weight": "model-00005-of-00007.safetensors",
275
+ "model.layers.24.self_attn.o_proj.weight": "model-00005-of-00007.safetensors",
276
+ "model.layers.24.self_attn.q_proj.weight": "model-00005-of-00007.safetensors",
277
+ "model.layers.24.self_attn.v_proj.weight": "model-00005-of-00007.safetensors",
278
+ "model.layers.25.input_layernorm.weight": "model-00005-of-00007.safetensors",
279
+ "model.layers.25.mlp.down_proj.base_layer.weight": "model-00005-of-00007.safetensors",
280
+ "model.layers.25.mlp.down_proj.lora_A.default.weight": "model-00005-of-00007.safetensors",
281
+ "model.layers.25.mlp.down_proj.lora_B.default.weight": "model-00005-of-00007.safetensors",
282
+ "model.layers.25.mlp.gate_proj.base_layer.weight": "model-00005-of-00007.safetensors",
283
+ "model.layers.25.mlp.gate_proj.lora_A.default.weight": "model-00005-of-00007.safetensors",
284
+ "model.layers.25.mlp.gate_proj.lora_B.default.weight": "model-00005-of-00007.safetensors",
285
+ "model.layers.25.mlp.up_proj.base_layer.weight": "model-00005-of-00007.safetensors",
286
+ "model.layers.25.mlp.up_proj.lora_A.default.weight": "model-00005-of-00007.safetensors",
287
+ "model.layers.25.mlp.up_proj.lora_B.default.weight": "model-00005-of-00007.safetensors",
288
+ "model.layers.25.post_attention_layernorm.weight": "model-00005-of-00007.safetensors",
289
+ "model.layers.25.self_attn.k_proj.weight": "model-00005-of-00007.safetensors",
290
+ "model.layers.25.self_attn.o_proj.weight": "model-00005-of-00007.safetensors",
291
+ "model.layers.25.self_attn.q_proj.weight": "model-00005-of-00007.safetensors",
292
+ "model.layers.25.self_attn.v_proj.weight": "model-00005-of-00007.safetensors",
293
+ "model.layers.26.input_layernorm.weight": "model-00006-of-00007.safetensors",
294
+ "model.layers.26.mlp.down_proj.base_layer.weight": "model-00006-of-00007.safetensors",
295
+ "model.layers.26.mlp.down_proj.lora_A.default.weight": "model-00006-of-00007.safetensors",
296
+ "model.layers.26.mlp.down_proj.lora_B.default.weight": "model-00006-of-00007.safetensors",
297
+ "model.layers.26.mlp.gate_proj.base_layer.weight": "model-00005-of-00007.safetensors",
298
+ "model.layers.26.mlp.gate_proj.lora_A.default.weight": "model-00005-of-00007.safetensors",
299
+ "model.layers.26.mlp.gate_proj.lora_B.default.weight": "model-00005-of-00007.safetensors",
300
+ "model.layers.26.mlp.up_proj.base_layer.weight": "model-00005-of-00007.safetensors",
301
+ "model.layers.26.mlp.up_proj.lora_A.default.weight": "model-00005-of-00007.safetensors",
302
+ "model.layers.26.mlp.up_proj.lora_B.default.weight": "model-00005-of-00007.safetensors",
303
+ "model.layers.26.post_attention_layernorm.weight": "model-00006-of-00007.safetensors",
304
+ "model.layers.26.self_attn.k_proj.weight": "model-00005-of-00007.safetensors",
305
+ "model.layers.26.self_attn.o_proj.weight": "model-00005-of-00007.safetensors",
306
+ "model.layers.26.self_attn.q_proj.weight": "model-00005-of-00007.safetensors",
307
+ "model.layers.26.self_attn.v_proj.weight": "model-00005-of-00007.safetensors",
308
+ "model.layers.27.input_layernorm.weight": "model-00006-of-00007.safetensors",
309
+ "model.layers.27.mlp.down_proj.base_layer.weight": "model-00006-of-00007.safetensors",
310
+ "model.layers.27.mlp.down_proj.lora_A.default.weight": "model-00006-of-00007.safetensors",
311
+ "model.layers.27.mlp.down_proj.lora_B.default.weight": "model-00006-of-00007.safetensors",
312
+ "model.layers.27.mlp.gate_proj.base_layer.weight": "model-00006-of-00007.safetensors",
313
+ "model.layers.27.mlp.gate_proj.lora_A.default.weight": "model-00006-of-00007.safetensors",
314
+ "model.layers.27.mlp.gate_proj.lora_B.default.weight": "model-00006-of-00007.safetensors",
315
+ "model.layers.27.mlp.up_proj.base_layer.weight": "model-00006-of-00007.safetensors",
316
+ "model.layers.27.mlp.up_proj.lora_A.default.weight": "model-00006-of-00007.safetensors",
317
+ "model.layers.27.mlp.up_proj.lora_B.default.weight": "model-00006-of-00007.safetensors",
318
+ "model.layers.27.post_attention_layernorm.weight": "model-00006-of-00007.safetensors",
319
+ "model.layers.27.self_attn.k_proj.weight": "model-00006-of-00007.safetensors",
320
+ "model.layers.27.self_attn.o_proj.weight": "model-00006-of-00007.safetensors",
321
+ "model.layers.27.self_attn.q_proj.weight": "model-00006-of-00007.safetensors",
322
+ "model.layers.27.self_attn.v_proj.weight": "model-00006-of-00007.safetensors",
323
+ "model.layers.28.input_layernorm.weight": "model-00006-of-00007.safetensors",
324
+ "model.layers.28.mlp.down_proj.base_layer.weight": "model-00006-of-00007.safetensors",
325
+ "model.layers.28.mlp.down_proj.lora_A.default.weight": "model-00006-of-00007.safetensors",
326
+ "model.layers.28.mlp.down_proj.lora_B.default.weight": "model-00006-of-00007.safetensors",
327
+ "model.layers.28.mlp.gate_proj.base_layer.weight": "model-00006-of-00007.safetensors",
328
+ "model.layers.28.mlp.gate_proj.lora_A.default.weight": "model-00006-of-00007.safetensors",
329
+ "model.layers.28.mlp.gate_proj.lora_B.default.weight": "model-00006-of-00007.safetensors",
330
+ "model.layers.28.mlp.up_proj.base_layer.weight": "model-00006-of-00007.safetensors",
331
+ "model.layers.28.mlp.up_proj.lora_A.default.weight": "model-00006-of-00007.safetensors",
332
+ "model.layers.28.mlp.up_proj.lora_B.default.weight": "model-00006-of-00007.safetensors",
333
+ "model.layers.28.post_attention_layernorm.weight": "model-00006-of-00007.safetensors",
334
+ "model.layers.28.self_attn.k_proj.weight": "model-00006-of-00007.safetensors",
335
+ "model.layers.28.self_attn.o_proj.weight": "model-00006-of-00007.safetensors",
336
+ "model.layers.28.self_attn.q_proj.weight": "model-00006-of-00007.safetensors",
337
+ "model.layers.28.self_attn.v_proj.weight": "model-00006-of-00007.safetensors",
338
+ "model.layers.29.input_layernorm.weight": "model-00006-of-00007.safetensors",
339
+ "model.layers.29.mlp.down_proj.base_layer.weight": "model-00006-of-00007.safetensors",
340
+ "model.layers.29.mlp.down_proj.lora_A.default.weight": "model-00006-of-00007.safetensors",
341
+ "model.layers.29.mlp.down_proj.lora_B.default.weight": "model-00006-of-00007.safetensors",
342
+ "model.layers.29.mlp.gate_proj.base_layer.weight": "model-00006-of-00007.safetensors",
343
+ "model.layers.29.mlp.gate_proj.lora_A.default.weight": "model-00006-of-00007.safetensors",
344
+ "model.layers.29.mlp.gate_proj.lora_B.default.weight": "model-00006-of-00007.safetensors",
345
+ "model.layers.29.mlp.up_proj.base_layer.weight": "model-00006-of-00007.safetensors",
346
+ "model.layers.29.mlp.up_proj.lora_A.default.weight": "model-00006-of-00007.safetensors",
347
+ "model.layers.29.mlp.up_proj.lora_B.default.weight": "model-00006-of-00007.safetensors",
348
+ "model.layers.29.post_attention_layernorm.weight": "model-00006-of-00007.safetensors",
349
+ "model.layers.29.self_attn.k_proj.weight": "model-00006-of-00007.safetensors",
350
+ "model.layers.29.self_attn.o_proj.weight": "model-00006-of-00007.safetensors",
351
+ "model.layers.29.self_attn.q_proj.weight": "model-00006-of-00007.safetensors",
352
+ "model.layers.29.self_attn.v_proj.weight": "model-00006-of-00007.safetensors",
353
+ "model.layers.3.input_layernorm.weight": "model-00001-of-00007.safetensors",
354
+ "model.layers.3.mlp.down_proj.base_layer.weight": "model-00001-of-00007.safetensors",
355
+ "model.layers.3.mlp.down_proj.lora_A.default.weight": "model-00001-of-00007.safetensors",
356
+ "model.layers.3.mlp.down_proj.lora_B.default.weight": "model-00001-of-00007.safetensors",
357
+ "model.layers.3.mlp.gate_proj.base_layer.weight": "model-00001-of-00007.safetensors",
358
+ "model.layers.3.mlp.gate_proj.lora_A.default.weight": "model-00001-of-00007.safetensors",
359
+ "model.layers.3.mlp.gate_proj.lora_B.default.weight": "model-00001-of-00007.safetensors",
360
+ "model.layers.3.mlp.up_proj.base_layer.weight": "model-00001-of-00007.safetensors",
361
+ "model.layers.3.mlp.up_proj.lora_A.default.weight": "model-00001-of-00007.safetensors",
362
+ "model.layers.3.mlp.up_proj.lora_B.default.weight": "model-00001-of-00007.safetensors",
363
+ "model.layers.3.post_attention_layernorm.weight": "model-00001-of-00007.safetensors",
364
+ "model.layers.3.self_attn.k_proj.weight": "model-00001-of-00007.safetensors",
365
+ "model.layers.3.self_attn.o_proj.weight": "model-00001-of-00007.safetensors",
366
+ "model.layers.3.self_attn.q_proj.weight": "model-00001-of-00007.safetensors",
367
+ "model.layers.3.self_attn.v_proj.weight": "model-00001-of-00007.safetensors",
368
+ "model.layers.30.input_layernorm.weight": "model-00006-of-00007.safetensors",
+ "model.layers.30.mlp.down_proj.base_layer.weight": "model-00006-of-00007.safetensors",
+ "model.layers.30.mlp.down_proj.lora_A.default.weight": "model-00006-of-00007.safetensors",
+ "model.layers.30.mlp.down_proj.lora_B.default.weight": "model-00006-of-00007.safetensors",
+ "model.layers.30.mlp.gate_proj.base_layer.weight": "model-00006-of-00007.safetensors",
+ "model.layers.30.mlp.gate_proj.lora_A.default.weight": "model-00006-of-00007.safetensors",
+ "model.layers.30.mlp.gate_proj.lora_B.default.weight": "model-00006-of-00007.safetensors",
+ "model.layers.30.mlp.up_proj.base_layer.weight": "model-00006-of-00007.safetensors",
+ "model.layers.30.mlp.up_proj.lora_A.default.weight": "model-00006-of-00007.safetensors",
+ "model.layers.30.mlp.up_proj.lora_B.default.weight": "model-00006-of-00007.safetensors",
+ "model.layers.30.post_attention_layernorm.weight": "model-00006-of-00007.safetensors",
+ "model.layers.30.self_attn.k_proj.weight": "model-00006-of-00007.safetensors",
+ "model.layers.30.self_attn.o_proj.weight": "model-00006-of-00007.safetensors",
+ "model.layers.30.self_attn.q_proj.weight": "model-00006-of-00007.safetensors",
+ "model.layers.30.self_attn.v_proj.weight": "model-00006-of-00007.safetensors",
+ "model.layers.31.input_layernorm.weight": "model-00006-of-00007.safetensors",
+ "model.layers.31.mlp.down_proj.base_layer.weight": "model-00006-of-00007.safetensors",
+ "model.layers.31.mlp.down_proj.lora_A.default.weight": "model-00006-of-00007.safetensors",
+ "model.layers.31.mlp.down_proj.lora_B.default.weight": "model-00006-of-00007.safetensors",
+ "model.layers.31.mlp.gate_proj.base_layer.weight": "model-00006-of-00007.safetensors",
+ "model.layers.31.mlp.gate_proj.lora_A.default.weight": "model-00006-of-00007.safetensors",
+ "model.layers.31.mlp.gate_proj.lora_B.default.weight": "model-00006-of-00007.safetensors",
+ "model.layers.31.mlp.up_proj.base_layer.weight": "model-00006-of-00007.safetensors",
+ "model.layers.31.mlp.up_proj.lora_A.default.weight": "model-00006-of-00007.safetensors",
+ "model.layers.31.mlp.up_proj.lora_B.default.weight": "model-00006-of-00007.safetensors",
+ "model.layers.31.post_attention_layernorm.weight": "model-00006-of-00007.safetensors",
+ "model.layers.31.self_attn.k_proj.weight": "model-00006-of-00007.safetensors",
+ "model.layers.31.self_attn.o_proj.weight": "model-00006-of-00007.safetensors",
+ "model.layers.31.self_attn.q_proj.weight": "model-00006-of-00007.safetensors",
+ "model.layers.31.self_attn.v_proj.weight": "model-00006-of-00007.safetensors",
+ "model.layers.4.input_layernorm.weight": "model-00002-of-00007.safetensors",
+ "model.layers.4.mlp.down_proj.base_layer.weight": "model-00002-of-00007.safetensors",
+ "model.layers.4.mlp.down_proj.lora_A.default.weight": "model-00002-of-00007.safetensors",
+ "model.layers.4.mlp.down_proj.lora_B.default.weight": "model-00002-of-00007.safetensors",
+ "model.layers.4.mlp.gate_proj.base_layer.weight": "model-00001-of-00007.safetensors",
+ "model.layers.4.mlp.gate_proj.lora_A.default.weight": "model-00001-of-00007.safetensors",
+ "model.layers.4.mlp.gate_proj.lora_B.default.weight": "model-00001-of-00007.safetensors",
+ "model.layers.4.mlp.up_proj.base_layer.weight": "model-00002-of-00007.safetensors",
+ "model.layers.4.mlp.up_proj.lora_A.default.weight": "model-00002-of-00007.safetensors",
+ "model.layers.4.mlp.up_proj.lora_B.default.weight": "model-00002-of-00007.safetensors",
+ "model.layers.4.post_attention_layernorm.weight": "model-00002-of-00007.safetensors",
+ "model.layers.4.self_attn.k_proj.weight": "model-00001-of-00007.safetensors",
+ "model.layers.4.self_attn.o_proj.weight": "model-00001-of-00007.safetensors",
+ "model.layers.4.self_attn.q_proj.weight": "model-00001-of-00007.safetensors",
+ "model.layers.4.self_attn.v_proj.weight": "model-00001-of-00007.safetensors",
+ "model.layers.5.input_layernorm.weight": "model-00002-of-00007.safetensors",
+ "model.layers.5.mlp.down_proj.base_layer.weight": "model-00002-of-00007.safetensors",
+ "model.layers.5.mlp.down_proj.lora_A.default.weight": "model-00002-of-00007.safetensors",
+ "model.layers.5.mlp.down_proj.lora_B.default.weight": "model-00002-of-00007.safetensors",
+ "model.layers.5.mlp.gate_proj.base_layer.weight": "model-00002-of-00007.safetensors",
+ "model.layers.5.mlp.gate_proj.lora_A.default.weight": "model-00002-of-00007.safetensors",
+ "model.layers.5.mlp.gate_proj.lora_B.default.weight": "model-00002-of-00007.safetensors",
+ "model.layers.5.mlp.up_proj.base_layer.weight": "model-00002-of-00007.safetensors",
+ "model.layers.5.mlp.up_proj.lora_A.default.weight": "model-00002-of-00007.safetensors",
+ "model.layers.5.mlp.up_proj.lora_B.default.weight": "model-00002-of-00007.safetensors",
+ "model.layers.5.post_attention_layernorm.weight": "model-00002-of-00007.safetensors",
+ "model.layers.5.self_attn.k_proj.weight": "model-00002-of-00007.safetensors",
+ "model.layers.5.self_attn.o_proj.weight": "model-00002-of-00007.safetensors",
+ "model.layers.5.self_attn.q_proj.weight": "model-00002-of-00007.safetensors",
+ "model.layers.5.self_attn.v_proj.weight": "model-00002-of-00007.safetensors",
+ "model.layers.6.input_layernorm.weight": "model-00002-of-00007.safetensors",
+ "model.layers.6.mlp.down_proj.base_layer.weight": "model-00002-of-00007.safetensors",
+ "model.layers.6.mlp.down_proj.lora_A.default.weight": "model-00002-of-00007.safetensors",
+ "model.layers.6.mlp.down_proj.lora_B.default.weight": "model-00002-of-00007.safetensors",
+ "model.layers.6.mlp.gate_proj.base_layer.weight": "model-00002-of-00007.safetensors",
+ "model.layers.6.mlp.gate_proj.lora_A.default.weight": "model-00002-of-00007.safetensors",
+ "model.layers.6.mlp.gate_proj.lora_B.default.weight": "model-00002-of-00007.safetensors",
+ "model.layers.6.mlp.up_proj.base_layer.weight": "model-00002-of-00007.safetensors",
+ "model.layers.6.mlp.up_proj.lora_A.default.weight": "model-00002-of-00007.safetensors",
+ "model.layers.6.mlp.up_proj.lora_B.default.weight": "model-00002-of-00007.safetensors",
+ "model.layers.6.post_attention_layernorm.weight": "model-00002-of-00007.safetensors",
+ "model.layers.6.self_attn.k_proj.weight": "model-00002-of-00007.safetensors",
+ "model.layers.6.self_attn.o_proj.weight": "model-00002-of-00007.safetensors",
+ "model.layers.6.self_attn.q_proj.weight": "model-00002-of-00007.safetensors",
+ "model.layers.6.self_attn.v_proj.weight": "model-00002-of-00007.safetensors",
+ "model.layers.7.input_layernorm.weight": "model-00002-of-00007.safetensors",
+ "model.layers.7.mlp.down_proj.base_layer.weight": "model-00002-of-00007.safetensors",
+ "model.layers.7.mlp.down_proj.lora_A.default.weight": "model-00002-of-00007.safetensors",
+ "model.layers.7.mlp.down_proj.lora_B.default.weight": "model-00002-of-00007.safetensors",
+ "model.layers.7.mlp.gate_proj.base_layer.weight": "model-00002-of-00007.safetensors",
+ "model.layers.7.mlp.gate_proj.lora_A.default.weight": "model-00002-of-00007.safetensors",
+ "model.layers.7.mlp.gate_proj.lora_B.default.weight": "model-00002-of-00007.safetensors",
+ "model.layers.7.mlp.up_proj.base_layer.weight": "model-00002-of-00007.safetensors",
+ "model.layers.7.mlp.up_proj.lora_A.default.weight": "model-00002-of-00007.safetensors",
+ "model.layers.7.mlp.up_proj.lora_B.default.weight": "model-00002-of-00007.safetensors",
+ "model.layers.7.post_attention_layernorm.weight": "model-00002-of-00007.safetensors",
+ "model.layers.7.self_attn.k_proj.weight": "model-00002-of-00007.safetensors",
+ "model.layers.7.self_attn.o_proj.weight": "model-00002-of-00007.safetensors",
+ "model.layers.7.self_attn.q_proj.weight": "model-00002-of-00007.safetensors",
+ "model.layers.7.self_attn.v_proj.weight": "model-00002-of-00007.safetensors",
+ "model.layers.8.input_layernorm.weight": "model-00002-of-00007.safetensors",
+ "model.layers.8.mlp.down_proj.base_layer.weight": "model-00002-of-00007.safetensors",
+ "model.layers.8.mlp.down_proj.lora_A.default.weight": "model-00002-of-00007.safetensors",
+ "model.layers.8.mlp.down_proj.lora_B.default.weight": "model-00002-of-00007.safetensors",
+ "model.layers.8.mlp.gate_proj.base_layer.weight": "model-00002-of-00007.safetensors",
+ "model.layers.8.mlp.gate_proj.lora_A.default.weight": "model-00002-of-00007.safetensors",
+ "model.layers.8.mlp.gate_proj.lora_B.default.weight": "model-00002-of-00007.safetensors",
+ "model.layers.8.mlp.up_proj.base_layer.weight": "model-00002-of-00007.safetensors",
+ "model.layers.8.mlp.up_proj.lora_A.default.weight": "model-00002-of-00007.safetensors",
+ "model.layers.8.mlp.up_proj.lora_B.default.weight": "model-00002-of-00007.safetensors",
+ "model.layers.8.post_attention_layernorm.weight": "model-00002-of-00007.safetensors",
+ "model.layers.8.self_attn.k_proj.weight": "model-00002-of-00007.safetensors",
+ "model.layers.8.self_attn.o_proj.weight": "model-00002-of-00007.safetensors",
+ "model.layers.8.self_attn.q_proj.weight": "model-00002-of-00007.safetensors",
+ "model.layers.8.self_attn.v_proj.weight": "model-00002-of-00007.safetensors",
+ "model.layers.9.input_layernorm.weight": "model-00002-of-00007.safetensors",
+ "model.layers.9.mlp.down_proj.base_layer.weight": "model-00002-of-00007.safetensors",
+ "model.layers.9.mlp.down_proj.lora_A.default.weight": "model-00002-of-00007.safetensors",
+ "model.layers.9.mlp.down_proj.lora_B.default.weight": "model-00002-of-00007.safetensors",
+ "model.layers.9.mlp.gate_proj.base_layer.weight": "model-00002-of-00007.safetensors",
+ "model.layers.9.mlp.gate_proj.lora_A.default.weight": "model-00002-of-00007.safetensors",
+ "model.layers.9.mlp.gate_proj.lora_B.default.weight": "model-00002-of-00007.safetensors",
+ "model.layers.9.mlp.up_proj.base_layer.weight": "model-00002-of-00007.safetensors",
+ "model.layers.9.mlp.up_proj.lora_A.default.weight": "model-00002-of-00007.safetensors",
+ "model.layers.9.mlp.up_proj.lora_B.default.weight": "model-00002-of-00007.safetensors",
+ "model.layers.9.post_attention_layernorm.weight": "model-00002-of-00007.safetensors",
+ "model.layers.9.self_attn.k_proj.weight": "model-00002-of-00007.safetensors",
+ "model.layers.9.self_attn.o_proj.weight": "model-00002-of-00007.safetensors",
+ "model.layers.9.self_attn.q_proj.weight": "model-00002-of-00007.safetensors",
+ "model.layers.9.self_attn.v_proj.weight": "model-00002-of-00007.safetensors",
+ "model.norm.weight": "model-00006-of-00007.safetensors"
+ }
+ }
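The base_layer / lora_A / lora_B suffixes in the weight map above suggest this checkpoint was saved with its PEFT LoRA adapters still wrapped around the linear layers rather than merged into them; the index itself just records which shard holds each tensor. A minimal sketch of resolving one tensor through that index, assuming the files sit in the working directory (the paths and the tensor name picked here are illustrative):

```python
# Sketch: look up which shard holds a tensor, then load only that tensor.
import json
from safetensors import safe_open

with open("model.safetensors.index.json") as fh:
    index = json.load(fh)

name = "model.layers.30.mlp.down_proj.lora_A.default.weight"
shard = index["weight_map"][name]  # e.g. "model-00006-of-00007.safetensors"

with safe_open(shard, framework="pt") as f:
    tensor = f.get_tensor(name)  # reads just this tensor, not the whole shard
print(name, tuple(tensor.shape))
```

For end use, the adapters would typically be merged first (e.g. with PEFT's merge_and_unload) so the state dict matches the plain Llama architecture.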
special_tokens_map.json ADDED
@@ -0,0 +1,4 @@
+ {
+ "bos_token": "<|begin_of_text|>",
+ "eos_token": "<|end_of_text|>"
+ }
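special_tokens_map.json pins only the BOS and EOS strings; their ids and the rest of the special-token inventory come from the tokenizer files below. A quick check of how they resolve, assuming the repository has been downloaded locally to ./model (the path is illustrative):

```python
from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained("./model")
print(tok.bos_token, tok.bos_token_id)  # expected: <|begin_of_text|> 128000
print(tok.eos_token, tok.eos_token_id)  # expected: <|end_of_text|> 128001
```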
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
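Although the rendered diff is elided, tokenizer.json is a standard tokenizers-library serialization holding the BPE vocabulary and merges, and the tokenizer_config.json added below registers ids 128000 and up as special tokens on top of it. A sketch for inspecting the raw file directly, assuming it has been downloaded next to the script:

```python
from tokenizers import Tokenizer

tk = Tokenizer.from_file("tokenizer.json")
print(tk.get_vocab_size())           # 128256 for Llama 3: 128000 BPE + 256 special ids
print(tk.encode("Hello world").ids)  # raw BPE ids, no chat template applied
```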
tokenizer_config.json ADDED
@@ -0,0 +1,2062 @@
+ {
+ "added_tokens_decoder": {
+ "128000": {
+ "content": "<|begin_of_text|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128001": {
+ "content": "<|end_of_text|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128002": {
+ "content": "<|reserved_special_token_0|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128003": {
+ "content": "<|reserved_special_token_1|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128004": {
+ "content": "<|reserved_special_token_2|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128005": {
+ "content": "<|reserved_special_token_3|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128006": {
+ "content": "<|start_header_id|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128007": {
+ "content": "<|end_header_id|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128008": {
+ "content": "<|reserved_special_token_4|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128009": {
+ "content": "<|eot_id|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128010": {
+ "content": "<|reserved_special_token_5|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128011": {
+ "content": "<|reserved_special_token_6|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128012": {
+ "content": "<|reserved_special_token_7|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128013": {
+ "content": "<|reserved_special_token_8|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128014": {
+ "content": "<|reserved_special_token_9|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128015": {
+ "content": "<|reserved_special_token_10|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128016": {
+ "content": "<|reserved_special_token_11|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128017": {
+ "content": "<|reserved_special_token_12|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128018": {
+ "content": "<|reserved_special_token_13|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128019": {
+ "content": "<|reserved_special_token_14|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128020": {
+ "content": "<|reserved_special_token_15|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128021": {
+ "content": "<|reserved_special_token_16|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128022": {
+ "content": "<|reserved_special_token_17|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128023": {
+ "content": "<|reserved_special_token_18|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128024": {
+ "content": "<|reserved_special_token_19|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128025": {
+ "content": "<|reserved_special_token_20|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128026": {
+ "content": "<|reserved_special_token_21|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128027": {
+ "content": "<|reserved_special_token_22|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128028": {
+ "content": "<|reserved_special_token_23|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128029": {
+ "content": "<|reserved_special_token_24|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128030": {
+ "content": "<|reserved_special_token_25|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128031": {
+ "content": "<|reserved_special_token_26|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128032": {
+ "content": "<|reserved_special_token_27|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128033": {
+ "content": "<|reserved_special_token_28|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128034": {
+ "content": "<|reserved_special_token_29|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128035": {
+ "content": "<|reserved_special_token_30|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128036": {
+ "content": "<|reserved_special_token_31|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128037": {
+ "content": "<|reserved_special_token_32|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128038": {
+ "content": "<|reserved_special_token_33|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128039": {
+ "content": "<|reserved_special_token_34|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128040": {
+ "content": "<|reserved_special_token_35|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128041": {
+ "content": "<|reserved_special_token_36|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128042": {
+ "content": "<|reserved_special_token_37|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128043": {
+ "content": "<|reserved_special_token_38|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128044": {
+ "content": "<|reserved_special_token_39|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128045": {
+ "content": "<|reserved_special_token_40|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128046": {
+ "content": "<|reserved_special_token_41|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128047": {
+ "content": "<|reserved_special_token_42|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128048": {
+ "content": "<|reserved_special_token_43|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128049": {
+ "content": "<|reserved_special_token_44|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128050": {
+ "content": "<|reserved_special_token_45|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128051": {
+ "content": "<|reserved_special_token_46|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128052": {
+ "content": "<|reserved_special_token_47|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128053": {
+ "content": "<|reserved_special_token_48|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128054": {
+ "content": "<|reserved_special_token_49|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128055": {
+ "content": "<|reserved_special_token_50|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128056": {
+ "content": "<|reserved_special_token_51|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128057": {
+ "content": "<|reserved_special_token_52|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128058": {
+ "content": "<|reserved_special_token_53|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128059": {
+ "content": "<|reserved_special_token_54|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128060": {
+ "content": "<|reserved_special_token_55|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128061": {
+ "content": "<|reserved_special_token_56|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128062": {
+ "content": "<|reserved_special_token_57|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128063": {
+ "content": "<|reserved_special_token_58|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128064": {
+ "content": "<|reserved_special_token_59|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128065": {
+ "content": "<|reserved_special_token_60|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128066": {
+ "content": "<|reserved_special_token_61|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128067": {
+ "content": "<|reserved_special_token_62|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128068": {
+ "content": "<|reserved_special_token_63|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128069": {
+ "content": "<|reserved_special_token_64|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128070": {
+ "content": "<|reserved_special_token_65|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128071": {
+ "content": "<|reserved_special_token_66|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128072": {
+ "content": "<|reserved_special_token_67|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128073": {
+ "content": "<|reserved_special_token_68|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128074": {
+ "content": "<|reserved_special_token_69|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128075": {
+ "content": "<|reserved_special_token_70|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128076": {
+ "content": "<|reserved_special_token_71|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128077": {
+ "content": "<|reserved_special_token_72|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128078": {
+ "content": "<|reserved_special_token_73|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128079": {
+ "content": "<|reserved_special_token_74|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128080": {
+ "content": "<|reserved_special_token_75|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128081": {
+ "content": "<|reserved_special_token_76|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128082": {
+ "content": "<|reserved_special_token_77|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128083": {
+ "content": "<|reserved_special_token_78|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128084": {
+ "content": "<|reserved_special_token_79|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128085": {
+ "content": "<|reserved_special_token_80|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128086": {
+ "content": "<|reserved_special_token_81|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128087": {
+ "content": "<|reserved_special_token_82|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128088": {
+ "content": "<|reserved_special_token_83|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128089": {
+ "content": "<|reserved_special_token_84|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128090": {
+ "content": "<|reserved_special_token_85|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128091": {
+ "content": "<|reserved_special_token_86|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128092": {
+ "content": "<|reserved_special_token_87|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128093": {
+ "content": "<|reserved_special_token_88|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128094": {
+ "content": "<|reserved_special_token_89|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128095": {
+ "content": "<|reserved_special_token_90|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128096": {
+ "content": "<|reserved_special_token_91|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128097": {
+ "content": "<|reserved_special_token_92|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128098": {
+ "content": "<|reserved_special_token_93|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128099": {
+ "content": "<|reserved_special_token_94|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128100": {
+ "content": "<|reserved_special_token_95|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128101": {
+ "content": "<|reserved_special_token_96|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128102": {
+ "content": "<|reserved_special_token_97|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128103": {
+ "content": "<|reserved_special_token_98|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128104": {
+ "content": "<|reserved_special_token_99|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128105": {
+ "content": "<|reserved_special_token_100|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128106": {
+ "content": "<|reserved_special_token_101|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128107": {
+ "content": "<|reserved_special_token_102|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128108": {
+ "content": "<|reserved_special_token_103|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128109": {
+ "content": "<|reserved_special_token_104|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128110": {
+ "content": "<|reserved_special_token_105|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128111": {
+ "content": "<|reserved_special_token_106|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128112": {
+ "content": "<|reserved_special_token_107|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128113": {
+ "content": "<|reserved_special_token_108|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128114": {
+ "content": "<|reserved_special_token_109|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128115": {
+ "content": "<|reserved_special_token_110|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128116": {
+ "content": "<|reserved_special_token_111|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128117": {
+ "content": "<|reserved_special_token_112|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128118": {
+ "content": "<|reserved_special_token_113|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128119": {
+ "content": "<|reserved_special_token_114|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128120": {
+ "content": "<|reserved_special_token_115|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128121": {
+ "content": "<|reserved_special_token_116|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128122": {
+ "content": "<|reserved_special_token_117|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128123": {
+ "content": "<|reserved_special_token_118|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128124": {
+ "content": "<|reserved_special_token_119|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128125": {
+ "content": "<|reserved_special_token_120|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128126": {
+ "content": "<|reserved_special_token_121|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128127": {
+ "content": "<|reserved_special_token_122|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128128": {
+ "content": "<|reserved_special_token_123|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128129": {
+ "content": "<|reserved_special_token_124|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128130": {
+ "content": "<|reserved_special_token_125|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128131": {
+ "content": "<|reserved_special_token_126|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128132": {
+ "content": "<|reserved_special_token_127|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128133": {
+ "content": "<|reserved_special_token_128|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128134": {
+ "content": "<|reserved_special_token_129|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128135": {
+ "content": "<|reserved_special_token_130|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128136": {
+ "content": "<|reserved_special_token_131|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128137": {
+ "content": "<|reserved_special_token_132|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128138": {
+ "content": "<|reserved_special_token_133|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128139": {
+ "content": "<|reserved_special_token_134|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128140": {
+ "content": "<|reserved_special_token_135|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128141": {
+ "content": "<|reserved_special_token_136|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128142": {
+ "content": "<|reserved_special_token_137|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128143": {
+ "content": "<|reserved_special_token_138|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128144": {
+ "content": "<|reserved_special_token_139|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128145": {
+ "content": "<|reserved_special_token_140|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128146": {
+ "content": "<|reserved_special_token_141|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128147": {
+ "content": "<|reserved_special_token_142|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128148": {
+ "content": "<|reserved_special_token_143|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128149": {
+ "content": "<|reserved_special_token_144|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128150": {
+ "content": "<|reserved_special_token_145|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128151": {
+ "content": "<|reserved_special_token_146|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128152": {
+ "content": "<|reserved_special_token_147|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128153": {
+ "content": "<|reserved_special_token_148|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128154": {
+ "content": "<|reserved_special_token_149|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128155": {
+ "content": "<|reserved_special_token_150|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128156": {
+ "content": "<|reserved_special_token_151|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128157": {
+ "content": "<|reserved_special_token_152|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128158": {
+ "content": "<|reserved_special_token_153|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128159": {
+ "content": "<|reserved_special_token_154|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128160": {
+ "content": "<|reserved_special_token_155|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128161": {
+ "content": "<|reserved_special_token_156|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128162": {
+ "content": "<|reserved_special_token_157|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128163": {
+ "content": "<|reserved_special_token_158|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128164": {
+ "content": "<|reserved_special_token_159|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128165": {
+ "content": "<|reserved_special_token_160|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128166": {
+ "content": "<|reserved_special_token_161|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128167": {
+ "content": "<|reserved_special_token_162|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128168": {
+ "content": "<|reserved_special_token_163|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128169": {
+ "content": "<|reserved_special_token_164|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128170": {
+ "content": "<|reserved_special_token_165|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128171": {
+ "content": "<|reserved_special_token_166|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128172": {
+ "content": "<|reserved_special_token_167|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128173": {
+ "content": "<|reserved_special_token_168|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128174": {
+ "content": "<|reserved_special_token_169|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128175": {
+ "content": "<|reserved_special_token_170|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128176": {
+ "content": "<|reserved_special_token_171|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128177": {
+ "content": "<|reserved_special_token_172|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128178": {
+ "content": "<|reserved_special_token_173|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128179": {
+ "content": "<|reserved_special_token_174|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128180": {
+ "content": "<|reserved_special_token_175|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128181": {
+ "content": "<|reserved_special_token_176|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128182": {
+ "content": "<|reserved_special_token_177|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128183": {
+ "content": "<|reserved_special_token_178|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128184": {
+ "content": "<|reserved_special_token_179|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128185": {
+ "content": "<|reserved_special_token_180|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128186": {
+ "content": "<|reserved_special_token_181|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128187": {
+ "content": "<|reserved_special_token_182|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128188": {
+ "content": "<|reserved_special_token_183|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128189": {
+ "content": "<|reserved_special_token_184|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128190": {
+ "content": "<|reserved_special_token_185|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128191": {
+ "content": "<|reserved_special_token_186|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128192": {
+ "content": "<|reserved_special_token_187|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128193": {
+ "content": "<|reserved_special_token_188|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128194": {
+ "content": "<|reserved_special_token_189|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128195": {
+ "content": "<|reserved_special_token_190|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128196": {
+ "content": "<|reserved_special_token_191|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128197": {
+ "content": "<|reserved_special_token_192|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128198": {
+ "content": "<|reserved_special_token_193|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128199": {
+ "content": "<|reserved_special_token_194|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128200": {
+ "content": "<|reserved_special_token_195|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128201": {
+ "content": "<|reserved_special_token_196|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128202": {
+ "content": "<|reserved_special_token_197|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128203": {
+ "content": "<|reserved_special_token_198|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128204": {
+ "content": "<|reserved_special_token_199|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128205": {
+ "content": "<|reserved_special_token_200|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128206": {
+ "content": "<|reserved_special_token_201|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128207": {
+ "content": "<|reserved_special_token_202|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128208": {
+ "content": "<|reserved_special_token_203|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128209": {
+ "content": "<|reserved_special_token_204|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128210": {
+ "content": "<|reserved_special_token_205|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128211": {
+ "content": "<|reserved_special_token_206|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128212": {
+ "content": "<|reserved_special_token_207|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128213": {
+ "content": "<|reserved_special_token_208|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128214": {
+ "content": "<|reserved_special_token_209|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128215": {
+ "content": "<|reserved_special_token_210|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128216": {
+ "content": "<|reserved_special_token_211|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128217": {
+ "content": "<|reserved_special_token_212|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128218": {
+ "content": "<|reserved_special_token_213|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128219": {
+ "content": "<|reserved_special_token_214|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128220": {
+ "content": "<|reserved_special_token_215|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128221": {
+ "content": "<|reserved_special_token_216|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128222": {
+ "content": "<|reserved_special_token_217|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128223": {
+ "content": "<|reserved_special_token_218|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128224": {
+ "content": "<|reserved_special_token_219|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128225": {
+ "content": "<|reserved_special_token_220|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128226": {
+ "content": "<|reserved_special_token_221|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128227": {
+ "content": "<|reserved_special_token_222|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128228": {
+ "content": "<|reserved_special_token_223|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128229": {
+ "content": "<|reserved_special_token_224|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128230": {
+ "content": "<|reserved_special_token_225|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128231": {
+ "content": "<|reserved_special_token_226|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128232": {
+ "content": "<|reserved_special_token_227|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128233": {
+ "content": "<|reserved_special_token_228|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128234": {
+ "content": "<|reserved_special_token_229|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128235": {
+ "content": "<|reserved_special_token_230|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128236": {
+ "content": "<|reserved_special_token_231|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128237": {
+ "content": "<|reserved_special_token_232|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128238": {
+ "content": "<|reserved_special_token_233|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
1914
+ },
1915
+ "128239": {
1916
+ "content": "<|reserved_special_token_234|>",
1917
+ "lstrip": false,
1918
+ "normalized": false,
1919
+ "rstrip": false,
1920
+ "single_word": false,
1921
+ "special": true
1922
+ },
1923
+ "128240": {
1924
+ "content": "<|reserved_special_token_235|>",
1925
+ "lstrip": false,
1926
+ "normalized": false,
1927
+ "rstrip": false,
1928
+ "single_word": false,
1929
+ "special": true
1930
+ },
1931
+ "128241": {
1932
+ "content": "<|reserved_special_token_236|>",
1933
+ "lstrip": false,
1934
+ "normalized": false,
1935
+ "rstrip": false,
1936
+ "single_word": false,
1937
+ "special": true
1938
+ },
1939
+ "128242": {
1940
+ "content": "<|reserved_special_token_237|>",
1941
+ "lstrip": false,
1942
+ "normalized": false,
1943
+ "rstrip": false,
1944
+ "single_word": false,
1945
+ "special": true
1946
+ },
1947
+ "128243": {
1948
+ "content": "<|reserved_special_token_238|>",
1949
+ "lstrip": false,
1950
+ "normalized": false,
1951
+ "rstrip": false,
1952
+ "single_word": false,
1953
+ "special": true
1954
+ },
1955
+ "128244": {
1956
+ "content": "<|reserved_special_token_239|>",
1957
+ "lstrip": false,
1958
+ "normalized": false,
1959
+ "rstrip": false,
1960
+ "single_word": false,
1961
+ "special": true
1962
+ },
1963
+ "128245": {
1964
+ "content": "<|reserved_special_token_240|>",
1965
+ "lstrip": false,
1966
+ "normalized": false,
1967
+ "rstrip": false,
1968
+ "single_word": false,
1969
+ "special": true
1970
+ },
1971
+ "128246": {
1972
+ "content": "<|reserved_special_token_241|>",
1973
+ "lstrip": false,
1974
+ "normalized": false,
1975
+ "rstrip": false,
1976
+ "single_word": false,
1977
+ "special": true
1978
+ },
1979
+ "128247": {
1980
+ "content": "<|reserved_special_token_242|>",
1981
+ "lstrip": false,
1982
+ "normalized": false,
1983
+ "rstrip": false,
1984
+ "single_word": false,
1985
+ "special": true
1986
+ },
1987
+ "128248": {
1988
+ "content": "<|reserved_special_token_243|>",
1989
+ "lstrip": false,
1990
+ "normalized": false,
1991
+ "rstrip": false,
1992
+ "single_word": false,
1993
+ "special": true
1994
+ },
1995
+ "128249": {
1996
+ "content": "<|reserved_special_token_244|>",
1997
+ "lstrip": false,
1998
+ "normalized": false,
1999
+ "rstrip": false,
2000
+ "single_word": false,
2001
+ "special": true
2002
+ },
2003
+ "128250": {
2004
+ "content": "<|reserved_special_token_245|>",
2005
+ "lstrip": false,
2006
+ "normalized": false,
2007
+ "rstrip": false,
2008
+ "single_word": false,
2009
+ "special": true
2010
+ },
2011
+ "128251": {
2012
+ "content": "<|reserved_special_token_246|>",
2013
+ "lstrip": false,
2014
+ "normalized": false,
2015
+ "rstrip": false,
2016
+ "single_word": false,
2017
+ "special": true
2018
+ },
2019
+ "128252": {
2020
+ "content": "<|reserved_special_token_247|>",
2021
+ "lstrip": false,
2022
+ "normalized": false,
2023
+ "rstrip": false,
2024
+ "single_word": false,
2025
+ "special": true
2026
+ },
2027
+ "128253": {
2028
+ "content": "<|reserved_special_token_248|>",
2029
+ "lstrip": false,
2030
+ "normalized": false,
2031
+ "rstrip": false,
2032
+ "single_word": false,
2033
+ "special": true
2034
+ },
2035
+ "128254": {
2036
+ "content": "<|reserved_special_token_249|>",
2037
+ "lstrip": false,
2038
+ "normalized": false,
2039
+ "rstrip": false,
2040
+ "single_word": false,
2041
+ "special": true
2042
+ },
2043
+ "128255": {
2044
+ "content": "<|reserved_special_token_250|>",
2045
+ "lstrip": false,
2046
+ "normalized": false,
2047
+ "rstrip": false,
2048
+ "single_word": false,
2049
+ "special": true
2050
+ }
2051
+ },
2052
+ "bos_token": "<|begin_of_text|>",
2053
+ "chat_template": "{% set loop_messages = messages %}{% for message in loop_messages %}{% set content = '<|start_header_id|>' + message['role'] + '<|end_header_id|>\n\n'+ message['content'] | trim + '<|eot_id|>' %}{% if loop.index0 == 0 %}{% set content = bos_token + content %}{% endif %}{{ content }}{% endfor %}{% if add_generation_prompt %}{{ '<|start_header_id|>assistant<|end_header_id|>\n\n' }}{% endif %}",
2054
+ "clean_up_tokenization_spaces": true,
2055
+ "eos_token": "<|end_of_text|>",
2056
+ "model_input_names": [
2057
+ "input_ids",
2058
+ "attention_mask"
2059
+ ],
2060
+ "model_max_length": 1000000000000000019884624838656,
2061
+ "tokenizer_class": "PreTrainedTokenizerFast"
2062
+ }
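For context on the config above: the added_tokens_decoder section maps reserved placeholder IDs (128205-128255) to special tokens, and the chat_template wraps each message in <|start_header_id|>role<|end_header_id|> ... <|eot_id|> markers, prefixing the first message with the BOS token. Below is a minimal sketch of how this config is consumed, assuming the transformers library is installed; "path/to/this/repo" is a placeholder for the local folder or Hub repo id containing this tokenizer_config.json, not a name taken from this commit.

# Minimal sketch, assuming `transformers`; "path/to/this/repo" is a
# hypothetical placeholder for wherever this config lives.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("path/to/this/repo")

# The reserved placeholders should round-trip to the IDs defined above,
# e.g. <|reserved_special_token_200|> -> 128205.
assert tokenizer.convert_tokens_to_ids("<|reserved_special_token_200|>") == 128205

messages = [{"role": "user", "content": "Hello!"}]

# add_generation_prompt=True appends the trailing
# '<|start_header_id|>assistant<|end_header_id|>\n\n' per the template's
# final {% if %} block, cueing the model to answer as the assistant.
prompt = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)
print(prompt)
# Expected shape per the template:
# <|begin_of_text|><|start_header_id|>user<|end_header_id|>
#
# Hello!<|eot_id|><|start_header_id|>assistant<|end_header_id|>
#
# (generation would continue from here)

Note that the template prepends bos_token itself (the {% if loop.index0 == 0 %} branch), so callers should not add a second BOS when building prompts from this template.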