---
dataset_info:
  features:
  - name: label_text
    dtype: string
  - name: text
    dtype: string
  splits:
  - name: train
    num_bytes: 23202578
    num_examples: 67349
  - name: validation
    num_bytes: 334716
    num_examples: 872
  download_size: 4418625
  dataset_size: 23537294
task_categories:
- text-classification
language:
- en
---
# Dataset Card for "llama2-sst2-finetuning"
## Dataset Description
The llama2-sst2-finetuning dataset is designed for supervised fine-tuning of LLaMA 2 on the GLUE SST-2 sentiment-classification task.
We provide two subsets: training and validation.
To make fine-tuning effective, we convert each example into the LLaMA 2 supervised fine-tuning prompt template, so the data follows this format:
```
<s>[INST] <<SYS>>
{System prompt}
<</SYS>>
{User prompt} [/INST] {Label} </s>
```
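As a sketch, the template above can be applied to an SST-2 example with a small helper function. The system prompt text below is an assumption for illustration; it is not necessarily the one used to build the released files.

```python
# Sketch: convert a GLUE SST-2 example into the Llama 2 chat prompt
# template used by this dataset.

# Hypothetical system prompt -- the actual one used for the dataset
# may differ.
SYSTEM_PROMPT = (
    "Classify the sentiment of the following sentence as positive or negative."
)

def to_llama2_prompt(sentence: str, label_text: str) -> str:
    """Wrap an SST-2 sentence and its label in the Llama 2 SFT template."""
    return (
        f"<s>[INST] <<SYS>>\n{SYSTEM_PROMPT}\n<</SYS>>\n\n"
        f"{sentence} [/INST] {label_text} </s>"
    )

example = to_llama2_prompt("a gripping , funny film", "positive")
print(example)
```

The resulting string is what the dataset stores in its `text` column, so a trainer can consume it directly without further templating.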
This dataset has been validated by supervised fine-tuning of the meta-llama/Llama-2-7b-hf model.
Note: for simplicity, we retain only one new column ('text').
## Other Useful Links
- [Get Llama 2 Prompt Format Right](https://www.reddit.com/r/LocalLLaMA/comments/155po2p/get_llama_2_prompt_format_right/)
- [Fine-Tune Your Own Llama 2 Model in a Colab Notebook](https://towardsdatascience.com/fine-tune-your-own-llama-2-model-in-a-colab-notebook-df9823a04a32)
- [Instruction fine-tuning Llama 2 with PEFT’s QLoRa method](https://medium.com/@ud.chandra/instruction-fine-tuning-llama-2-with-pefts-qlora-method-d6a801ebb19)
- [GLUE SST2 Dataset](https://www.tensorflow.org/datasets/catalog/glue#gluesst2)
<!--[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)-->