AnkitAI committed on
Commit 9a1fb61 • 1 Parent(s): 2c6ce03

Update README.md

Files changed (1)
  1. README.md +46 -52
README.md CHANGED
@@ -14,39 +14,36 @@ tags:
  - emotions-classifier
  ---
 
- # 🌟 Fast Emotion-X: Fine-tuned DeBERTa V3 Small Based Emotion Detection 🌟
+ # Fast Emotion-X: Fine-tuned DeBERTa V3 Small Based Emotion Detection
 
- This is a fine-tuned version of [microsoft/deberta-v3-small](https://huggingface.co/microsoft/deberta-v3-small) for emotion detection on the [dair-ai/emotion](https://huggingface.co/dair-ai/emotion) dataset.
+ This model is a fine-tuned version of [microsoft/deberta-v3-small](https://huggingface.co/microsoft/deberta-v3-small) for emotion detection using the [dair-ai/emotion](https://huggingface.co/dair-ai/emotion) dataset.
 
- ## 🚀 Overview
+ ## Overview
 
- Fast Emotion-X is a state-of-the-art emotion detection model fine-tuned from Microsoft's DeBERTa V3 Small model. Designed to accurately classify text into one of six emotional categories, Fast Emotion-X leverages the robust capabilities of DeBERTa and fine-tunes it on a comprehensive emotion dataset, ensuring high accuracy and reliability.
+ Fast Emotion-X is a state-of-the-art emotion detection model fine-tuned from Microsoft's DeBERTa V3 Small model. It is designed to accurately classify text into one of six emotional categories. Leveraging the robust capabilities of DeBERTa, this model is fine-tuned on a comprehensive emotion dataset, ensuring high accuracy and reliability.
 
- ## 📜 Model Details
+ ## Model Details
 
- - **🆕 Model Name:** `AnkitAI/deberta-v3-small-base-emotions-classifier`
- - **🔗 Base Model:** `microsoft/deberta-v3-small`
- - **📊 Dataset:** [dair-ai/emotion](https://huggingface.co/dair-ai/emotion)
- - **⚙️ Fine-tuning:** This model was fine-tuned for emotion detection with a classification head for six emotional categories (anger, disgust, fear, joy, sadness, surprise).
+ - **Model Name:** `AnkitAI/deberta-v3-small-base-emotions-classifier`
+ - **Base Model:** `microsoft/deberta-v3-small`
+ - **Dataset:** [dair-ai/emotion](https://huggingface.co/dair-ai/emotion)
+ - **Fine-tuning:** The model is fine-tuned for emotion detection with a classification head for six emotional categories: anger, disgust, fear, joy, sadness, and surprise.
 
- ## 📝 Emotion Labels
- 😠 Anger
- 🤢 Disgust
- 😨 Fear
- 😊 Joy
- 😢 Sadness
- 😲 Surprise
-
-
- ## 🚀 Usage
-
- You can use this model directly with python package or the Hugging Face `transformers` library:
+ ## Emotion Labels
+ - Anger
+ - Disgust
+ - Fear
+ - Joy
+ - Sadness
+ - Surprise
+
+ ## Usage
+
+ You can use this model directly with the provided Python package or the Hugging Face `transformers` library.
 
  ### Installation
 
- You can install the package using pip:
+ Install the package using pip:
 
  ```bash
  pip install emotionclassifier
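
For readers of the card, here is a minimal usage sketch of the `emotionclassifier` package installed above. The `EmotionClassifier` class name and the exact shape of the value returned by `predict()` are assumptions; only the `classifier.predict(...)` call and the `result['probabilities']` field are confirmed by the unchanged context shown in the next hunk.

```python
# Illustrative sketch only: EmotionClassifier and the result keys are assumed,
# not confirmed by this diff; check the package documentation.
from emotionclassifier import EmotionClassifier

classifier = EmotionClassifier()

result = classifier.predict("I am very happy today!")
print(result["probabilities"])  # per-label scores, as passed to plot_emotion_distribution below
```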
@@ -90,7 +87,7 @@ result = classifier.predict("I am very happy today!")
  plot_emotion_distribution(result['probabilities'], classifier.labels.values())
  ```
 
- ### CLI Usage
+ ### Command-Line Interface (CLI) Usage
 
  You can also use the package from the command line:
 
@@ -135,7 +132,7 @@ Fine-tune a pre-trained model on your own dataset:
  ```python
  from emotionclassifier.fine_tune import fine_tune_model
 
- # Define your train and validation datasets
+ # Define your training and validation datasets
  train_dataset = ...
  val_dataset = ...
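
The hunk above leaves `train_dataset` and `val_dataset` as placeholders. Here is a hedged sketch of one way to build them, loading dair-ai/emotion with the Hugging Face `datasets` library and tokenizing with the classifier's tokenizer; whether `fine_tune_model` accepts tokenized `Dataset` objects in exactly this form is an assumption, so check the package documentation.

```python
# Sketch: one possible way to prepare the placeholder datasets above.
# Assumes `classifier` from the Usage section and that fine_tune_model accepts
# tokenized Hugging Face Dataset splits.
from datasets import load_dataset

raw = load_dataset("dair-ai/emotion")  # columns: "text", "label"; splits: train/validation/test

def tokenize(batch):
    return classifier.tokenizer(batch["text"], truncation=True, padding="max_length", max_length=128)

train_dataset = raw["train"].map(tokenize, batched=True)
val_dataset = raw["validation"].map(tokenize, batched=True)
```

The `fine_tune_model(...)` call in the next hunk then consumes these two objects.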
 
@@ -143,7 +140,7 @@ val_dataset = ...
  fine_tune_model(classifier.model, classifier.tokenizer, train_dataset, val_dataset, output_dir='fine_tuned_model')
  ```
 
- ### Using transformers
+ ### Using transformers Library
 
  ```python
  from transformers import AutoModelForSequenceClassification, AutoTokenizer
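
The `### Using transformers Library` snippet opened above continues in the next hunk, where a `predict_emotion(text)` helper is called. As a quick alternative, the high-level `pipeline` API can load the same checkpoint; this is a sketch rather than the card's own example, and the label strings it prints come from the checkpoint's `id2label` config.

```python
# Sketch: high-level inference via the transformers pipeline API.
from transformers import pipeline

emotion_pipe = pipeline(
    "text-classification",
    model="AnkitAI/deberta-v3-small-base-emotions-classifier",
)

print(emotion_pipe("I am very happy today!"))
# e.g. [{'label': ..., 'score': ...}] -- label names depend on the model's id2label mapping
```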
@@ -165,42 +162,39 @@ emotion = predict_emotion(text)
  print("Detected Emotion:", emotion)
  ```
 
- ## 🏋️ Training
+ ## Training
 
  The model was trained using the following parameters:
 
- - **🔧 Learning Rate:** 2e-5
- - **📦 Batch Size:** 4
- - **⚖️ Weight Decay:** 0.01
- - **📅 Evaluation Strategy:** Epoch
+ - **Learning Rate:** 2e-5
+ - **Batch Size:** 4
+ - **Weight Decay:** 0.01
+ - **Evaluation Strategy:** Epoch
 
- ### 🏋️ Training Details
+ ### Training Details
 
- - **📉 Eval Loss:** 0.0858
- - **⏱️ Eval Runtime:** 110070.6349 seconds
- - **📈 Eval Samples/Second:** 78.495
- - **🌀 Eval Steps/Second:** 2.453
- - **📉 Train Loss:** 0.1049
- - **⏳ Eval Accuracy:** 94.6%
- - **🌀 Eval Precision:** 94.8%
- - **⏱️ Eval Recall:** 94.5%
- - **📈 Eval F1 Score:** 94.7%
+ - **Evaluation Loss:** 0.0858
+ - **Evaluation Runtime:** 110070.6349 seconds
+ - **Evaluation Samples/Second:** 78.495
+ - **Evaluation Steps/Second:** 2.453
+ - **Training Loss:** 0.1049
+ - **Evaluation Accuracy:** 94.6%
+ - **Evaluation Precision:** 94.8%
+ - **Evaluation Recall:** 94.5%
+ - **Evaluation F1 Score:** 94.7%
 
-
- ## 📜 Model Card Data
-
- | Parameter | Value |
- |-------------------------------|---------------------------|
+ ## Model Card Data
+
+ | Parameter | Value |
+ |-------------------------------|----------------------------|
  | Model Name | microsoft/deberta-v3-small |
- | Training Dataset | dair-ai/emotion |
+ | Training Dataset | dair-ai/emotion |
  | Number of Training Epochs | 20 |
- | Learning Rate | 2e-5 |
- | Per Device Train Batch Size | 4 |
- | Evaluation Strategy | Epoch |
- | Best Model Accuracy | 94.6% |
-
+ | Learning Rate | 2e-5 |
+ | Per Device Train Batch Size | 4 |
+ | Evaluation Strategy | Epoch |
+ | Best Model Accuracy | 94.6% |
 
- ## 📜 License
+ ## License
 
  This model is licensed under the [MIT License](LICENSE).
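
The `### Using transformers Library` code above shows only the imports and the final `predict_emotion(text)` call; the body of the helper sits in collapsed context. Below is a self-contained sketch of the same idea, reading the label names from the checkpoint's config rather than hard-coding them; the helper body is a reconstruction, not the card's exact code.

```python
# Sketch: manual inference with AutoModelForSequenceClassification.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_name = "AnkitAI/deberta-v3-small-base-emotions-classifier"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name)

def predict_emotion(text: str) -> str:
    inputs = tokenizer(text, return_tensors="pt", truncation=True)
    with torch.no_grad():
        logits = model(**inputs).logits
    predicted_id = int(logits.argmax(dim=-1))
    return model.config.id2label[predicted_id]  # label names come from the checkpoint config

text = "I am very happy today!"
emotion = predict_emotion(text)
print("Detected Emotion:", emotion)
```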
 
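The hyperparameters listed in the Training section above (learning rate 2e-5, per-device batch size 4, weight decay 0.01, per-epoch evaluation, 20 epochs) map directly onto `transformers.TrainingArguments`. The sketch below is for anyone reproducing the fine-tune; the output directory is a placeholder and any argument not listed on the card is left at its default.

```python
# Sketch: the card's listed hyperparameters expressed as TrainingArguments.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="deberta-v3-small-emotions",  # placeholder path
    learning_rate=2e-5,
    per_device_train_batch_size=4,
    weight_decay=0.01,
    evaluation_strategy="epoch",  # newer transformers versions call this eval_strategy
    num_train_epochs=20,
)
```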