Update README.md
README.md
@@ -14,12 +14,27 @@ of labels that yielded better predictions (see notebook [here](https://www.kaggl

 * issue
 * feature request
-* question
+* question

 ## Training data

-
+* 15k GitHub issue titles ("unlabeled_titles_simple.txt")
+* Hypothesis template used: "This request is a {}"
+* Teacher model used: valhalla/distilbart-mnli-12-1
+* Student model used: distilbert-base-uncased
+
+## Results
+
+Agreement of student and teacher predictions: **94.82%**
+
+See [this notebook](https://www.kaggle.com/code/antoinemacia/zero-shot-classifier-for-bug-analysis/edit) for more info on the feature engineering choices made.
+
+
+## How to train using your own dataset
+* Download the training dataset from https://www.kaggle.com/datasets/anmolkumar/github-bugs-prediction
+* Modify and run convert.py, updating the paths, to convert the dataset to a CSV
+* Run distill.py with the CSV file (see [here](https://github.com/huggingface/transformers/tree/main/examples/research_projects/zero-shot-distillation) for more info)

 ## Acknowledgements
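For readers unfamiliar with the zero-shot setup named in the Training data section above: the teacher scores each title against the hypothesis template filled in with each candidate label. A minimal sketch using the teacher model, labels, and template from the diff (the sample title is made up; everything else follows the standard Hugging Face pipeline API):

```python
from transformers import pipeline

# Teacher model and hypothesis template as listed under "Training data".
classifier = pipeline(
    "zero-shot-classification",
    model="valhalla/distilbart-mnli-12-1",
)

labels = ["issue", "feature request", "question"]

# Each label is slotted into "This request is a {}" and scored via NLI entailment.
result = classifier(
    "Add dark mode support to the settings page",  # made-up example title
    candidate_labels=labels,
    hypothesis_template="This request is a {}",
)
print(result["labels"][0], result["scores"][0])  # top label and its score
```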
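The agreement figure under Results is, presumably as in the linked distillation example, the fraction of titles on which the student's top predicted label matches the teacher's. A trivial sketch of that metric (names illustrative):

```python
def agreement(teacher_preds: list[str], student_preds: list[str]) -> float:
    """Fraction of examples where student and teacher predict the same label."""
    assert len(teacher_preds) == len(student_preds)
    matches = sum(t == s for t, s in zip(teacher_preds, student_preds))
    return matches / len(teacher_preds)  # 0.9482 corresponds to the 94.82% above
```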
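convert.py itself is not part of this commit, so as a hedged sketch of the conversion step only: assuming the downloaded Kaggle dump is a JSON array of records with a `title` field (the file names and the field name here are assumptions, so check the actual download), the conversion might look like this:

```python
import csv
import json

INPUT_JSON = "embold_train.json"     # assumed name of the downloaded Kaggle file
OUTPUT_CSV = "unlabeled_titles.csv"  # assumed path that distill.py will read

with open(INPUT_JSON, encoding="utf-8") as f:
    records = json.load(f)  # assumed: a list of dicts with a "title" key

with open(OUTPUT_CSV, "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f)
    writer.writerow(["title"])
    for rec in records:
        title = (rec.get("title") or "").strip()
        if title:
            writer.writerow([title])
```

distill.py then runs on that CSV; the linked Hugging Face zero-shot-distillation example documents the exact command-line arguments its script expects.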