---
base_model: sentence-transformers/paraphrase-mpnet-base-v2
library_name: setfit
metrics:
- accuracy
pipeline_tag: text-classification
tags:
- setfit
- sentence-transformers
- text-classification
- generated_from_setfit_trainer
widget:
- text: troubleshooting n a test results n a trouble description generator failed to start during blackout test transfer switch died before generator could start transfer switch need repair asap back power need to be wired to transfer switch history of trouble n a vendor acas problem description generator failed to start during blackout test transfer switch died before generator could start transfer switch need repair asap back power need to be wired to transfer switch special access n a
- text: 1 gen with oil pressure shutdown alarm 2 genfail alarm is not showing up in site boss requestor banaag rommel requestor email rommel banaag verizonwireless com requestor phone 951 8342458
- text: troubleshooting triage category gen fail site id cvl02692 alarms cvl02692 rbs generator fail fieldreplaceableunit=sau 1 alarmport=12 2024 08 06 06 23 36 med generator verification yes history n a knowledge judgement sending to vendor to check generator dispatch strategy vendor test results triage category gen fail site id cvl02692 alarms cvl02692 rbs generator fail fieldreplaceableunit=sau 1 alarmport=12 2024 08 06 06 23 36 med generator verification yes history n a knowledge judgement sending to vendor to check generator dispatch strategy vendor trouble description rbs generator fail history of trouble triage category gen fail site id cvl02692 alarms cvl02692 rbs generator fail fieldreplaceableunit=sau 1 alarmport=12 2024 08 06 06 23 36 med generator verification yes history n a knowledge judgement sending to vendor to check generator dispatch strategy vendor vendor acas problem description rbs generator fail special access n a
- text: troubleshooting triage category rbs generator fuel leak alarm cvl08526 cvl08526 rbs generator fuel leak fieldreplaceableunit=sau 1 alarmport=23 2024 07 10 13 07 38 cvl08526 cvl08526 rbs rbs generator fuel leak fieldreplaceableunit=sau 1 alarmport=20 2024 07 10 13 05 04 mdat oremis verification generator generac baldor magnum sd30 manufacturer generac baldor magnum model sd30 status in use serial 3008406953 kw 30 prime power source no still on site yes engine perkins engine co ltd 404d 22ta manufacturer perkins engine co ltd model 404d 22ta serial gr84695u9967000g max engine kw 36 manufacturered date 2021 02 01 engine type diesel max brake hp 49 in service date 2022 07 13 fuel type ultra low sulfur diesel ulsd owner cell no repeats open related tckt active eim intrusion knowledge judgement sending to vendor to investigate and resolve gen rbs generator fuel leak condition dispatch strategy vendor test results triage category generator rbs generator fuel leak alarm cvl08526 cvl08526 rbs generator fuel leak fieldreplaceableunit=sau 1 alarmport=23 2024 07 10 13 07 38 cvl08526 cvl08526 rbs generator rbs generator fuel leak fieldreplaceableunit=sau 1 alarmport=20 2024 07 10 13 05 04 mdat oremis verification generator generac baldor magnum sd30 manufacturer generac baldor magnum model sd30 status in use serial 3008406953 kw 30 prime power source no still on site yes engine perkins engine co ltd 404d 22ta manufacturer perkins engine co ltd model 404d 22ta serial gr84695u9967000g max engine kw 36 manufacturered date 2021 02 01 engine type diesel max brake hp 49 in service date 2022 07 13 fuel type ultra low sulfur diesel ulsd owner cell no repeats open related tckt active eim intrusion knowledge judgement sending to vendor to investigate and resolve gen rbs generator fuel leak condition dispatch strategy vendor trouble description smart rbs generator fuel leak history of trouble na vendor acas problem description smart rbs generator fuel leak special access na
- text: troubleshooting triage category gen fail oss netcool alarms ccl05638 rbs generator fail fieldreplaceableunit=sau 1 alarmport=10 rbs generator fail ca daly city cell site guadalupe canyon parkway 2024 07 29 23 37 56 smart alarm y mdat verification active generac sd030 2022 d 3012298793 fixed in compound history no repeats tab no open related tickets in aots knowledge judgement sending to vendor to check gen fail dispatch strategy vendor test results triage category gen fail oss netcool alarms ccl05638 rbs generator fail fieldreplaceableunit=sau 1 alarmport=10 rbs generator fail ca daly city cell site guadalupe canyon parkway 2024 07 29 23 37 56 smart alarm y mdat verification active generac sd030 2022 d 3012298793 fixed in compound history no repeats tab no open related tickets in aots knowledge judgement sending to vendor to check gen fail dispatch strategy vendor trouble description triage category gen fail oss netcool alarms ccl05638 rbs generator fail fieldreplaceableunit=sau 1 alarmport=10 rbs generator fail ca daly city cell site guadalupe canyon parkway 2024 07 29 23 37 56 smart alarm y mdat verification active generac sd030 2022 d 3012298793 fixed in compound history no repeats tab no open related tickets in aots knowledge judgement sending to vendor to check gen fail dispatch strategy vendor history of trouble n a vendor acas problem description triage category gen fail oss netcool alarms ccl05638 rbs generator fail fieldreplaceableunit=sau 1 alarmport=10 rbs generator fail ca daly city cell site guadalupe canyon parkway 2024 07 29 23 37 56 smart alarm y mdat verification active generac sd030 2022 d 3012298793 fixed in compound history no repeats tab no open related tickets in aots knowledge judgement sending to vendor to check gen fail dispatch strategy vendor special access n a
inference: true
model-index:
- name: SetFit with sentence-transformers/paraphrase-mpnet-base-v2
  results:
  - task:
      type: text-classification
      name: Text Classification
    dataset:
      name: Unknown
      type: unknown
      split: test
    metrics:
    - type: accuracy
      value: 0.6666666666666666
      name: Accuracy
---

# SetFit with sentence-transformers/paraphrase-mpnet-base-v2

This is a [SetFit](https://github.com/huggingface/setfit) model that can be used for Text Classification. This SetFit model uses [sentence-transformers/paraphrase-mpnet-base-v2](https://huggingface.co/sentence-transformers/paraphrase-mpnet-base-v2) as the Sentence Transformer embedding model. A [LogisticRegression](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html) instance is used for classification.

The model has been trained using an efficient few-shot learning technique that involves:

1. Fine-tuning a [Sentence Transformer](https://www.sbert.net) with contrastive learning.
2. Training a classification head with features from the fine-tuned Sentence Transformer.
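As an illustration of those two steps, here is a minimal training sketch using the `setfit` `Trainer` (assuming SetFit 1.0.x, the version listed under Framework Versions below). The hyperparameter values mirror the "Training Hyperparameters" section of this card; the two example tickets and their labels are invented placeholders, not the actual training data.

```python
from datasets import Dataset
from setfit import SetFitModel, Trainer, TrainingArguments

# Placeholder few-shot training data (illustrative tickets, not the real dataset).
train_dataset = Dataset.from_dict({
    "text": [
        "rbs generator fail sending to vendor to check generator dispatch strategy vendor",
        "gen fail alarm cleared no dispatch required",
    ],
    "label": [1, 0],
})

# Body: paraphrase-mpnet-base-v2; head: a scikit-learn LogisticRegression (SetFit's default).
model = SetFitModel.from_pretrained("sentence-transformers/paraphrase-mpnet-base-v2")

# Values mirror the "Training Hyperparameters" section of this card.
args = TrainingArguments(
    batch_size=8,
    num_epochs=1,
    num_iterations=3,
    body_learning_rate=2e-5,
    head_learning_rate=2e-5,
    sampling_strategy="oversampling",
    warmup_proportion=0.1,
    seed=42,
)

trainer = Trainer(model=model, args=args, train_dataset=train_dataset, metric="accuracy")
# Step 1: contrastive fine-tuning of the Sentence Transformer body.
# Step 2: fitting the LogisticRegression head on the fine-tuned embeddings.
trainer.train()

model.save_pretrained("my-setfit-model")
```

With an `eval_dataset` passed to the `Trainer`, `trainer.evaluate()` returns the accuracy metric of the kind reported in the Evaluation section below.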
## Model Details

### Model Description
- **Model Type:** SetFit
- **Sentence Transformer body:** [sentence-transformers/paraphrase-mpnet-base-v2](https://huggingface.co/sentence-transformers/paraphrase-mpnet-base-v2)
- **Classification head:** a [LogisticRegression](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html) instance
- **Maximum Sequence Length:** 512 tokens
- **Number of Classes:** 2 classes

### Model Sources
- **Repository:** [SetFit on GitHub](https://github.com/huggingface/setfit)
- **Paper:** [Efficient Few-Shot Learning Without Prompts](https://arxiv.org/abs/2209.11055)
- **Blogpost:** [SetFit: Efficient Few-Shot Learning Without Prompts](https://huggingface.co/blog/setfit)

### Model Labels
| Label | Examples |
|:------|:---------|
| 1     |          |
| 0     |          |

## Evaluation

### Metrics
| Label   | Accuracy |
|:--------|:---------|
| **all** | 0.6667   |

## Uses

### Direct Use for Inference

First install the SetFit library:

```bash
pip install setfit
```

Then you can load this model and run inference.

```python
from setfit import SetFitModel

# Download from the 🤗 Hub
model = SetFitModel.from_pretrained("edwsiew/phantom-dispatch-01")
# Run inference
preds = model("1 gen with oil pressure shutdown alarm 2 genfail alarm is not showing up in site boss requestor banaag rommel requestor email rommel banaag verizonwireless com requestor phone 951 8342458")
```

A batched variant of this call, which also exposes class probabilities from the LogisticRegression head, is sketched after the citation at the end of this card.

## Training Details

### Training Set Metrics
| Training set | Min | Median   | Max |
|:-------------|:----|:---------|:----|
| Word count   | 3   | 182.3273 | 915 |

| Label | Training Sample Count |
|:------|:----------------------|
| 0     | 14                    |
| 1     | 41                    |

### Training Hyperparameters
- batch_size: (8, 8)
- num_epochs: (1, 1)
- max_steps: -1
- sampling_strategy: oversampling
- num_iterations: 3
- body_learning_rate: (2e-05, 2e-05)
- head_learning_rate: 2e-05
- loss: CosineSimilarityLoss
- distance_metric: cosine_distance
- margin: 0.25
- end_to_end: False
- use_amp: False
- warmup_proportion: 0.1
- seed: 42
- eval_max_steps: -1
- load_best_model_at_end: False

### Training Results
| Epoch  | Step | Training Loss | Validation Loss |
|:------:|:----:|:-------------:|:---------------:|
| 0.0238 | 1    | 0.2379        | -               |

### Framework Versions
- Python: 3.12.0
- SetFit: 1.0.3
- Sentence Transformers: 3.0.1
- Transformers: 4.39.0
- PyTorch: 2.4.0+cu121
- Datasets: 2.21.0
- Tokenizers: 0.15.2

## Citation

### BibTeX
```bibtex
@article{https://doi.org/10.48550/arxiv.2209.11055,
    doi = {10.48550/ARXIV.2209.11055},
    url = {https://arxiv.org/abs/2209.11055},
    author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
    keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences},
    title = {Efficient Few-Shot Learning Without Prompts},
    publisher = {arXiv},
    year = {2022},
    copyright = {Creative Commons Attribution 4.0 International}
}
```
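As a supplement to the "Direct Use for Inference" section above, the sketch below (assuming SetFit 1.0.x; the ticket texts are invented examples in the same raw, lowercased format as the widget examples) shows batched prediction and per-class probabilities, which are available because the classification head is a LogisticRegression.

```python
from setfit import SetFitModel

model = SetFitModel.from_pretrained("edwsiew/phantom-dispatch-01")

# Invented example tickets, formatted like the widget examples above.
tickets = [
    "rbs generator fail sending to vendor to check generator dispatch strategy vendor",
    "generator failed to start during blackout test transfer switch need repair asap",
]

preds = model.predict(tickets)        # class labels (0 or 1) for each ticket
probs = model.predict_proba(tickets)  # per-class probabilities from the LogisticRegression head
print(preds, probs)
```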