
ayatsuri/academic-ai-detector

This model is a fine-tuned version of distilbert/distilbert-base-uncased on the NicolaiSivesind/human-vs-machine dataset. It achieves the following best results on the training and evaluation sets (a minimal inference sketch follows the list):

  • Train Loss: 0.0910
  • Validation Loss: 0.0326
  • Train Accuracy: 0.9937
  • Train Recall: 0.9927
  • Train Precision: 0.9947
  • Train F1: 0.9937
  • Validation Accuracy: 0.99
  • Validation Recall: 0.986
  • Validation Precision: 0.9940
  • Validation F1: 0.9900
  • Epoch: 0
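
The card does not include a usage snippet, so the following is a minimal sketch of one way to query the model with the transformers text-classification pipeline. The framework="tf" argument and the printed label names are assumptions, based on the TensorFlow framework versions listed further down.

# Minimal inference sketch (assumption: the hosted checkpoint ships TensorFlow weights).
from transformers import pipeline

detector = pipeline(
    "text-classification",
    model="ayatsuri/academic-ai-detector",
    framework="tf",  # assumption: load the TF/Keras weights
)

text = "Large language models have reshaped academic writing workflows."
print(detector(text))
# e.g. [{'label': 'LABEL_1', 'score': 0.99}] -- label names depend on the model config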

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

The model was fine-tuned and evaluated on the NicolaiSivesind/human-vs-machine dataset (see the summary above).

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a Keras reconstruction of the optimizer appears after the list):

  • optimizer: {'name': 'Adam', 'weight_decay': None, 'clipnorm': None, 'global_clipnorm': None, 'clipvalue': None, 'use_ema': False, 'ema_momentum': 0.99, 'ema_overwrite_frequency': None, 'jit_compile': True, 'is_legacy_optimizer': False, 'learning_rate': {'module': 'keras.optimizers.schedules', 'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 2e-05, 'decay_steps': 2625, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}, 'registered_name': None}, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False}
  • training_precision: float32
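
Expressed as Keras code, the optimizer configuration above would look roughly like the sketch below. The schedule values (2e-05 decaying to 0.0 over 2625 steps) and the Adam settings come from the config dump; this is a reconstruction, not the original training script.

# Sketch of the optimizer described above, rebuilt with Keras (TensorFlow 2.15).
import tensorflow as tf

lr_schedule = tf.keras.optimizers.schedules.PolynomialDecay(
    initial_learning_rate=2e-05,
    decay_steps=2625,
    end_learning_rate=0.0,
    power=1.0,
    cycle=False,
)

optimizer = tf.keras.optimizers.Adam(
    learning_rate=lr_schedule,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-08,
    amsgrad=False,
    jit_compile=True,
)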

Training results

Set          Loss     Accuracy   Recall   Precision   F1
Train        0.0910   0.9937     0.9927   0.9947      0.9937
Validation   0.0326   0.9900     0.9860   0.9940      0.9900
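
For reference, accuracy, recall, precision, and F1 like those tabulated above can be computed from label predictions with scikit-learn. The snippet below is a generic sketch with hypothetical labels (it assumes the binary human-vs-machine labeling, with 1 = machine-generated), not the card's evaluation code.

# Generic metric computation sketch with hypothetical labels.
from sklearn.metrics import accuracy_score, f1_score, precision_score, recall_score

y_true = [0, 1, 1, 0, 1]  # hypothetical gold labels (1 = machine-generated)
y_pred = [0, 1, 0, 0, 1]  # hypothetical model predictions

print("Accuracy :", accuracy_score(y_true, y_pred))
print("Recall   :", recall_score(y_true, y_pred))
print("Precision:", precision_score(y_true, y_pred))
print("F1       :", f1_score(y_true, y_pred))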

Framework versions

  • Transformers 4.41.1
  • TensorFlow 2.15.0
  • Datasets 2.19.1
  • Tokenizers 0.19.1

Citation

Please use the following citation:

@misc{ayatsuri24,
  author    = { Bagas Nuriksan },
  title     = { Academic AI Detector },
  url       = { https://huggingface.co/ayatsuri/academic-ai-detector },
  year      = 2024,
  publisher = { Hugging Face }
}