---
license: mit
datasets:
  - chloeliu/reddit_nosleep_posts
language:
  - en
tags:
  - fun
  - horror
  - writing
widget:
  - text: '[WP] We don''t go to ravenholm anymore [RESPONSE] '
    example_title: '[WP] We don''t go to ravenholm anymore [RESPONSE] '
co2_eq_emissions:
  emissions: 60
  source: https://mlco2.github.io/impact/#compute
  training_type: fine-tuning
  geographical_location: Oregon, USA
  hardware_used: 1 T4, Google Colab
---

# GPT-NoSleep-355m

A fine-tuned version of GPT2-Medium trained on the `reddit_nosleep_posts` dataset (linked above).

## Training Procedure

This model was trained on the `reddit_nosleep_posts` dataset using the "Happy Transformer" library on Google Colab, for X epochs with a learning rate of 1e-2.
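Generation follows the `[WP] … [RESPONSE]` prompt format shown in the widget above. A minimal sketch using the `transformers` text-generation pipeline; the repo id `DarwinAnim8or/GPT-NoSleep-355m` and the exact tag format are assumptions inferred from this card:

```python
def build_prompt(writing_prompt: str) -> str:
    # Assumption: training pairs a post title after "[WP]" with the story
    # body after "[RESPONSE]", matching the widget example on this card.
    return f"[WP] {writing_prompt} [RESPONSE] "

def generate_story(writing_prompt: str) -> str:
    # Import here so build_prompt() works even without transformers installed.
    from transformers import pipeline

    # Repo id is a guess based on the card's author and model name.
    generator = pipeline("text-generation",
                         model="DarwinAnim8or/GPT-NoSleep-355m")
    out = generator(build_prompt(writing_prompt),
                    max_new_tokens=200, do_sample=True, top_p=0.9)
    return out[0]["generated_text"]
```

Calling `generate_story("We don't go to ravenholm anymore")` downloads the model on first use and returns the prompt followed by a sampled continuation.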

## Biases & Limitations

This model likely carries the same biases and limitations as the base GPT2 model it was fine-tuned from, plus strong biases from the horror-story dataset. It may well generate disturbing or offensive output.

## Intended Use

This model is meant for fun, nothing else.