A GPT-2-style neural network with 50 million parameters, trained from scratch on 16 gigabytes of Python scripts.
Made as a toy.
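
For a rough idea of what a model at this scale looks like, here is a minimal sketch using the Hugging Face `transformers` library. The hyperparameters below (vocabulary size, hidden size, layer and head counts) are assumptions chosen to land near 50 million parameters, not the actual training configuration.

```python
# Hypothetical configuration sketch: values are chosen to approximate a
# ~50M-parameter GPT-2-style model; the real hyperparameters may differ.
from transformers import GPT2Config, GPT2LMHeadModel

config = GPT2Config(
    vocab_size=32000,   # assumed tokenizer size for Python source code
    n_positions=1024,   # context length in tokens
    n_embd=512,         # hidden (embedding) size
    n_layer=10,         # number of transformer blocks
    n_head=8,           # attention heads per block
)

# Randomly initialized weights, i.e. trained "from scratch" rather than fine-tuned.
model = GPT2LMHeadModel(config)
print(f"{model.num_parameters() / 1e6:.1f}M parameters")
```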