[P] This Story Does Not Exist
I took GPT-2-small, fine-tuned it with r/WritingPrompts data, and put it online.
Try it out here: https://www.thisstorydoesnotexist.com/
Shameless Self-Promotion:
Please follow me on Twitter for updates, and donate on my Patreon to help keep the site online.
Technical Details:
Training was done with https://github.com/nshepperd/gpt-2, with some minor modifications.
Training used an effective batch size of 512 (a micro-batch of 2, accumulated 256 times) for 1500 iterations, using Adam with lr=1e-5.
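For anyone curious what that configuration looks like in code: the actual run used the nshepperd TensorFlow repo above, but here is a minimal PyTorch/Hugging Face sketch of the same effective setup (micro-batch 2, 256 accumulation steps for an effective batch of 512, Adam at lr=1e-5, 1500 optimizer steps). The data-loading helper is hypothetical; it just stands in for tokenized prompt+story sequences from r/WritingPrompts.

```python
# Sketch only: the original used nshepperd/gpt-2 (TensorFlow), not this code.
# `load_writingprompts_batches` is a hypothetical helper yielding micro-batches
# of 2 tokenized prompt+story sequences as LongTensors.
import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

MICRO_BATCH = 2      # sequences per forward pass
ACCUM_STEPS = 256    # gradient accumulation -> effective batch of 512
TOTAL_ITERS = 1500   # optimizer steps, as in the post

device = "cuda" if torch.cuda.is_available() else "cpu"
tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")          # GPT-2-small
model = GPT2LMHeadModel.from_pretrained("gpt2").to(device)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-5)

model.train()
batches = load_writingprompts_batches(tokenizer, MICRO_BATCH)  # assumed helper

for step in range(TOTAL_ITERS):
    optimizer.zero_grad()
    for _ in range(ACCUM_STEPS):
        input_ids = next(batches).to(device)          # shape [MICRO_BATCH, seq_len]
        loss = model(input_ids, labels=input_ids).loss
        (loss / ACCUM_STEPS).backward()               # average over the full effective batch
    optimizer.step()
```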
Samples are generated with top_k=50, temperature=0.95.
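And a corresponding sampling sketch with those parameters, again via Hugging Face rather than the original repo's sample script; the prompt format and length cap are assumptions, and in practice you would load the fine-tuned checkpoint rather than stock GPT-2.

```python
# Sketch of sampling with top_k=50, temperature=0.95 (the values from the post).
import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

device = "cuda" if torch.cuda.is_available() else "cpu"
tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2").to(device).eval()  # fine-tuned weights in practice

prompt = "[WP] This story does not exist."   # hypothetical prompt format
input_ids = tokenizer(prompt, return_tensors="pt").input_ids.to(device)

output = model.generate(
    input_ids,
    do_sample=True,        # sample instead of greedy decoding
    top_k=50,              # keep only the 50 most likely next tokens
    temperature=0.95,      # slightly flatten the distribution
    max_length=512,        # assumed length cap; not stated in the post
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```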
The webservice runs on two n1 instances on GCP, an n1 instance with a K80 (preemptible), and a desktop computer with a 1080Ti in my basement.
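The post doesn't say how the frontends hand work to the GPU machines, so take this as one possible arrangement: a minimal sketch where an n1 frontend forwards generation requests to whichever GPU backend is reachable (useful since the K80 box is preemptible and can vanish). The backend URLs, the /generate endpoint, and the failover logic are all assumptions.

```python
# Hypothetical frontend dispatcher; not the actual architecture behind the site.
import requests
from flask import Flask, jsonify, request

app = Flask(__name__)
BACKENDS = [
    "http://k80-worker:8000/generate",       # preemptible GCP GPU instance (may disappear)
    "http://basement-1080ti:8000/generate",  # home GPU box
]

@app.post("/api/story")
def story():
    payload = {"prompt": request.json.get("prompt", "")}
    for url in BACKENDS:                     # simple failover: try each backend in turn
        try:
            r = requests.post(url, json=payload, timeout=30)
            r.raise_for_status()
            return jsonify(r.json())
        except requests.RequestException:
            continue
    return jsonify({"error": "no GPU backend available"}), 503
```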
If you have any questions about this project, please ask! 🙂
submitted by /u/eukaryote31