I took GPT-2-small, fine-tuned it on r/WritingPrompts data, and put it online.
Try it out here: https://www.thisstorydoesnotexist.com/
Shameless Self-Promotion:
Please follow me on Twitter for updates, and donate on Patreon to help keep things online.
Technical Details:
Training was done with https://github.com/nshepperd/gpt-2, with some minor modifications.
I trained with an effective batch size of 512 (micro-batches of 2, with gradients accumulated over 256 steps) for 1500 iterations, using Adam with lr=1e-5.
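For anyone curious how the gradient accumulation works, here's a rough PyTorch/HuggingFace equivalent of that setup. This is just an illustrative sketch, not the actual training code (which is the TensorFlow repo above), and the micro_batches() data loader is a placeholder since the post doesn't describe the preprocessing:

```python
import torch
from torch.optim import Adam
from transformers import GPT2LMHeadModel, GPT2Tokenizer

model = GPT2LMHeadModel.from_pretrained("gpt2")    # GPT-2-small
tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
optimizer = Adam(model.parameters(), lr=1e-5)

ACCUM_STEPS = 256  # micro-batch of 2, accumulated 256x -> effective batch 512

def micro_batches():
    """Placeholder: yields (2, seq_len) tensors of token ids. The real run
    would yield tokenized r/WritingPrompts prompt/story text instead."""
    while True:
        yield torch.randint(0, tokenizer.vocab_size, (2, 1024))

batches = micro_batches()
for iteration in range(1500):
    optimizer.zero_grad()
    for _ in range(ACCUM_STEPS):
        ids = next(batches)
        # Language-modeling loss; dividing by ACCUM_STEPS makes the
        # accumulated gradient an average over the 256 micro-batches.
        loss = model(input_ids=ids, labels=ids).loss / ACCUM_STEPS
        loss.backward()  # gradients accumulate in the parameters' .grad
    optimizer.step()
```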
Samples are generated with top_k=50, temperature=0.95.
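In case the sampling parameters are unfamiliar, here's a generic sketch of top-k sampling with temperature (not the repo's exact sampler):

```python
import torch
import torch.nn.functional as F

def sample_next_token(logits, top_k=50, temperature=0.95):
    """Pick the next token id from the model's logits for the last position."""
    logits = logits / temperature                  # <1.0 sharpens the distribution
    top_vals, top_idx = torch.topk(logits, top_k)  # keep only the 50 best tokens
    probs = F.softmax(top_vals, dim=-1)            # renormalize over the top-k
    choice = torch.multinomial(probs, num_samples=1)
    return top_idx[choice]
```

With HuggingFace transformers, model.generate(ids, do_sample=True, top_k=50, temperature=0.95) does the same thing out of the box.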
The web service runs on two n1 instances on GCP, an n1 instance with a K80 GPU (preemptible), and a desktop with a 1080 Ti in my basement.
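None of the serving code is in the post, but purely as an illustration, a frontend could forward generation requests to whichever GPU backend is up along these lines (the hostnames, /generate endpoint, and retry logic here are all made up):

```python
import itertools
import requests
from flask import Flask, request, jsonify

app = Flask(__name__)

# Hypothetical backend pool: the preemptible K80 instance and the 1080 Ti box.
BACKENDS = itertools.cycle([
    "http://gpu-k80.internal:8000",
    "http://basement-1080ti.example:8000",
])

@app.route("/story", methods=["POST"])
def story():
    prompt = request.json["prompt"]
    for _ in range(2):  # retry once in case a preemptible backend is down
        backend = next(BACKENDS)
        try:
            r = requests.post(f"{backend}/generate",
                              json={"prompt": prompt}, timeout=60)
            return jsonify(r.json())
        except requests.RequestException:
            continue
    return jsonify({"error": "no backend available"}), 503

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```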
If you have any questions about this project, please ask! 🙂
submitted by /u/eukaryote31