[D] Jokes generated via GPT-2

Hi all, I have had a good play with fine-tuning the GPT-2 117M model on various datasets. I’m impressed with even this “dumbed-down” version’s ability to generate human-like text.

I got it to generate some Q&A-style jokes, and a few have me scratching my head as to how they came about.

I’d love your input on what could be going on inside GPT-2 to produce these associations.

I fine-tuned using this notebook but with a different dataset. The jokes below were generated using unconditional sampling during training.
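For reference, here is a minimal sketch of this kind of fine-tune-and-sample workflow, assuming the gpt-2-simple library (the notebook itself isn’t linked here, so the exact calls, dataset name, and hyperparameters are illustrative rather than the post’s actual setup):

```python
# Minimal sketch: fine-tune GPT-2 117M on a jokes corpus and print unconditional
# samples during training, assuming the gpt-2-simple library. Dataset name and
# hyperparameters are illustrative, not the original notebook's exact values.
import gpt_2_simple as gpt2

gpt2.download_gpt2(model_name="117M")        # fetch the pretrained 117M checkpoint

sess = gpt2.start_tf_sess()
gpt2.finetune(sess,
              dataset="jokes.txt",           # plain-text corpus of Q&A jokes (hypothetical filename)
              model_name="117M",
              steps=1000,                    # number of fine-tuning steps
              sample_every=200,              # print unconditional samples as training runs
              sample_length=100,
              save_every=500)

# Unconditional sampling: no prompt is given, so the model generates
# joke-shaped text from scratch.
gpt2.generate(sess, length=100, temperature=0.7, nsamples=5)
```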

The jokes generated were obviously not great, but there were some gems in there that seemed to be outside the norm of AI-generated humor.

In each of the cases below, the punchline words either didn’t appear in the fine-tuning corpus at all, or appeared only rarely and in very different contexts.

What do you call an obscure bird?

A falchion.

When I first read this, I thought it was funny because I assumed a falchion actually is an obscure bird. It’s not; it’s a sword. The main reference I could find on Google to a “falchion” bird is on the PPC wiki, which is a super-obscure reference. Is GPT-2 actually referencing that page, or is this a pun on “falcon”?

What do you get when you remove the battery of a car?

A carotid hemorrhage.

Again, there’s no mention of “carotid” or “hemorrhage” in the fine-tuning set. Is this maybe a car-based pun? “Carotid hemorrhage” gets about 3k results on Google, so it’s a pretty obscure phrase. Any theories as to why GPT-2 selected it? Would GPT-2 reach for “carotid” as a response to “car” in a joke?
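One way to probe the car → carotid (or falcon → falchion) hypothesis is to look at how GPT-2’s byte-pair encoding splits these words: if a rare punchline word shares a subword piece with the common setup word, a pun-like association is less surprising. A quick check, assuming the Hugging Face transformers GPT-2 tokenizer (not part of the original workflow):

```python
# Sketch: inspect GPT-2's BPE tokenization to see whether rare punchline words
# share subword pieces with common setup words. Uses the Hugging Face
# transformers GPT-2 tokenizer (an assumption; the post doesn't use this tool).
from transformers import GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")

for phrase in ["car", "carotid hemorrhage", "falcon", "falchion"]:
    print(f"{phrase!r:25} -> {tokenizer.tokenize(phrase)}")

# If, say, "carotid" gets split into a piece containing "car" plus a suffix,
# subword overlap is one plausible route to the pun-like association.
```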

Did you know, the best thing about the Olympics is the smell of urine during them.

One of the very few one-liners produced in my samples. This one has some humor, but “the best thing about the Olympics is the” returns only 9 hits on Google, and there’s nothing like this in the fine-tuning set. Any theories?

Who's the smartest game manager ever?

Eric B.

This appears to be an association with “Eric B.” (the rapper) and the idea of “the game” from rap. Again, there’s no mention of Eric B. in the fine-tuning set. This one blew my mind.

What does a cow say when it runs out of water?

You need a pump!

To a human, this joke reads as though the cow is anthropomorphized and is addressing its farmer. But how could GPT-2 get there? Again, there’s nothing like this in the fine-tuning set (almost zero “pump” references, and 17 “cow say” jokes but nothing of this nature), and I couldn’t find anything like either the question or the answer on Google.
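One way to dig into an individual joke like this is conditional sampling: feed the generated question back in as a prefix and look at what punchlines the fine-tuned model tends to produce. A sketch, again assuming the gpt-2-simple setup from the earlier example:

```python
# Sketch: probe what the fine-tuned model associates with a given setup line by
# sampling conditionally with the question as a prefix (gpt-2-simple API,
# continuing the illustrative setup from the earlier sketch).
import gpt_2_simple as gpt2

sess = gpt2.start_tf_sess()
gpt2.load_gpt2(sess)   # loads the fine-tuned checkpoint from ./checkpoint/run1

samples = gpt2.generate(sess,
                        prefix="What does a cow say when it runs out of water?",
                        length=40,
                        temperature=0.8,
                        nsamples=20,
                        return_as_list=True)

# Inspect the spread of punchlines the model reaches for given this setup.
for s in samples:
    print(s)
    print("-" * 40)
```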

Honestly, my mind is blown by these results, and I’d love any ideas as to how a Transformer-based model comes up with these kinds of associations.

These jokes are not prime-time-worthy, but they’re among the best I’ve seen from an AI, and I’m really scratching my head as to how they were generated. Any theories or discussion would be much appreciated!

submitted by /u/simiansays