Robin Sloan - Writing with the machine: GPT-2 and text generation

The videos from Roguelike Celebration 2019 are online, which means I can show them to you. Since NaNoGenMo 2019 is right around the corner, I’m going to start with one that involves text generation.

Robin Sloan ordinarily writes fiction, but for this project he trained a computer to write fiction for him. Specifically, he used GPT-2, a recent neural network from OpenAI. While many people have done interesting things with GPT-2 (such as Talk to Transformer), most of these projects fall prey to GPT-2’s biggest weakness: it’s really good at generating text that’s locally coherent, but it has no sense of a larger world. It’s like a really fancy predictive text algorithm, only ever caring about the next word.
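To make that concrete, here’s a minimal sketch of what “only caring about the next word” looks like in code, using the Hugging Face transformers library (my choice for illustration, not the tooling from the talk): the model scores candidates for the very next token and nothing more.

```python
# Minimal sketch of GPT-2 as a next-word predictor (Hugging Face transformers,
# not the setup from the talk).
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

prompt = "The knight drew her sword and"
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# The model only ranks candidates for the very next token --
# it has no plan for the sentence, let alone the story.
next_token_logits = logits[0, -1]
top = torch.topk(next_token_logits, 5)
for score, token_id in zip(top.values, top.indices):
    print(repr(tokenizer.decode(int(token_id))), float(score))
```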

Enter Robin Sloan. He fine-tuned GPT-2 on a dataset of fantasy novels (important, because the default model would produce generic prose instead of fantasy details). But more importantly, he constrained the generator to respond to carefully chosen prompts. You can see the details of how he did it in his talk, but the result is remarkable: little fantasy stories that have imaginative prose and tell a complete tale from beginning to tragic end.
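For flavor, here’s a rough sketch of the prompt-then-generate step, again using Hugging Face transformers; the fine-tuned checkpoint path, the prompt, and the sampling settings are hypothetical stand-ins, not Robin Sloan’s actual corpus or prompts.

```python
# Rough sketch of prompted generation from a fine-tuned GPT-2.
# "./gpt2-fantasy" is a hypothetical checkpoint fine-tuned on a fantasy corpus.
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("./gpt2-fantasy")
model.eval()

# A carefully chosen prompt does the structural work, pointing the model
# at a particular beat of the story rather than letting it wander.
prompt = "The last of the river-wardens knew the flood was coming, and knew she could not stop it."
inputs = tokenizer(prompt, return_tensors="pt")

output = model.generate(
    **inputs,
    max_length=200,
    do_sample=True,
    top_k=40,
    temperature=0.9,
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```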

Even though the stories are only a page long, that’s a remarkable achievement. NaNoGenMo has been inching towards that (with lots of poetic flourishes along the way), but while generators like A Time for Destiny by Cat’s Eye Technologies or the Pulp Fantasy Novel Generator by Joel Davis have been working towards longer coherence, their last-mile prose generation has involved a lot of painstaking template writing. Robin Sloan’s work here bridges the gap with a generator that manages to be surprising and poetic.