ProcJam 2019

The other procedural generation thing that’s going on right now is ProcJam 2019, a gentle jam about making things that make things. This is the 7th ProcJam, so there are tons of projects from past jams to go look at, not to mention the videos, tutorials, and zines.

I’m looking forward to seeing what people make this year!




NaNoGenMo 2019

It’s November 1st, which means that it’s also the start of National Novel Generation Month. Teach a computer to write a novel!
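
If you’ve never participated, the bar to entry is deliberately low: the only hard rule is that you write code, and that the code produces a novel of 50,000+ words. The traditional joke entry, repeating a single word, is a perfectly valid starting point. A minimal sketch in Python:

```python
# The smallest possible NaNoGenMo entry: the only hard rule is that
# the code outputs a "novel" of 50,000 or more words.
words = ["meow"] * 50_000

with open("novel.txt", "w") as f:
    f.write(" ".join(words))

print("Novel complete:", len(words), "words")
```

Everything beyond that (plot, poetry, coherence) is up to you.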

As always, some people have started posting their projects in the Issues section of the 2019 GitHub repository, though the only completed one so far is a novelization of the start of a game of hide-and-seek.

Past years have always resulted in fascinating projects, of varying degrees of poetry and cohesiveness. I’ve been excited to watch the community get more effective at generating novels, whether that means more comprehensible plots or more poetic works that make use of the project’s generativity.




Everest Pipkin - Corpora as medium: on the work of curating a poetic textual dataset

As a follow-up to Robin Sloan’s talk (where he used a custom corpus to retrain GPT-2), I think Everest Pipkin’s talk is an important complement, because their main point is that a carefully curated personal dataset can be more effective for generativity than vast but generic corpora.

A corpus (plural corpora) is the term for a collection of data, originally in a textual context, such as training corpora for natural language processing. For a lot of generative projects (particularly ones that involve machine learning) the data used matters a lot. (It’s not limited to text: Helena Sarin has made similar remarks about her visual art practice, using her own art as a corpus.)

Everest’s point is that generative text comes from other text. “Care is really important”: we should work with data that we care about. As they say, “…computational power can’t make up for a lack of argument or poetics,” which to my mind is one of the big gaps in procgen right now. It’s easy to make a sophisticated generator where no one cares about the generated artifacts because the entire thing lacks motivation and soul. Everest presents a way forward, where our care for the input leads the viewer to care about the output. 




Robin Sloan - Writing with the machine: GPT-2 and text generation

The videos from Roguelike Celebration 2019 are online, which means I can show them to you. Since NaNoGenMo 2019 is right around the corner, I’m going to start with one that involves text generation.

Robin Sloan ordinarily writes fiction, but for this project he trained a computer to write fiction for him. Specifically, he used GPT-2, a recent neural network from OpenAI. While many people have done interesting things with GPT-2 (such as Talk to Transformer), they mostly fall prey to GPT-2’s biggest weakness: it’s really good at generating text that’s locally coherent, but it doesn’t have any sense of a larger world. It’s like a really fancy predictive text algorithm, only caring about the next word.

Enter Robin Sloan. He retrained GPT-2 on a dataset of fantasy novels (important because the default model would produce generic prose instead of fantasy details). But more importantly, he constrained the generator to respond to carefully chosen prompts. You can see the details of how he did it in his talk, but the result is remarkable: little fantasy stories that have imaginative prose and tell a complete tale from beginning to tragic end.
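
To make the workflow concrete, here’s a rough sketch of the retrain-then-prompt loop using Max Woolf’s gpt-2-simple library. This is not Robin Sloan’s actual code; corpus.txt and the prompt are stand-ins for whatever you curate:

```python
import gpt_2_simple as gpt2

# Fetch the smallest released GPT-2 model (124M parameters).
gpt2.download_gpt2(model_name="124M")

sess = gpt2.start_tf_sess()

# Retrain on a hand-curated corpus. For Sloan it was fantasy novels,
# which is what pushes the output from generic prose to genre detail.
gpt2.finetune(sess, "corpus.txt", model_name="124M", steps=1000)

# Constrain the generator by seeding it with a carefully chosen prompt.
gpt2.generate(sess,
              prefix="The ranger drew her sword and",
              length=200,
              temperature=0.8)
```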

Even though the stories are only a page long, that’s a remarkable achievement. NaNoGenMo has been inching towards that (with lots of poetic flourishes along the way): generators like A Time for Destiny by Cat’s Eye Technologies or the Pulp Fantasy Novel Generator by Joel Davis have been working towards longer coherence, but the last-mile prose generation has involved a lot of painstaking template writing. Robin Sloan’s work here bridges the gap with a generator that manages to be surprising and poetic.



Sounds like Roboglyphics, Ben Wiklund’s procedural handwriting library. I posted about it way back in 2015.




Gleb

While it lacks the slick presentation, ease-of-use, and top-to-bottom generation of Vulgar, Gleb can still generate a random phonology to serve as the basis for your next conlang.
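
I can’t speak to Gleb’s internals, which weight their choices with far more linguistic knowledge, but the basic move of sampling a phoneme inventory and then building words from it is easy to sketch. A toy illustration in Python, with made-up inventories and probabilities:

```python
import random

# Toy phonology generator: sample consonant and vowel inventories,
# then build words from a simple CV(C) syllable template.
CONSONANTS = list("ptkbdgmnszlrwj") + ["tʃ", "ʃ", "ŋ"]
VOWELS = list("aeiou") + ["ə", "ɛ", "ɔ"]

def random_inventory(rng):
    consonants = rng.sample(CONSONANTS, rng.randint(8, 14))
    vowels = rng.sample(VOWELS, rng.randint(3, 6))
    return consonants, vowels

def random_word(consonants, vowels, rng):
    syllables = []
    for _ in range(rng.randint(1, 3)):
        onset = rng.choice(consonants)
        nucleus = rng.choice(vowels)
        coda = rng.choice(consonants) if rng.random() < 0.3 else ""
        syllables.append(onset + nucleus + coda)
    return "".join(syllables)

rng = random.Random(2019)
consonants, vowels = random_inventory(rng)
print("Inventory:", " ".join(consonants), "/", " ".join(vowels))
print("Words:", ", ".join(random_word(consonants, vowels, rng) for _ in range(5)))
```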

There’s a live demo of a version of it online: https://gleb.000024.org/

Alex Fink, the creator, has a git repository for the project and links to a wiki list of software tools for constructed languages.

https://github.com/alexfink/random_language/tree/master/phonology







One Page Dungeon generator

This one-page dungeon generator by Oleg Dolya (also known for the Medieval Fantasy City Generator) goes beyond the basic throw-rooms-together dungeon generator to have a focus on content. It includes descriptions and room prompts, often with implied connections between keys and locked doors.

There’s a number of ways to add more structure to content like this, including cyclic dungeon generation, but one of my favorites is a little similar to what appears to be going on here with the keys.

Without looking at the source code, the design pattern behind the keys appears to be: 1. have a locked door, and 2. place the key in a room that can be reached without crossing that locked door. Pretty simple, but it creates interesting structures for the players to discover. This can be elaborated into a chain of things that the player can encounter and piece together into an entire story.
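
That’s simple enough to sketch directly: treat the dungeon as a graph of rooms, lock one door, and flood-fill from the entrance to find which rooms are still reachable; the key can go in any of them. A hypothetical sketch, not Watabou’s actual code:

```python
import random
from collections import deque

# Rooms are nodes; doors are edges. A tiny example map:
doors = {("entrance", "hall"), ("hall", "crypt"),
         ("hall", "armory"), ("armory", "treasury")}

def reachable(start, doors, locked):
    """Breadth-first search over the room graph, never crossing the locked door."""
    adjacency = {}
    for a, b in doors:
        if {a, b} == set(locked):
            continue  # the locked door is impassable
        adjacency.setdefault(a, []).append(b)
        adjacency.setdefault(b, []).append(a)
    seen, queue = {start}, deque([start])
    while queue:
        room = queue.popleft()
        for neighbor in adjacency.get(room, []):
            if neighbor not in seen:
                seen.add(neighbor)
                queue.append(neighbor)
    return seen

# 1. Have a locked door.
locked_door = ("armory", "treasury")
# 2. Place the key somewhere reachable without crossing it.
key_room = random.choice(sorted(reachable("entrance", doors, locked_door)))
print(f"The key to the {locked_door[1]} door is hidden in the {key_room}")
```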

One place I’ve seen this used to great effect is Emily Short’s Annals of the Parrigues, with some of the footnotes: a series of footnotes is written in a way that describes an arc, and then the generator is allowed to include the next element from the arc when it feels like it.

If you wanted to make a dungeon generator like this, you could add pairs or triplets of elements to include. A simple one might be a monster, its lair, and its most recent victim. Because you are hand-writing these element groups, you can make sure there’s an implied story or relationship. If you’ve got enough of these, mixed in with the rest of the content, they will still feel interesting even if the player sees them a few times.
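
The implementation can be as simple as a list of hand-written groups scattered across separate rooms, so that the player reassembles the implied story. A quick sketch, with all of the content made up:

```python
import random

# Hand-written groups with an implied story between the members.
element_groups = [
    ["a nest of chewed bones", "a sleeping owlbear", "a half-eaten adventurer"],
    ["a shrine to a forgotten god", "a defaced idol", "a penitent ghost"],
    ["fresh claw marks", "a trail of coins", "an overturned strongbox"],
]

def scatter(group, rooms, rng):
    """Assign each element of the group to a different room."""
    return dict(zip(rng.sample(rooms, len(group)), group))

rng = random.Random()
rooms = ["crypt", "hall", "cistern", "gallery", "vault"]
for room, element in scatter(rng.choice(element_groups), rooms, rng).items():
    print(f"{room}: {element}")
```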

Another related form of content to include is a recurring motif. Right now the One Page Dungeon generator creates a lot of dying elves in corners and similar things, which a skilled Dungeon Master could spin into a story. But repeating that kind of generic content too often can feel like the generator is making a mistake. Pushing the uniqueness of the elements makes it look deliberate.

One thing that King of Dragon Pass (and Six Ages) has taught me is to not be afraid of being specific! Finding the coral-encrusted Mirror of the Dark Wizard Archronalax and the coral-encrusted throne of the Dark Wizard Archronalax will make the players start to wonder where the coral-encrusted magic wand of the Dark Wizard Archronalax is. Having some hyper-specific elements that are spread out across many generated maps gives the generator character.

Think in terms of motifs, of aesthetic signals that can tie different parts of the generated level together, of ways to create emergent resonances between independently generated elements. Which brings me to my last point for now: the simplest way to have elements in your generator work with each other is to write them with a lot of suggestion and storyful-ness. One Page Dungeon is pretty good at this already: A reinforced coffin containing a blood-spattered spear and a cursed hammer? There’s clearly a story behind that!

While there are many existing dungeon generators, there’s plenty of space left to explore: lots of room for new generators with their own character and their own ways of telling stories.

https://watabou.itch.io/one-page-dungeon



Joe Baxter-Webb, the creator behind my favorite twitter bot, a strange voyage, is looking for artists to be involved in a @str_voyage art book.

One reason that I’m so fond of the twitter bot is that it is very good at creating imagery with its brief descriptions. Many artists have already been inspired by it, and I expect many more will find it to be a rich source of mood and inspiration.

Procedural generation has been used for creativity prompts, serendipity engines, and inspiration sources from the beginning. For example, the random tables in roleplaying games aren’t there to dictate the exact outcome of the game; they are there to inject elements for the players to bounce off of and find challenging.
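
The mechanism itself is almost trivially simple, which is part of the charm; all of the craft is in writing entries evocative enough to bounce off of. A sketch of a classic random encounter table, with invented entries:

```python
import random

# A classic RPG-style random table: it injects an element for the
# players to react to rather than dictating the outcome.
encounter_table = [
    "a toppled statue, face-down in the mud",
    "a merchant who refuses to say what she is selling",
    "fresh wolf tracks circling the camp",
    "a letter addressed to one of the player characters",
    "an abandoned shrine, its candles still burning",
    "a rival adventuring party, arguing over a map",
]

print(random.choice(encounter_table))
```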




Generative and Possibility Space

This tutorial by Mike Cook is a great introduction to two foundational concepts in procedural generation.

Many generators can produce a lot of possible artifacts–they have large generative spaces. But that doesn’t mean that they can generate all of the possible artifacts–there might be things in the possibility space that we can describe but that the generator can’t create.
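
Here’s a toy example of the distinction (mine, not from the tutorial): suppose the possibility space is every four-letter lowercase string, but our generator only ever emits consonant-vowel-consonant-vowel words. Its generative space is still large, but there are artifacts we can describe, like “glrk”, that it can never produce.

```python
import random

CONSONANTS = "bcdfghjklmnpqrstvwxyz"
VOWELS = "aeiou"

# Possibility space: every 4-letter lowercase string (26^4 = 456,976).
possibility_space_size = 26 ** 4

# Generative space: this generator can only ever emit CVCV words.
def generate(rng):
    return (rng.choice(CONSONANTS) + rng.choice(VOWELS)
            + rng.choice(CONSONANTS) + rng.choice(VOWELS))

generative_space_size = (len(CONSONANTS) * len(VOWELS)) ** 2

rng = random.Random()
print("Sample artifacts:", [generate(rng) for _ in range(5)])
print(f"Generative space: {generative_space_size:,} of "
      f"{possibility_space_size:,} possible strings")
# "glrk" is in the possibility space, but no seed will ever produce it.
```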

Mike Cook discusses those distinctions and what it means for the generators that we build.

These are pretty foundational aspects of generativity–there are other aspects of generativity that also shape our experience of the generated artifacts. For example, even when we have a large generative space, our perceptual space might be much smaller, making everything look like similar bowls of oatmeal. But, of course, the point of this tutorial is to get you started in thinking about generative spaces. There’s lots to learn about procgen–I’m definitely still learning new things!

http://www.possibilityspace.org/tutorial-generative-possibility-space/




formulanimations :: happy jumping

The video above is captured from a shader running in real-time. Making it involved no 3d models, no textures, no polygons, no global illumination. It’s all just shader math that procedurally generates the happy creature bouncing along.
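
The core trick is the signed distance function: a little formula that reports how far any point in space is from the nearest surface, which a raymarcher can step along until it hits something. Here’s a minimal CPU sketch of the idea in Python; it’s a world away from Quilez’s shader, but it’s the same principle:

```python
import math

def sphere_sdf(x, y, z, radius=1.0):
    """Signed distance from the point (x, y, z) to a sphere at the origin."""
    return math.sqrt(x * x + y * y + z * z) - radius

def march(ox, oy, oz, dx, dy, dz, max_steps=64, eps=1e-3):
    """Sphere tracing: step along the ray by the distance the SDF reports."""
    t = 0.0
    for _ in range(max_steps):
        d = sphere_sdf(ox + dx * t, oy + dy * t, oz + dz * t)
        if d < eps:
            return t  # hit the surface
        t += d
        if t > 10.0:
            break
    return None  # missed everything

# Render a tiny ASCII image: camera at z = -3, looking down +z.
for row in range(20):
    line = ""
    for col in range(40):
        u = (col / 40.0) * 2 - 1
        v = -((row / 20.0) * 2 - 1)
        length = math.sqrt(u * u + v * v + 1)
        hit = march(0, 0, -3, u / length, v / length, 1 / length)
        line += "#" if hit is not None else "."
    print(line)
```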

Inigo Quilez (who has been mentioned around here before) recently quit his job and has been working on “graphics and creativity”. Judging by the results, this was a good call.