Yesterday, Rock Paper Shotgun posted an epic interview that digs deep into the level generation in Brogue, one of the roguelikes consistently cited as a pinnacle of modern roguelike design. Graham Smith’s interview with Brian Walker is an approachable look at how its procedural generation contributes to that reputation.

Brogue’s level generation really stands out for its density of interesting effects. If you want to learn how a developer approaches procedural generation, or want ideas for your own generators, you should definitely read this interview.








Audiosurf (2008)

Sometimes, unusual input can be the foundation for the entire game.

Audiosurf takes any arbitrary music track and turns it into a race course. This gives the player an immediate connection to the racetrack: as you race in response to your favorite song, you both anticipate the part that comes next and are curious about how the game is going to translate it.

It seems somewhat obvious, now that there are many games that use audio inputs, but the idea of automatically translating a song into a responsive racing track is the kind of off-the-wall use of procedural generation that can create new genres. This shouldn’t be confined to turning music into race tracks: Imagine exploring a dungeon generated from a photo or music created from your Twitter timeline.
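To make the core idea concrete: this is not Audiosurf’s actual algorithm (which analyzes much richer features of the audio), just a toy sketch of the basic move, mapping the loudness of successive slices of a waveform onto track geometry. The segment size and the fake “song” are my own invented placeholders.

```python
import math

def track_from_waveform(samples, segment_size=4):
    """Map amplitude samples (in [-1, 1]) to a sequence of track
    heights: louder passages of the song become taller terrain."""
    track = []
    for i in range(0, len(samples) - segment_size + 1, segment_size):
        window = samples[i:i + segment_size]
        # RMS loudness of this slice of the song
        rms = math.sqrt(sum(s * s for s in window) / segment_size)
        track.append(round(rms, 3))
    return track

# A fake "song": a quiet intro swelling into a loud chorus.
song = [0.1 * math.sin(i) for i in range(16)] + [0.9 * math.sin(i) for i in range(16)]
heights = track_from_waveform(song)
```

The same trick generalizes: swap the waveform for pixel rows of a photo and you have the dungeon-from-a-photo idea above.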




More No Man’s Sky hype in the New Yorker. This time about the audio, which hasn’t been discussed as much as the visuals. Apparently, they built a simulation of a vocal tract, reminiscent of pre-electronic speaking machines.

Procedurally generated audio doesn’t get quite as much attention as procedurally generated visuals, which is a pity, because I think it can be even more challenging to get something that sounds good. It’s not something you can just throw in at the last minute. The audio experience is intrinsically linked to time (there’s no equivalent of a still image) and the wrong combinations can be accidentally harsh and painful in a way that visual images seldom reach.
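Here’s a minimal sketch of why “wrong combinations” hurt in audio specifically (this has nothing to do with No Man’s Sky’s actual vocal-tract simulation; the frequencies and durations are arbitrary). Two sine waves a musical fifth apart blend smoothly; two a semitone apart beat against each other audibly, in a way that two nearby colors on screen never would.

```python
import math

def mix(freqs, sample_rate=8000, duration=0.5):
    """Render a chord as raw samples by summing sine waves and
    normalizing: about the simplest possible procedural audio."""
    n = int(sample_rate * duration)
    return [
        sum(math.sin(2 * math.pi * f * t / sample_rate) for f in freqs) / len(freqs)
        for t in range(n)
    ]

consonant = mix([220.0, 330.0])  # a perfect fifth: blends smoothly
harsh = mix([220.0, 231.0])      # a semitone apart: audible beating
```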

(via https://soundcloud.com/newyorker/no-mans-sky-creatures)




the procedural generation moth, via mothgenerator




hitboxteam:

Prop/nicknack blueprints system is nearly finished. This shelf is populated from only 8 different nicknack blueprints so far. Each blueprint has many random values to add even more variance.




knock knock [tab]

Sometimes just jamming two concepts together can produce a completely new and potentially hilarious result. Consider @autocompletejok, a Twitter bot created by the enigmatic @deathmtn. It takes a Google autocomplete result and turns it into a knock-knock joke.

Obviously, this was never the intended purpose of autocomplete, but the bot manages to be genuinely funny on occasion, or at least dadaist. It becomes a kind of free association, with the occasional horrifying glimpse into what happens when you merge the searches of our collective unconscious with an AI that is trying to understand what we’re looking for. A bit reminiscent of deepdream, really.
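The jamming-together itself is almost trivially simple, which is part of the joke. I don’t know the bot’s exact template, so this sketch uses the classic knock-knock format and a hand-picked phrase standing in for a real autocomplete result:

```python
def knock_knock(phrase):
    """Turn a phrase (imagine it came from autocomplete) into a
    knock-knock joke: the first word is the setup, the whole
    phrase is the punchline."""
    setup = phrase.split()[0].capitalize()
    return "\n".join([
        "Knock knock.",
        "Who's there?",
        setup + ".",
        setup + " who?",
        phrase.capitalize() + "?",
    ])

joke = knock_knock("why do cats knead")
```

All the actual humor is supplied by the found text; the generator just frames it.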







@mothgenerator

It’s apparently National Moth Week, which among other things involves an ongoing citizen-science project to track moth distribution. Important stuff. Around here, though, we’re going to talk about procedurally generated moths.

Loren Schmidt and everest pipkin collaborated to create a moth-generating bot that’s already received a bit of attention. They recently added a feature that will generate a moth for you based on phrases you tweet at it.

From my point of view, the moths are impressively varied while still being moth-like. While insects are easier to construct out of whole cloth than mammals, it’s still not simple to keep them looking convincing.
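One trick that helps generated insects read as insects, rather than noise, is bilateral symmetry: generate one random half and mirror it. The real mothgenerator is far more sophisticated than this, but here is a hypothetical ASCII-art sketch of just that one idea:

```python
import random

def moth(width=9, height=5, seed=0):
    """Generate a tiny ASCII 'moth': random left half, a central
    body column, and a mirrored right half. The symmetry does most
    of the work of making the result look like a creature."""
    rng = random.Random(seed)
    half = width // 2
    rows = []
    for _ in range(height):
        left = [rng.choice(" .#") for _ in range(half)]
        rows.append("".join(left) + "#" + "".join(reversed(left)))
    return "\n".join(rows)

print(moth(seed=7))
```

Varying the half freely while keeping the mirror fixed is one way to get that “impressively varied while still moth-like” quality.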




Proteus (2013)

Wait, didn’t we just talk about Proteus?

Well, yes. But that post focused on the visual and interactive aesthetics of Proteus, and this time I want to talk about the other half of the game: the sound.

Listen to the recording above. It’s a complete four-act symphony, unique to my last playthrough of Proteus. There will never be another one exactly like it. It was created interactively, as I walked through the world and listened to the audible reactions.

If it were just about the visuals, I don’t think the game Ed Key and David Kanaga created would have quite the impact that it does. It’s too easy to glide past a sunset when you’re just using your eyes. Tying the world to the music that surrounds you lets you slow down and absorb it in a fresh new way.

(via https://soundcloud.com/procedural-generation/proteus-symphony)






Substrate (2003)

Shortly after the turn of the century, there was a new programming environment called Processing that was uniquely tailored to producing visualizations and generative art. Jared Tarbell was one of the artists who discovered it early. Jared had previously done many interactive generative artworks in Flash, releasing them as open source. He did the same with his Processing projects, including the one you see above: Substrate.

Substrate is, I think, one of his most significant works. It’s pretty simple: lines growing out of other lines, forming crystalline patterns. The result is something that appears complex: city streets in endless configurations. It’s no wonder it was one of Jared’s most commonly mentioned works at the time.
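Tarbell’s own source is open (and worth reading), but the core rule is easy to caricature. This is not his code: it’s a grid-based toy of the same idea, where new lines sprout perpendicular from points on existing lines and grow until they collide with something, which is enough to produce those crystalline, street-grid patterns.

```python
import random

def substrate(width=40, height=40, steps=2000, seed=1):
    """Grid sketch of Substrate's rule: start with one seed line,
    then repeatedly grow new lines out of existing ones, stopping
    each when it hits another line or the edge. (The original works
    in continuous space with arbitrary crack angles.)"""
    rng = random.Random(seed)
    grid = [[0] * width for _ in range(height)]
    # Seed line across the middle of the canvas.
    for x in range(width):
        grid[height // 2][x] = 1
    occupied = [(x, height // 2) for x in range(width)]
    for _ in range(steps):
        x, y = rng.choice(occupied)           # sprout from an existing line
        dx, dy = rng.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
        nx, ny = x + dx, y + dy
        # Grow until the new line collides with anything.
        while 0 <= nx < width and 0 <= ny < height and grid[ny][nx] == 0:
            grid[ny][nx] = 1
            occupied.append((nx, ny))
            nx, ny = nx + dx, ny + dy
    return grid
```

Even this crude version shows the character of the algorithm: a handful of rules, and the complexity emerges from lines constraining each other.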

Speaking of his work as a whole, he wrote:

I write computer programs to create graphic images.

With an algorithmic goal in mind, I manipulate the work by finely crafting the semantics of each program. Specific results are pursued, although occasionally surprising discoveries are made.

I believe all code is dead unless executing within the computer. For this reason I distribute the source code of my programs in modifiable form. Modifications and extensions of these algorithms are encouraged.

The artist would later go on to help co-found a little website you’ve probably never heard of named Etsy.

Though the embedded Java version of Substrate is difficult to run in a browser (try updating the Java install in Firefox), the project has always been open source, and it runs just fine when ported to the latest version of Processing, as long as you can hunt down the “pollockShimmering.gif” it uses to derive its palette.

The web page for Substrate on complexification.net has a number of images from different versions, showing the development process and the exploration of possibilities.

I think we can learn three things from Jared Tarbell: first, the algorithms he released; second, a serendipitous, artistic approach to exploring the possibilities hidden in code; and lastly, that sometimes deep and unexpected results can come from following simple rules.








Proteus (2013)

I needed something soothing after last weekend, so I decided that it was time for a vacation in the relaxing symphony world of Proteus. 

The heart of Proteus is in its procedural generation. The postcards saved from the game with F9 even come with the address of the island, the seed that it grew from. This time I visited island #1457104792.
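The island-as-address idea is worth dwelling on: seed a deterministic random stream with the number, and the same island grows every time. Proteus’s real terrain generator is of course far more involved; this is just a minimal sketch of the seeding principle, with a made-up heightmap standing in for the island.

```python
import random

def generate_island(seed, size=8):
    """Sketch of seed-based world generation: the seed is the
    island's 'address', so the same number always grows the same
    terrain (here, a tiny heightmap of values 0-9)."""
    rng = random.Random(seed)
    return [[rng.randint(0, 9) for _ in range(size)] for _ in range(size)]

# Revisiting island #1457104792 always gives the same terrain.
a = generate_island(1457104792)
b = generate_island(1457104792)
```

This is why a postcard with a seed on it is a complete record of the place: the number plus the generator is the island.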

Proteus has a relatively wide and flat interactive structure: where you are on the island is the primary thing that determines what you see, not your actions beforehand. There is very gentle gating as the island progresses through the seasons, and so the result induces quiet contemplation rather than obsessive exploration.

Proteus doesn’t get its interest from huge variation or complex interactions. Instead, it composes vignettes that induce the perfect momentary mood, like a visual haiku. 

And that’s the heart of Proteus: it’s a place where it’s worth stopping to sit and watch the sun set.