sketch-rnn

Running recognition neural networks in reverse to generate content has been a big theme in 2015. hardmaru’s detailed blogging about different experiments has been one of the highlights of the trend.

The latest experiment uses a recurrent neural net to make fake Chinese characters. Unlike hardmaru, I’m not a native speaker, so I can’t evaluate the results myself. Native speakers seem to find them fairly convincing, though.

But there’s a deeper thing going on here: the reason hardmaru was interested in this approach in the first place is that, unlike most other recent generators that work on pixels, this is an inherently vector-based technique.

Generating something that is more naturally represented with vectors gives two distinct advantages. 

First, the machine’s representation of the data more closely corresponds to the thing it is trying to represent. The ordered strokes of the vectors are much more meaningful than any pixel representation of them can be. 

Second, it makes it easier to stay close to an ordered result: the chaos that gets added is more meaningful because it naturally conforms to the same rules that create the underlying structure.
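
To make the contrast concrete, here’s a minimal sketch (with invented data) of the kind of stroke-based representation these models work with: an ordered list of pen movements rather than a grid of pixels, loosely following the (dx, dy, pen-state) sequences used by stroke-generating RNNs.

```python
# A sketch of a stroke-based character encoding. Each step moves the pen
# by (dx, dy); `draw` says whether the pen touched the paper during that
# movement. The data here is invented for illustration.
character = [
    (10, 0, True),    # first stroke: a horizontal line
    (-12, 6, False),  # travel (pen up) to the next stroke's start
    (0, 12, True),    # second stroke: a vertical line
]

def to_strokes(steps, start=(0, 0)):
    """Replay relative movements into absolute strokes (lists of points)."""
    x, y = start
    strokes = []
    current = [(x, y)]
    for dx, dy, draw in steps:
        x, y = x + dx, y + dy
        if draw:
            current.append((x, y))
        else:                       # pen lifted: close the stroke, move on
            if len(current) > 1:
                strokes.append(current)
            current = [(x, y)]
    if len(current) > 1:
        strokes.append(current)
    return strokes

print(to_strokes(character))
```

The ordering matters: the model learns which strokes come first, something a flat pixel grid simply cannot express.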

It also opens up the possibility of teaching an AI to draw. Indeed, hardmaru suggested the TU Berlin Sketch Database as a possible future data source.

http://blog.otoro.net/2015/12/28/recurrent-net-dreams-up-fake-chinese-characters-in-vector-format-with-tensorflow/






A Time for Destiny: The Illustrious Career of Serenity Starlight Warhammer O'James during her First Three Years in the Space Fighters

The term “Mary Sue” has been thrown around a lot recently, mostly for characters it doesn’t really apply to. The term originates in a very specific fan fiction context, and in my opinion there are usually much better critical approaches for works outside that context. Dismissing a film character as a “Mary Sue” is a muddled way to approach criticism.

Unless you’re intentionally trying to evoke self-indulgent fan fiction. 

In which case, welcome to the rabbit hole, Alice.

For NaNoGenMo 2015, Cat’s Eye Technologies decided to create a computer-generated novel that doesn’t succumb to the unpredictability of more experimental generators like Michael Cook’s. The goal was to create a novel where the reader doesn’t lose interest because of the generative nature of the text. On the other hand, the MARYSUE generator was never intended to write good novels, just legible ones. Which is still a pretty lofty goal.

The generator is intentionally cast in the mode of an overwrought but inexperienced author. Which actually matches the abilities of the generator rather well: the author really is an immature adolescent struggling to get an exciting story down on paper while jumping between cool imagery and overly specific details about the things the author is obsessed with. Just, in this case, the author happens to be an adolescent robot.

Framing the generator in a way that matches its weaknesses lets us look beyond them and focus on its strengths. We treat the generator as a budding young author, obsessed with certain ideas and mystified by others.

In doing so, we can start to point out the parts of the stories that work, see what the author was trying to go for, and maybe even figure out how to help the author say it better, just as we might respond to a young fan fiction author who is working out their feelings about the source material by writing a story in response.

Beyond choosing a frame that matches the abilities of the generator, the output itself is fairly sophisticated for a machine-written text. It’s based on the idea of a story compiler that looks for plot events and expands them into more detailed events. 
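
I haven’t dug through MARYSUE’s internals, so treat this as a hedged sketch of the general “story compiler” idea rather than the actual implementation: abstract plot events get rewritten into sequences of more concrete events until everything bottoms out in renderable text. The event names and renderings below are invented.

```python
import random

# A toy story compiler: abstract plot events expand into sequences of
# more concrete events, until only renderable leaf events remain.
EXPANSIONS = {
    "space_battle": [
        ["sighting", "dogfight", "narrow_escape"],
        ["ambush", "dogfight", "heroic_victory"],
    ],
    "dogfight": [
        ["maneuver", "weapon_fire", "maneuver"],
    ],
}

RENDERINGS = {
    "sighting": "Serenity spotted the enemy fleet on her scanner.",
    "ambush": "Without warning, enemy fighters dropped out of hyperspace.",
    "maneuver": "She threw her fighter into a barrel roll.",
    "weapon_fire": "Her quad lasers blazed across the void.",
    "narrow_escape": "Somehow, impossibly, she slipped away.",
    "heroic_victory": "The last enemy fighter tumbled, defeated.",
}

def compile_story(events):
    """Recursively expand abstract events into renderable leaf events."""
    out = []
    for event in events:
        if event in EXPANSIONS:
            out.extend(compile_story(random.choice(EXPANSIONS[event])))
        else:
            out.append(event)
    return out

plot = compile_story(["space_battle"])
print(" ".join(RENDERINGS[e] for e in plot))
```

Because the expansion happens at the level of plot structure rather than sentences, the output keeps a coherent dramatic shape even when the details vary.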

You can read the complete novel here or peruse the source code.




Endless Jingling (2014)

Merry Christmas! You really have to visit Endless Jingling to get the full effect.

And I do mean full effect.






Dwarf Fortress

When we’re talking about procedural generation, Dwarf Fortress is the skeleton zombie elephant in the room. There’s a nearly infinite number of procedural generation topics to discuss when it comes to Dwarf Fortress, so it’s hard to know where to start.


So let’s start where the game does: with the creation of a world.

The map generation part is fairly straightforward, though it models details like climate, vegetation, savagery, and good and evil regions. It goes through a set of phases, placing elevation, temperature, rivers, and so on, and each phase may be rejected if it fails to meet the minimum playability criteria.
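
Dwarf Fortress’s actual implementation isn’t public, but the phase-and-rejection structure is easy to sketch. Here’s a minimal version, with invented phases and playability criteria standing in for the game’s own:

```python
import random

# A sketch of phased world generation with rejection: each phase adds a
# layer to the world, and a world that fails its checks is discarded and
# regenerated. The phases and criteria are stand-ins, not the game's.

def generate_elevation():
    return [[random.randint(0, 400) for _ in range(16)] for _ in range(16)]

def place_rivers(world):
    # Rivers start on high tiles; count how many we managed to place.
    world["rivers"] = sum(
        1 for row in world["elevation"] for h in row if h > 350
    )
    return world

def acceptable(world):
    # A minimum playability criterion: enough rivers to settle near.
    return world["rivers"] >= 3

def generate_world(max_attempts=100):
    for attempt in range(max_attempts):
        world = place_rivers({"elevation": generate_elevation()})
        if acceptable(world):
            return world, attempt + 1
    raise RuntimeError("no acceptable world found")

world, attempts = generate_world()
print(f"accepted a world with {world['rivers']} rivers "
      f"after {attempts} attempt(s)")
```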

It then simulates centuries of history to create an elaborate backstory for every place and culture you’ll encounter in the game. 

This is a textbook demonstration of the degree of overkill Dwarf Fortress brings to everything. Most of the details here have little effect on a playable fortress.

But the historical events are recorded as legends, which can show up in your dwarves’ artwork. And you can visit many of the historical sites in Adventure Mode, and even encounter some of the long-lived creatures that inhabit the generated world.


That attention to detail is the basic theme of Dwarf Fortress. That’s why it has legions of dedicated fans who overcame its inaccessible interface to mine out the gems hidden within.

You don’t have to go to Dwarf Fortress’s lengths to see the benefits of a generated history, though. Usually, when people talk about the history generation, they focus on the emergent complexity it creates: an elf raised by dwarves who rises to rule their kingdom and slay a megabeast, or a poet who invents a new kind of poetry and teaches it to her students, who use it to write epics about the elf king of the dwarves, and so on.

But creating a history for these worlds and remembering it has another consequence, one that doesn’t require quite so much emergent complexity. One thing that gives objects in the real world meaning is precisely that they do have a history. Even the most mass-produced object has a kind of metaphysical metadata that digital instances of a 3D model lack. 

By manufacturing a history for these procedurally generated objects, we can give them a little bit of this complexity. Linking the different people, places, and things in the world helps escape the trap of many systemically identical cookie-cutter results, one of the traditional weak points of procedural generation.
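
In data terms, “manufacturing a history” can be as simple as attaching event records that reference other entities in the world. A minimal sketch, with invented names and events:

```python
from dataclasses import dataclass, field

# History as metadata: generated objects carry events that reference
# other generated entities, so nothing exists in isolation.

@dataclass
class Event:
    year: int
    description: str
    participants: list

@dataclass
class Artifact:
    name: str
    history: list = field(default_factory=list)

elf_king = "Athel, elf king of the dwarves"
sword = Artifact("Oakensteel")
sword.history.append(Event(203, "forged in the mountainhome", ["Urist"]))
sword.history.append(Event(251, "wielded against a megabeast", [elf_king]))

for event in sword.history:
    print(f"{sword.name}, year {event.year}: {event.description} "
          f"({', '.join(event.participants)})")
```

Two swords with identical stats stop being interchangeable the moment their histories diverge.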







no people

For NaNoGenMo 2015, everestpipkin composed no people, which I can best describe as a poetry art book. 

The generator took images from Google Streetview, performed image recognition on them, and then wrote poetry based on what it saw in each picture. 



The results are astonishingly evocative.



The poetry is written by expanding concepts with Wordnik, finding words that match in random Project Gutenberg texts, and then combining the results in particular ways.
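
The real implementation lives in no_people.js (linked below); as a rough, self-contained paraphrase of that pipeline, here’s a sketch where a stub dictionary stands in for the Wordnik lookups and a short string stands in for a Project Gutenberg text:

```python
import random

# A paraphrase of the pipeline: recognized concepts expand into related
# words, those words are hunted for in a source text, and the fragments
# around them become lines. All data below is invented for illustration.

RELATED = {  # stand-in for Wordnik's related-words lookups
    "road": ["path", "dust", "distance"],
    "house": ["window", "roof", "door"],
}

GUTENBERG_TEXT = (  # stand-in for a randomly chosen Gutenberg book
    "the dust lay thick upon the path and no window showed a light "
    "for the distance had swallowed every roof and every door"
)

def fragments_for(concept, text):
    """Find short fragments of the source text around related words."""
    words = text.split()
    found = []
    for target in RELATED.get(concept, []):
        for i, w in enumerate(words):
            if w == target:
                found.append(" ".join(words[max(0, i - 2): i + 3]))
    return found

recognized = ["road", "house"]   # pretend image-recognition output
lines = []
for concept in recognized:
    lines.extend(random.sample(fragments_for(concept, GUTENBERG_TEXT), 2))
print("\n".join(lines))
```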


It is presented without capitalization, as is typical of everest pipkin’s work.


The source code: http://ifyoulived.org/no_people.js
The complete novel: http://everestpipkin.itch.io/no-people




Simulated Walking Creatures

Behavioral animation is an ongoing field of research. Thomas Geijtenbeek, Michiel van de Panne, and Frank van der Stappen presented their 2013 paper “Flexible Muscle-Based Locomotion for Bipedal Creatures,” showing a simulation-based approach that incorporates biomechanical constraints. An evolutionary algorithm learns how to use the muscles to walk, and a refinement step optimizes the muscle routing. Further development of this was incorporated into Geijtenbeek’s thesis.
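
The paper couples the evolutionary search to a full biomechanical simulation; as a stand-in, here’s a heavily simplified sketch of just the optimization loop, with a toy fitness function in place of the physics:

```python
import random

# A toy (1+lambda) evolutionary strategy: mutate controller parameters,
# score each candidate by how well the creature "walks", keep the best.
# The fitness function is a stand-in for a muscle/physics simulation.

def fitness(params):
    # Pretend simulation: reward a particular gait frequency/amplitude.
    freq, amp = params
    return -((freq - 2.0) ** 2) - ((amp - 0.5) ** 2)

def mutate(params, sigma=0.1):
    return tuple(p + random.gauss(0, sigma) for p in params)

def evolve(generations=200, pop_size=20):
    best = (random.uniform(0, 4), random.uniform(0, 1))
    for _ in range(generations):
        candidates = [mutate(best) for _ in range(pop_size)] + [best]
        best = max(candidates, key=fitness)
    return best

print(evolve())  # should approach (2.0, 0.5)
```

The hard part in the real system is entirely inside the fitness evaluation: every candidate has to be simulated walking under biomechanical constraints.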

(via https://www.youtube.com/watch?v=kQ2bqz3HPJE)









Ordovician (ProcJam 2015)

Journey back to seas that teemed with forms of life that haven’t been seen in four hundred million years. 

This is, of course, another procedural generation project from Tom Betts (In Ruins; Sir, You Are Being Hunted), created for ProcJam 2015. 

The Ordovician is a great inspiration for procedural generation. There are many unknowns, leaving space to invent new creatures, but we have tantalizing glimpses of the alien appearances of conodonts, graptolites, and other now-extinct creatures that once populated our oceans. Thus the generated creatures can gain their order from primeval models, while embracing the chaotic possibilities of long-forgotten lifeforms.

http://tomnullpointer.itch.io/ordovician-







Age of Mythology

When Ensemble Studios made their followup to Age of Kings, it was in a new engine, with a mythical rather than historical setting. Setting-wise, this let them revisit aspects of both of their previous games. More relevant for our purposes, the new engine meant a new approach to procedurally generating maps.

The Genie engine’s scripting was basically a set of rules for dropping in terrain features. Modders managed to do a lot with it, but it wasn’t really a full-fledged language. Age of Mythology, on the other hand, introduced a new random map scripting system for the BANG! engine: a complete programming language with C-like syntax, including variables and functions.
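
To avoid misquoting the actual RMS commands, here’s the same idea sketched in Python rather than the BANG! engine’s own syntax: a map script as a real program, with variables, functions, and loops driving placement.

```python
import math
import random

# What a scriptable random map gains over fixed placement rules:
# variables, functions, and per-player logic. This imitates the *shape*
# of a random map script, not the actual RMS command set.

MAP_SIZE = 100

def player_positions(num_players, radius=35):
    """Space the players evenly around a circle, as many maps do."""
    center = MAP_SIZE / 2
    return [
        (center + radius * math.cos(2 * math.pi * i / num_players),
         center + radius * math.sin(2 * math.pi * i / num_players))
        for i in range(num_players)
    ]

def place_starting_resources(pos, wood_rich=True):
    """Per-player resource drops; one variable rebalances the whole map."""
    x, y = pos
    drops = [("town_center", x, y), ("gold_mine", x + 5, y)]
    forests = 3 if wood_rich else 1   # e.g. an Oasis-style wood-poor map
    for _ in range(forests):
        drops.append(("forest", x + random.uniform(-10, 10),
                      y + random.uniform(-10, 10)))
    return drops

for pos in player_positions(num_players=4):
    print(place_starting_resources(pos, wood_rich=False))
```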


Age of Mythology has the now-standard suite of Ensemble random map scripts (including the Mediterranean one with a lake in the middle) and some new ones that change up the positioning and availability of resources. While none of the official maps were too crazy, there were random maps that changed the way the game plays, such as the migration and nomad maps, and maps that messed with the resource balance, such as the wood-poor Oasis. Prior Age of Empires games experimented too, but Age of Mythology did it with confidence.

Pushing the extremes is something that I feel isn’t always explored enough: making your procedural generators distinct from each other in ways that are visible to the player can go a long way in demonstrating just how much variability there is under the hood. Pushing an extreme like this is a form of order: an efficient cause that gives the player a window into how the hidden systems function.




Shorties - A Generative Text Project

I was recently sent the following information. It looks like a really fun contest, and I’ll be interested in seeing what comes out of it:

Based off of NaNoGenMo (National Novel Generation Month), Shorties is a fun way to continue the procedural/generative text fun! 

Check out the Shorties repo and submit your own creative code-generated short story.




The History of Text Generation

Did people procedurally generate text before computers? Yes! It turns out that people have invented a lot of different methods over the past thousand years, including the volvelle and the zairja. Find out more in this really awesome write-up of pre-twentieth-century text generation by Holly Gramazio of Matheson Marcault.

There were fortune-telling devices, poem generators, and spinning wheels that spelled out words.
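
A volvelle is essentially a set of nested rotating rings of text, which makes it one of the easiest of these devices to model in code. A minimal sketch, with invented ring contents:

```python
import random

# A volvelle modeled as concentric rings of words: spin each ring to a
# random position and read one word from each, outside in.
# The ring contents here are invented, not from any historical device.

rings = [
    ["fortunate", "ruinous", "uncertain", "auspicious"],
    ["journeys", "marriages", "harvests", "quarrels"],
    ["await", "threaten", "favor", "elude"],
    ["the patient", "the bold", "the curious", "the idle"],
]

def spin(rings):
    """One 'reading': pick an alignment for each ring independently."""
    return " ".join(random.choice(ring) for ring in rings)

for _ in range(3):
    print(spin(rings))
```

Four rings of four words yield 256 distinct readings, which is the whole trick: a small physical object with a combinatorially large output.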

There’s a fascinating aside about how text generation is easier in Semitic languages: because they are conventionally written without vowels, it is easier to find valid combinations of consonants. 

(English, I would venture to say, is a fairly terrible language for text generation, given its many irregularities.)

There’s even a discussion about this poem generator, which includes a link to an online implementation of it. If you wanted to write experimental generated poetry in the 1600s, you’d have company.

The article also includes a nice long bibliography, if you want to read about any of this in much more depth.

http://mathesonmarcault.com/index.php/2015/12/15/randomly-generated-title-goes-here/