Christmas Carols - Blob Opera

In the lower-right-hand corner of Blob Opera there’s a Christmas Tree slider. Take a look!

https://artsandculture.google.com/experiment/blob-opera/AAHWrq360NcGbw?cp=e30.



lewisandquark:


I’ve been experimenting with generating Christmas carols using machine learning algorithms of various sizes. The smallest AIs, trained from scratch on a set of carols, tended to get confused about what exactly the carols are celebrating. GPT-2, larger and with some internet pretraining… still got confused. Its carol about Rudolph “the All-Gracious King of All the Earth” was unsettling enough before Joe Zieja’s dramatic reading and The Forever Now’s eerily calm arrangement.

Now in 2020 we have GPT-3, trained on so much of the internet that my challenge is to stop it from simply repeating carols it has memorized. Give it just three example carols and it will respond with another carol from the internet, which is technically exactly what it was rewarded for during training. To get it to generate new carols, I discovered I could give it three existing carols (I chose Hark! The Herald Angels Sing (“bruise in us the serpent’s head”), All My Heart This Night Rejoices (“sin away and for aye full atonement maketh”), and Run Run Rudolph (“and away went Rudolph a whizzing like a merry-go-round”)) plus just the title of a nonexistent carol. It would generate, for example:

Mild is Rudolph

Mild is Rudolph’s image in the snow
He has a shiny nose and knows how to cozy-up-nose
He has a little red nose and tiny little hooves
He persuaded a mischievous infant to go straight to sleep

Goodness, how downy is Rudolph’s hide
He has, oh his forehead, so plump and smooth and divine
He has a little black snout and tiny little hooves
Benevolence makes foolish young children go straight to sleep

Shy is Rudolph’s figure when he’s near
He has a twinkle, oh in his eye, and he’s kind of lean
He has a bright shiny nose, and tiny little hooves
He can make any child go straight to sleep
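
If you want to try this style of few-shot prompting yourself, the prompt is just the example carols concatenated together, with the made-up title dangling at the end for the model to continue. A minimal sketch, with `complete()` as a hypothetical stand-in for whatever text-completion API you have access to:

```python
# Minimal sketch of the prompt structure: three real carols joined
# together, then a made-up title for the model to continue.
# complete() is a hypothetical stand-in for a text-completion API,
# not a real library call.

EXAMPLE_CAROLS = [
    "Hark! The Herald Angels Sing\n\n<full lyrics here>",
    "All My Heart This Night Rejoices\n\n<full lyrics here>",
    "Run Run Rudolph\n\n<full lyrics here>",
]

def carol_prompt(new_title):
    """Concatenate the example carols, then dangle the new title."""
    return "\n\n***\n\n".join(EXAMPLE_CAROLS + [new_title + "\n"])

prompt = carol_prompt("Mild is Rudolph")
# new_carol = complete(prompt)  # hypothetical completion call
```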

The AI has seen the existing Rudolph lore many times, however, so it would tend to drift back into singing Rudolph the Red-Nosed Reindeer. Sometimes it would only drift back partway.

The Carol of the Rudolphs

Rudolf the red-nosed reindeer
had a very shiny nose.
He kept it polished nice and clean with a white piece of soap.
All of the other reindeer
used to laugh and joke,
and call him names like ugly, Zonker, Dunderhead.
But their laughter stopped
when they saw him get
him speed on down the runway.

For he could change the course
of the wind
with his nose, and people called him
Very Bright.

Rudolf, the darting reindeer
knew the trail through the woods,
even in the fog he could spot his mark.
Eeee-merry, eeee-merry Christmas,
Yes, that is his song.

He could steer a craftsman’s plane,
Could lead a camel home out of a trance.
His tender little bat-like ears
could hear a whisper, from ten miles away.

He could roar like a lion.
He could fly, gliding with ease.
His little rusty wheel, he would use
to help create the most wonderful wheel of cheese.

Yes, Rudolf, the cherry nosed reindeer,
could do all of these things and more,
But the only thing his nose could
not do was smell.

Delectable Waffles

Who knows what’s in Rudolph’s nose?
Only Rudolph knows –
And Rudolph won’t tell!


The “sing the praises of Rudolph” directive seems to come through very well, especially if I give it an appropriate title.

All Hail to Rudolph

All hail to Rudolph
The legend of Christmas
The most famous reindeer of all

All hail to Rudolph
Happy all the year
From everyone we see

Heroes come and heroes go
But Rudolph stays forevermore
Oh boys and girls of every age
Can’t you hear the sleigh bells ringing?

Oh look! There’s Santa and Parson Brown
Defying the laws of time and space
Now prepare yourself and your family my boy
Prepare to get your Christmas fix

Now Rudolph goes smashing through the roof
Just like a power lift with a candy-cane moustache
And the deer’s nose is dead-on, deadly accurate
Pow! Now it’s Rudolph versus Mars

All hail this joyful night
All hail the night folks sing
All hail to Rudolph
The legend of Christmas
The most famous reindeer of all

O Come Rudolph, Come

O Come Rudolph, Come Ye Faithful
Oh come, come holly, clad in white, Oh come ye faithful.
Oh come aloud! Oh come and join our choir,
Oh make a joyful noise till earth and heaven ring.

Caw caw caw oh, caw caw caw oh, Caw holly, Caw plum this Christmas.
Oh, caw caw caw, oh caw oh, caw oh oh oh, oh.

Even if I give it a title like “Rudolph is the Worst” it knows that the story of Rudolph ends with redemption. (That is, unless it goes into truly awful territory, there being a lot of ways of being the Worst in its internet training data.)

Rudolph is The Worst Reindeer

Rudolph the red-nosed reindeer
Was a stubborn old reindeer.
Never would you find him
Participating in Christmas.
Never entered his head,
Of decorating a sleigh.

Then along came Bumbleboo
One foggy Christmas Eve,
And he said, “Rudolph with your nose so bright,
Won’t you guide my sleigh tonight?”

Then all the other reindeer,
Gathered round to admiere him,
And they nodded their heads,
As they shouted out in cheer,
“Rudolph the red-nosed reindeer
You’ll go down in history.”

Other times it seemed to be going for counting songs and, like GPT-2, it was weird when it counted things. Here my title prompt was “Rudolph of”.

Rudolph of Orange (Medley)

There are 15 different colors,
That you see within the shirt.
There is a booklet,
Which identifies each one.

You’re looking at a number one,
While the kids are knocking on your door.
Then we started laughing,
When we realized that Rudolph was near!

Courtesy of the index theory,
There are only twenty eight!
July is the eighth,
And that’s when Christmas is!

REFRAIN:

And 1 wrapped a nose
(Time now for teasing and for fun)
2 pictures, horns a cap,
3 hooves and legs and feet,
4 antlers sweeming!
5 eyes,
6 nostrils,
7 buttons, zipper,
8 bolts and bumpers,
9 flashing lights on his cap,
10 red nose,
11 furry chin,
12 rubsy ears,
13 plastic eyes,
14 laughing so joyously.
15 steps and a twisted reindeer train!
16 frosted incislotes,
17 dreamily red bow,
18 nose light,
19 velco on reins,
20 candy apple red.

Two reindeer he’ll need to pull him through!
It’ll be commercial Christmas,
For years to come,
When Rudolph gets his new reindeer team.

For a few more carols (including “O Little Rudolph” which involves ghosts and ghouls somehow), enter your email!

More of Janelle Shane’s neural network shenanigans, this time with GPT-3 generated holiday cheer!






Blob Opera 

David Li and Google Arts & Culture made this little machine-learning experiment to create opera-singing blobs.

The music is created with a machine learning model that learned about opera by listening to tenor Christian Joel, bass Frederick Tong, mezzo‑soprano Joanna Gamble, and soprano Olivia Doutney (plus Ingunn Gyda Hrafnkelsdottir and John Holland-Avery). It’s not playing their audio–rather, it’s playing what the model thinks opera music sounds like based on analyzing 16 hours of their singing.

I like how the eyes watch the cursor - that’s one of those little procedural animation touches that’s simple but adds a lot of life.
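
That trick is only a few lines in any framework. A rough sketch of the idea in plain Python, with no rendering attached: offset each pupil toward the cursor and clamp the offset so it stays inside the eye.

```python
import math

def pupil_position(eye_x, eye_y, cursor_x, cursor_y, max_offset=5.0):
    """Offset the pupil toward the cursor, clamped so it stays
    inside the eye."""
    dx, dy = cursor_x - eye_x, cursor_y - eye_y
    dist = math.hypot(dx, dy)
    if dist == 0.0:
        return eye_x, eye_y  # cursor dead center: pupil stays put
    scale = min(dist, max_offset) / dist
    return eye_x + dx * scale, eye_y + dy * scale

# Redraw each frame with the pupil at pupil_position(...) and the
# blob appears to watch the cursor.
```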

https://artsandculture.google.com/experiment/blob-opera/AAHWrq360NcGbw?cp=e30.




How Watch Dogs: Legion’s ‘Play as Anyone’ Simulation Works | AI and Games

Tommy Thompson’s deep dives into AI in games frequently overlap with procedural generation, and that’s certainly at the forefront of this latest video about the characters in Watch Dogs: Legion.

One thing that struck me is how the systems in the game build on previous research. For example, the in-game character generation is partially inspired by research on alibi generation by Ben Sunshine-Hill and Norman Badler.

I’d like to know more details on how the generation of character details works; their ability to generate from partial data sounds like they’re using some form of constraint solving, but I’m not sure.
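
For illustration only, here’s a toy version of what “generate from partial data” could look like if you treat the known fields as constraints and sample the rest. The job data is invented, and this is speculation about the general shape, not Ubisoft’s actual system:

```python
import random

# Toy sketch: known fields act as constraints, missing fields are
# sampled from options consistent with them. The job table is
# invented for illustration.
JOBS = {
    "paramedic": {"shift": ["night", "day"], "vehicle": ["ambulance"]},
    "courier":   {"shift": ["day"],          "vehicle": ["van", "bike"]},
}

def fill_character(known, rng=random):
    char = dict(known)
    if "job" not in char:
        # Only consider jobs consistent with what we already know.
        consistent = [job for job, opts in JOBS.items()
                      if all(char[f] in opts[f] for f in char if f in opts)]
        char["job"] = rng.choice(consistent)
    for field, choices in JOBS[char["job"]].items():
        char.setdefault(field, rng.choice(choices))
    return char

print(fill_character({"vehicle": "ambulance"}))
# -> a paramedic, since no other job here drives an ambulance
```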

It’s also worth pointing out that they don’t try to generate everything from scratch: recruitment missions, for example, have some basic narrative scaffolding that can have the dynamic elements slotted in. This is a smart way to handle it: you don’t need to generate everything, and writing some solid content frees up time for enriching the generative system in ways that pay off more effectively.
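
The scaffolding-with-slots structure is easy to picture as a fill-in-the-blanks template. A hedged sketch, with template text and slot options invented for illustration rather than taken from the game:

```python
import random

# Sketch of the scaffolding-with-slots idea: the mission beats are
# hand-written, only the details vary. Template and options are
# invented for illustration, not taken from the game.
TEMPLATE = ("{recruit} will join if you help first: "
            "{faction} has {problem} their {relation}. "
            "Head to {location} to sort it out.")

def recruitment_mission(recruit, world, rng=random):
    return TEMPLATE.format(
        recruit=recruit,
        faction=rng.choice(world["factions"]),
        problem=rng.choice(["arrested", "blackmailed", "kidnapped"]),
        relation=rng.choice(["sister", "old colleague", "neighbor"]),
        location=rng.choice(world["locations"]),
    )

world = {"factions": ["Albion", "Clan Kelley"],
         "locations": ["Camden Market", "a riverside warehouse"]}
print(recruitment_mission("The barrister", world))
```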




Procedural Map Generation Techniques (Roguelike Celebration 2020)

This is a great survey of roguelike map generation techniques by Herbert Wolverson, covering a ton of ground. He uses animated illustrations to demonstrate how different algorithms work: random room placement, BSP subdivision (used by NetHack), cellular automata (good for caves), drunkard’s walk (aka random walk), diffusion limited aggregation (including adding attractors and symmetry for more control), Voronoi cells (plus using different distance algorithms), Perlin noise (commonly used for landscapes, but Ken Perlin’s invention is useful for a lot of stuff), and the use of prefabs (or “vaults”). He proposes the use of a Dijkstra map as a way of reading the map so you can do things like remove unreachable areas, or place the starting point in a reasonable spot, or figure out which parts of the map are on the golden path.
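
To make the Dijkstra map idea concrete, here’s a minimal sketch; with uniform movement costs, a plain breadth-first flood fill computes the same distances Dijkstra’s algorithm would:

```python
from collections import deque

def dijkstra_map(grid, start):
    """Distance from `start` to every reachable floor tile ('.').
    With uniform step costs, breadth-first search gives the same
    result as Dijkstra's algorithm."""
    dist = {start: 0}
    frontier = deque([start])
    while frontier:
        x, y = frontier.popleft()
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if (0 <= ny < len(grid) and 0 <= nx < len(grid[0])
                    and grid[ny][nx] == "." and (nx, ny) not in dist):
                dist[(nx, ny)] = dist[(x, y)] + 1
                frontier.append((nx, ny))
    return dist

# Floor tiles missing from `dist` are unreachable (wall them off),
# and the tile with the greatest distance is a reasonable place for
# the stairs or the end of the golden path.
```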

One thing Herbert highlights is the importance of context and control in designing a level generator: the randomness is “fed into an algorithm that approximates what you want to get but ensures that it is different every time.”

He also talks about combining generators, which is something that can get lost when cataloging different techniques: there’s no one perfect algorithm for generating a roguelike map, and combining multiple techniques can result in something much more interesting than a single generative operation could make on its own.
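
As a tiny illustration of combining techniques, here’s a sketch that chains two of the algorithms from the talk: a drunkard’s walk carves the rough shape, then a single cellular-automata pass smooths it:

```python
import random

def drunkard_walk(w, h, floor_target, rng):
    """Carve floors by random walk until enough of the map is open."""
    grid = [["#"] * w for _ in range(h)]
    x, y = w // 2, h // 2
    carved = 0
    while carved < floor_target:
        if grid[y][x] == "#":
            grid[y][x] = "."
            carved += 1
        dx, dy = rng.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
        x = min(max(x + dx, 1), w - 2)  # stay inside the border
        y = min(max(y + dy, 1), h - 2)
    return grid

def smooth(grid):
    """One cellular-automata pass to round off the walk's rough edges."""
    h, w = len(grid), len(grid[0])
    out = [row[:] for row in grid]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            walls = sum(grid[y + dy][x + dx] == "#"
                        for dy in (-1, 0, 1) for dx in (-1, 0, 1))
            out[y][x] = "#" if walls >= 5 else "."
    return out

rng = random.Random(2020)
level = smooth(drunkard_walk(40, 20, 250, rng))
print("\n".join("".join(row) for row in level))
```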

His tutorial for making a roguelike in Rust has a lot of universally applicable algorithmic stuff, including even more map generation algorithms, so there’s also lots of material if you want to go beyond the already information-dense talk itself.

Overall, this is a great starting point for expanding your toolkit of map generation techniques. I particularly like Herbert’s examples for controlling the maps and combining techniques: there’s a lot of scope for getting creative with how you put together different elements of your map generator–particularly if they also tie in with other parts of your design or narrative.




There’s a Skull in My Garden

How can you get players to care about the things your generative system makes?

In this short Roguelike Celebration talk, Younès Rabii talks about player interpretation and meaningful stories, by way of his roguelike Tea Garden.

Tea Garden is influenced by Max Kreminski’s idea of gardening games, which posits a style of games that treat generative things as non-disposable, in contrast to the common “we’ll just generate another one” style exemplified by the extractive gameplay in things like Minecraft.

In Tea Garden you explore gardens in your dreams and bring back dream flowers. Which means it has a flower generator. Problem: many of the flowers just looked like random jumbles of pixels. However, adding interaction gave those seemingly meaningless configurations of pixels a new context, creating new avenues for the random flowers to be perceived as meaningful.

Tea Garden has a very neat flower generator (though Younès does point out that it has some risks, hence the eponymous and entirely unexpected skull). But the larger point of creating context through interactivity is related to a lot of ideas I’ve been thinking about.

Generating something with context means that it exists as part of a larger system. Objects in the real world have their history embedded in every scratch and smudge, so Dwarf Fortress’s history-tracking emulates some of that effect. A generator’s framing brings in pre-established context, borrowing from the rest of the world. Sharing content imports the context associations that other people have established into your own experience. And, in Tea Garden, the interactions create the affordances that let you impart your own context to the generated flowers.




Faces of Humanity

A face-drawing generation project using the “Quick, Draw!” game dataset from Google Creative Lab: millions of drawings from all around the world!

Each part of a face was drawn by a different person, at a different place, at a different time.

An unexpected global collaborative project with millions of contributors :)

One new face generated every hour on the project website: http://project.laboiteatortue.com/facesofhumanity

The project tumblr: https://facegenerator.tumblr.com

Your questions are welcome.

Cheers

————————-

This project got sent to me a while ago, but I think it’s a neat application of public data in a generative way. This use is reminiscent of Emily Short’s Beeswax–in this case, crowdsourcing a corpus of drawings.
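
For anyone curious how that kind of collage could work, here’s a hedged sketch, assuming you’ve downloaded the simplified “Quick, Draw!” ndjson files for a few categories; the layout offsets are rough guesses, not the project’s actual placement logic.

```python
import json
import random

# Each record's "drawing" field in the simplified Quick, Draw! data
# is a list of strokes: [[xs], [ys]]. Offsets below are guesses.

def random_drawing(path, rng):
    """One random contributor's drawing from the given category file."""
    with open(path) as f:
        line = rng.choice(f.readlines())
    return json.loads(line)["drawing"]

def composite_face(rng=random):
    """A face whose outline, eyes, nose, and mouth each come from a
    different person."""
    layout = [("face.ndjson", 0, 0, 1.0),
              ("eye.ndjson", 55, 70, 0.25),
              ("eye.ndjson", 140, 70, 0.25),
              ("nose.ndjson", 100, 110, 0.3),
              ("mouth.ndjson", 85, 165, 0.35)]
    strokes = []
    for path, ox, oy, s in layout:
        for xs, ys in random_drawing(path, rng):
            strokes.append(([x * s + ox for x in xs],
                            [y * s + oy for y in ys]))
    return strokes  # (xs, ys) polylines, ready to hand to any plotter
```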




Why do I even like roguelikes?

Kind of a companion piece to last year’s Roguelike Celebration talk by Alexei Pepers about the different ways that practitioners approach procgen. Whereas Alexei’s talk was about the people who make roguelikes, Lisa Brown’s talk is about the motivations of the people who play roguelikes.

Inspired by the Quantic Foundry Gamer Motivation Model, Lisa describes roguelikes in terms of action, social, mastery, achievement, immersion, and creativity. She leverages her experience as a game designer to point out different parts of specific games and how they tie into the model.

Tracing the different gameplay elements from particular implementations to player motivations is valuable for understanding how decisions made when constructing the mechanics of a game eventually resolve themselves into the aesthetic experience the players have. It’s also a useful reminder that different players have different preferences.

Which is yet another reason why there’s no one perfect generator. Every generative system has a purpose embedded in it. Or maybe embodied by it is a better phrasing–the shape of the system is a form of procedural rhetoric that tells us something about the thing it is modeling. And because different generative systems have different end goals, they naturally tend to have shapes that reflect those goals. When you are building a generator, you want to make something that has the right kind of generative space, where the range of artifacts that it produces is interesting but doesn’t contain anything broken or unusable.




NaNoGenMo 2020

It’s December first, and you know what that means–time to look at the 2020 National Novel Generation Month results!

There’s lots of fascinating things from this year, more than I’ve had time to dig into yet. I’m looking forward to it.

If you want to beat me to it and check them out for yourself, the submissions are in the Issues tab on the GitHub repository:

https://github.com/NaNoGenMo/2020/issues




Don’t Generate, Hash

This 2020 Roguelike Celebration talk by Andrew Clifton is a great look at a practical, technical implementation detail, digging into some of the topics that come up once someone has implemented their first few generators and starts trying to figure out how to get more control.

Random seeds are the first place to start, but as Andrew points out, just storing random seeds quickly runs into problems. Instead, he suggests using a hash function, which in this case is just a function that takes your data and returns a number that looks random. But, importantly, the same data always results in the same hash value.
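
A minimal sketch of that idea in Python (SHA-256 is heavier than a game needs, but it keeps the sketch short; any decent hash function works the same way):

```python
import hashlib

def hash_value(*parts):
    """Map labeled data to a 64-bit number that looks random but is
    fully determined by its inputs."""
    data = "/".join(str(p) for p in parts).encode()
    return int.from_bytes(hashlib.sha256(data).digest()[:8], "big")

# A purpose label keeps different uses of the same coordinates from
# colliding; the values never depend on what was generated first.
tree_height = hash_value("world-42", "tree-height", 10, 7) % 20
rock_count = hash_value("world-42", "rock-count", 10, 7) % 5
```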

If you’re using a standard PRNG, the result is very sensitive to the order that things are generated in. You can think of a PRNG as a long tape with random numbers printed on it: each time you get a new random number from it, it looks at the next number on the tape and gives you that one. (The actual implementations use clever math to achieve unpredictability without storing that long list of numbers.) The seed is where the sequence starts. Starting with the same seed gives you the same results–unless things are done in a different order.
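
You can see that order sensitivity in a few lines with Python’s random module: seed the same PRNG twice, make the same two draws in a different order, and the values end up attached to different things.

```python
import random

rng = random.Random(2020)
monster_hp = rng.randint(1, 20)  # first number on the tape
loot_roll = rng.randint(1, 20)   # second number

rng = random.Random(2020)
loot_roll2 = rng.randint(1, 20)   # loot now consumes the first number
monster_hp2 = rng.randint(1, 20)

print((monster_hp, loot_roll))    # (a, b)
print((monster_hp2, loot_roll2))  # (b, a): same tape, swapped owners
```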

A blatant example of how this works can be found in many of the Civilization games, where attacking with different units first can cause different outcomes because the RNG checks happen in a different order. Speedrunners also make use of this, particularly when the RNG is predictable enough to be externally manipulated. And there’s a lot of ways this can go wrong, for example if loading game objects from a save game means that they get initialized in a different order than the order they were originally created in.

But hashing doesn’t care about initialization order, because the hash value is derived entirely from the parameters you give it. This does mean that the different parts of your game need inputs that hash to meaningfully different values: if you want different results, the parameters have to differ.

Frontier: Elite II uses a hash to generate sectors. Each sector uses the hash as its seed, determines the number of stars, and then goes on from there to generate the planets and so on. A weakness in the parameters being fed into the hash means that some sectors repeat–which is useful as a player if you can figure out where the human outposts have been accidentally placed in the galactic core. But that’s usually not the kind of emergent randomness that you want, so pay attention to your parameters.
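
That scheme is easy to sketch; this is not Frontier’s actual code, just the shape of the technique, with SHA-256 again standing in for whatever hash you prefer:

```python
import hashlib
import random

def sector_seed(galaxy_seed, sx, sy):
    """Hash the galaxy seed and sector coordinates into a seed, so a
    sector regenerates identically no matter the visit order."""
    data = f"{galaxy_seed}/{sx}/{sy}".encode()
    return int.from_bytes(hashlib.sha256(data).digest()[:8], "big")

def generate_sector(galaxy_seed, sx, sy):
    # Seed a local PRNG from the hash, then generate the contents.
    rng = random.Random(sector_seed(galaxy_seed, sx, sy))
    return [{"planets": rng.randint(0, 12)}
            for _ in range(rng.randint(0, 8))]  # stars, then planets
```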