Everest Pipkin: Digital Flowers as Status Symbols // Interrupt 4

I’ve frequently mentioned Everest Pipkin’s work on this blog, and for good reason: their work explores the interesting edges of generative art.

In this talk from Interrupt 4, they draw a parallel between the history of flowers as status symbols and video game foliage–specifically, digital flowers that serve as status symbols. The talk also touches on the relationship between time spent in a game and the status symbols the game confers, and juxtaposes that against our ideas of leisure and busyness.

This talk feels extra-relevant to me now, what with the conversations about loot boxes and the news about the real-time timers in the new Animal Crossing. But it reminded me that these ideas apply to a lot of generative content–what is the relationship between generativity and scarcity, and how does that affect how we value the things we generate? What are the ethics around games that demand grinding labor in exchange for our leisure time? Can we use generativity to improve our relationship with the machines that we’ve turned our society over to?




NaNoGenMo 2017

It’s nearly November, which means that it’s nearly time to start making novel generators again!

NaNoGenMo, for those of you who haven’t heard of it, is National Novel Generation Month: spend November writing a thing that writes a novel.

Hundreds of novel generators have been made over the past few years, some of them with quite interesting results.

The event itself is organized through GitHub’s Issues system, which doubles as a forum: just click on the Issues tab and start a thread via “New Issue”.

You don’t need to know anything about Git to participate. People often upload their generators to GitHub because it’s an obvious way to share and archive source code, but you can use whatever you feel like. The only requirements are that you make something that writes 50,000 words and that you share the source code afterwards.
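
To give a sense of how low that bar is, a deliberately minimal (and entirely hypothetical) entry could be a few lines of Python:

```python
# A hypothetical bare-minimum NaNoGenMo entry: 50,000 words of meowing.
# Real entries tend to be far more ambitious, but this would qualify.
with open("novel.txt", "w") as f:
    f.write("meow " * 50000)
```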

Even if you don’t have time to participate this year, you should check out the resources threads, which have become some of the best compilations of tools and techniques for text generation.

People are already announcing their intent to participate, and it is shaping up to be a banner year for text generation.

https://nanogenmo.github.io/












Noisemaker Bot

I was poking around the internet, looking for good uses of procedural noise. (There are fewer comprehensive catalogs than you’d think.) And I came across Noisemaker Bot, a Twitter bot by Alex Ayers that combines various noise generators and functions to create patterns. Interestingly, it represents images as 32-bit tensors, which is, I think, rare in game applications but more common in scientific fields.

The bot has already generated a wide range of different designs, assembled from its large catalog of operations.

Noise-based patterns like these are useful for more than just being pretty to look at: deterministic, stateless noise pattern generation can be continued indefinitely while smoothly transitioning between any two given points, making it perfect for things like generating terrain. Or adding a bit of jitter to a procedural animation to give it life. Or animating screenshake. Or patterning a fabric. Or adjusting timing delays on a data visualization to give it a more organic feel.
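
As a rough illustration of what deterministic, stateless noise looks like in practice, here’s a minimal 1D value-noise sketch in Python; the hash constants are arbitrary, and Noisemaker Bot itself works quite differently:

```python
import math

def hash01(i, seed=1337):
    """Deterministic pseudo-random value in [0, 1) for an integer input."""
    n = (i * 374761393 + seed * 668265263) & 0xFFFFFFFF
    n = ((n ^ (n >> 13)) * 1274126177) & 0xFFFFFFFF
    return (n ^ (n >> 16)) / 2**32

def value_noise(x, seed=1337):
    """Smooth 1D noise: blend between hashed values at the integer lattice."""
    i = math.floor(x)
    t = x - i
    t = t * t * (3 - 2 * t)  # smoothstep easing for a smooth transition
    a, b = hash01(i, seed), hash01(i + 1, seed)
    return a + (b - a) * t

# Stateless and deterministic: sampling any point, in any order, always
# gives the same answer, which is what makes it handy for screenshake,
# animation jitter, or terrain heightmaps.
shake = [value_noise(frame * 0.35) - 0.5 for frame in range(60)]
```

Swapping the smoothstep for a different easing curve, or layering several octaves, changes the character of the output without losing that determinism.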

Any structured, quantitative signal can be used as an input to drive a whole host of different things, which is why I keep talking about unusual inputs. Generation needs structure, but that structure doesn’t necessarily have to be a realistic replication of anything.

https://twitter.com/noisemakerbot




Cogmind

I’ve mentioned Cogmind before. Back in the early days of the blog, I talked about how the developer of X@Com was working on a roguelike. This week it’s finally releasing on Steam. It’s basically content-complete at this point, and the remaining development time is slated for polish and optional features.

The Alpha Trailer above will at least give you a sense of just how far it takes the ASCII aesthetic. Alongside Caves of Qud and Jupiter Hell, this is one of the handful of modern roguelikes that are pushing traditional roguelike aesthetics to the edge.

http://www.gridsagegames.com/cogmind/




ProcJam Tutorials Released

Not to be all ProcJam all the time–though it is getting to be that time of year–but I definitely wanted to tell you about the ProcJam tutorials they’re posting this year. They have an excellent selection of experts lined up, and the first two tutorials were posted today.

I’m always on the lookout for material that can help introduce people to the world of generation, whether it’s analog/paper generation, a Unity3D tutorial, or an intro to installing neural net software for poets. (Do send me links if you know of more resources for this!)

Once you have something to play around with, it can be easy to teach yourself (or at least you have a clear path), but that first step can be a hard one. And with so many frameworks and different approaches out there, there are sometimes many different first steps, like so many procedurally generated staircases to climb.

http://www.procjam.com/tutorials/




ProcJam Art Packs for 2017

The art packs for the 2017 ProcJam have been released, and they are as amazing as expected. Together with the art packs from previous years, this means hundreds of 3d models and sprites are available under a CC0 license for anyone who wants to use them.

The artists should be congratulated. It’s not a topic that gets discussed much outside the circles of artists who work on generative projects, but there are a number of additional constraints and considerations when you’re making art that’s going to be used as part of a generative output. The pieces have to work individually, hold up when repeated, and fit in with the surrounding art–often with no way of knowing exactly what that art will be. It’s like tiling texture design turned up to 11.

For many generators, the choice of content is an intrinsic part of the system itself: a text generator working with a Tracery grammar is defined by the text that’s put into that grammar. A level generator can only work with the tiles it is given.
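
As a tiny illustration, here’s a sketch using the pytracery port (assuming its Grammar and flatten API; the rules here are made-up examples): swap out the rule text and you get a completely different generator without changing a line of logic.

```python
import tracery

# The generator's behavior lives in this data, not in the code around it.
rules = {
    "origin": "The #adjective# #creature# guards the #place#.",
    "adjective": ["rusted", "gleaming", "half-finished"],
    "creature": ["automaton", "golem", "cartographer"],
    "place": ["archive", "greenhouse", "signal tower"],
}

grammar = tracery.Grammar(rules)
print(grammar.flatten("#origin#"))
```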

This even holds in a larger sense: for example, if transitions between terrain types are part of a tiled map generator, then the shape and kind of transitions that are allowed are dictated by what art assets are available. These relationships are sometimes addressed programmatically–you can always add more generativity within generativity, after all–but the general idea remains.
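
In the same spirit, a toy illustration (with made-up terrain names and filenames): the set of legal adjacencies in a tile map generator can literally be read off the transition art that exists.

```python
# The transitions the generator may draw are exactly the transition tiles
# the artists actually provided.
transition_tiles = {
    ("grass", "water"): "grass_water_shore.png",
    ("grass", "sand"): "grass_sand_edge.png",
    # There is no ("water", "lava") tile, so the generator simply can't
    # put those two terrains next to each other.
}

def can_be_adjacent(a, b):
    return a == b or (a, b) in transition_tiles or (b, a) in transition_tiles
```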

An artist doesn’t have to touch the code to have an influence on a generator.








Google Assistant

I was a bit surprised when the notification popped up to tell me that my new panorama was ready. I’d shot the photos for it, of course, but I hadn’t gotten around to putting them together. And then it followed that up with an animation assembled from a burst of photos, and put a filter on yet more photos.

I’m not sure when my Google account started doing this. I think I might have turned Google Assistant on accidentally. Its ability to detect what I intended to do with the photos is a bit unsettling.

This kind of AI-assistant automation is becoming more prevalent, no doubt inching towards the Star Trek computer we always wanted. But, for me personally, this is the first time an AI I didn’t know has given me a present.

I’ve had bots give me presents before, of course: Appropriate Tributes sent a tweet to me the other day, for example. But this is like getting a surprise present from a stranger.

It’s not an entirely bad feeling: there is some surprise and delight at the results. But it is unpleasant to have a stranger show up with a present, and it feels socially off.

I assume that I technically did opt-in, or at least I think I remember enabling some new Google service without realizing that it did this. And I have no idea what else it’s doing: what if it decides that it doesn’t like my photos or that I don’t need to own a copy of my own files and deletes them?

Don’t laugh: this actually happened with iTunes.

So there are many ethical and social issues with AI, particularly when it starts directly managing our things and inserting itself into our lives without waiting for explicit permission.

Though the point where I’m really going to feel weird is when the bot pops up and tells me that it’s written my next blog post for me…




“Experimental Creative Writing with the Vectorized Word” by Allison Parrish

I’ve been a bit busy (more on that in future posts) but I had to stop and watch this talk by Allison Parrish about using word vectors for creative writing.

It’s a good introduction to both word vectors and some things you can do with them. Colors! Distributional hypothesis! Visualizing word vectors, which is a really neat way to learn about them!

The overall theme of the talk is treating written language as a smooth space rather than a sequence of discrete words, so you can do things like apply a blur to a paragraph. The analogy of a theremin for text is both a striking image and a call for a different kind of relationship with text, drawing on ideas from Amiri Baraka.
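
To make the “blur a paragraph” idea concrete, here’s a rough numpy sketch (my own illustration, not code from the talk): treat a passage as a sequence of embedding vectors, smooth each vector toward its neighbors, and then map each smoothed vector back to the nearest word in the embedding model.

```python
import numpy as np

def blur_vectors(vectors, kernel=(0.25, 0.5, 0.25)):
    """Replace each word vector with a weighted average of its neighbors."""
    padded = np.pad(vectors, ((1, 1), (0, 0)), mode="edge")
    return sum(w * padded[i:i + len(vectors)] for i, w in enumerate(kernel))

# Stand-in for real embeddings (in practice, rows pulled from something
# like GloVe or word2vec); to get words back out, you'd look up the
# nearest-neighbor word for each smoothed row.
sentence_vectors = np.random.rand(8, 50)
smoothed = blur_vectors(sentence_vectors)
```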

It’s especially worth watching for the poetry reading. The audible performance highlights how sound and emotion are essential parts of language, yet tend to get less attention in some corners of the generative space.

https://www.youtube.com/watch?v=L3D0JEA1Jdc




Possibility Space: Videos About Procedural Generation

Mike Cook has started a video series about procedural generation. The first video is about character generation in the just-released Heat Signature.

As the founder of ProcJam and long-time researcher in the procgen space, Mike Cook is an expert at talking about this stuff, and I’m excited to find out what topics he’s going to cover.

You should definitely give it a look: https://www.youtube.com/watch?v=y-3xJ86zNaQ




Warping

Being able to deform things is one of the basic building blocks of generative graphics. Transforms in general are useful, and warped distortions add interest in an unpredictable but structured way.

This article by Íñigo Quílez is a tutorial on how to use noise-based warping with fractal Brownian motion (fBM). Code is included.

Like many fractal techniques, this warping smoothly interpolates as the input parameters change. This means that it can be used for animations and blending–and is easier to dial in to the exact look you want.

It also means that you can combine it with other techniques more easily–like I said, deformation is a basic generative building block. You can use it to warp other textures, animate a fire, subtly add chaos to the spawning regions of your a-life creatures, or affect the local parameters in your level generator.
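
If you’d like to play with the idea outside of a shader, here’s a rough Python sketch of the same pattern: build a smooth noise, layer it into fBM, and feed the warped coordinates back in. The hash, the channel offsets, and the warp strength of 4.0 are arbitrary choices for illustration, not values from the article.

```python
import math

def hash2(ix, iy, seed=1337):
    """Deterministic value in [0, 1) for an integer lattice point."""
    n = (ix * 374761393 + iy * 668265263 + seed) & 0xFFFFFFFF
    n = ((n ^ (n >> 13)) * 1274126177) & 0xFFFFFFFF
    return (n ^ (n >> 16)) / 2**32

def value_noise(x, y):
    """Smooth 2D value noise: bilinear blend of hashed lattice corners."""
    ix, iy = math.floor(x), math.floor(y)
    tx, ty = x - ix, y - iy
    tx, ty = tx * tx * (3 - 2 * tx), ty * ty * (3 - 2 * ty)  # smoothstep
    a, b = hash2(ix, iy), hash2(ix + 1, iy)
    c, d = hash2(ix, iy + 1), hash2(ix + 1, iy + 1)
    top = a + (b - a) * tx
    bottom = c + (d - c) * tx
    return top + (bottom - top) * ty

def fbm(x, y, octaves=4):
    """Fractal Brownian Motion: layered noise at doubling frequencies."""
    total, amp, freq = 0.0, 0.5, 1.0
    for _ in range(octaves):
        total += amp * value_noise(x * freq, y * freq)
        amp *= 0.5
        freq *= 2.0
    return total

def warped(x, y):
    # Domain warping: offset the sample point by fbm of the point itself,
    # then sample fbm again at the warped location.
    qx = fbm(x, y)
    qy = fbm(x + 5.2, y + 1.3)  # arbitrary offsets decorrelate the channels
    return fbm(x + 4.0 * qx, y + 4.0 * qy)
```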

http://www.iquilezles.org/www/articles/warp/warp.htm