fyprocessing:

(video: Skyline on Vimeo, from Raven Kwok)

Skyline is a code-based generative music video I directed and programmed for the track Skyline (itunes.apple.com/us/album/skyline-single/id1039135793) by Karma Fields (soundcloud.com/karmafields). The entire music video consists of multiple stages, all programmed and generated in Processing.

One of the core principles for generating the visual patterns in Skyline is Voronoi tessellation. This geometric model dates back to 1644 and René Descartes’s vortex theory of planetary motion, and it has been widely used by computational artists, including Robert Hodgin (vimeo.com/207637), Frederik Vanhoutte (vimeo.com/86820638), Diana Lange (flickr.com/photos/dianalange/sets/72157629453008849/), and Jon McCormack (jonmccormack.info/~jonmc/sa/artworks/voronoi-wall/).
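To make the model concrete, here is a minimal brute-force Voronoi sketch in Processing. It is my illustration of the general technique, not code from the video: every pixel is assigned to its nearest seed point, and coloring pixels by seed partitions the canvas into cells.

// Minimal brute-force Voronoi tessellation (an illustration of the
// underlying model, not Kwok's actual implementation).
int numSeeds = 24;
PVector[] seeds = new PVector[numSeeds];
color[] shades = new color[numSeeds];

void setup() {
  size(640, 360);
  for (int i = 0; i < numSeeds; i++) {
    seeds[i] = new PVector(random(width), random(height));
    shades[i] = color(random(60, 220)); // a random gray per cell
  }
  noLoop();
}

void draw() {
  loadPixels();
  for (int y = 0; y < height; y++) {
    for (int x = 0; x < width; x++) {
      // Each pixel belongs to the cell of its nearest seed.
      int nearest = 0;
      float best = dist(x, y, seeds[0].x, seeds[0].y);
      for (int i = 1; i < numSeeds; i++) {
        float d = dist(x, y, seeds[i].x, seeds[i].y);
        if (d < best) { best = d; nearest = i; }
      }
      pixels[y * width + x] = shades[nearest];
    }
  }
  updatePixels();
  // Mark the seeds themselves.
  stroke(255);
  strokeWeight(4);
  for (PVector s : seeds) point(s.x, s.y);
}

A production system would typically compute the actual cell polygons rather than sampling every pixel, but the nearest-seed rule above is the entire model.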

In Skyline’s systems, the seeds that generate the diagram are sorted into various types of agents, each following certain behaviors and appearance transformations. They are driven either by the song’s audio spectrum, mapped through different customized layouts, or by an animated sequence of the vocalist, collectively forming a complex, organic outcome.
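As a rough sketch of that idea (again, not the production system), the Processing code below treats each seed as an agent tied to a single spectrum band. Perlin noise stands in for FFT band energies so the sketch runs without an audio file, and the Agent class with its band and energy fields is a hypothetical naming of mine. Each band’s level drives its agent’s speed and its cell’s brightness.

// Spectrum-driven Voronoi agents, simplified. noise() fakes the audio
// spectrum so the sketch is self-contained; in practice the per-band
// energies would come from an FFT of the track.
int numAgents = 16;
Agent[] agents = new Agent[numAgents];
int cell = 4; // coarse sampling keeps the brute-force Voronoi real-time

class Agent {
  PVector pos, vel;
  int band;     // hypothetical spectrum band this agent listens to
  float energy; // current level of that band, roughly 0..1

  Agent(int band) {
    this.band = band;
    pos = new PVector(random(width), random(height));
    vel = PVector.random2D();
  }

  void update() {
    // Stand-in for an FFT band magnitude: one noise channel per band.
    energy = noise(band * 10.0, frameCount * 0.02);
    pos.add(PVector.mult(vel, 0.5 + 4 * energy)); // louder band -> faster seed
    // Wrap around the canvas edges.
    pos.x = (pos.x + width) % width;
    pos.y = (pos.y + height) % height;
  }
}

void setup() {
  size(640, 360);
  noStroke();
  for (int i = 0; i < numAgents; i++) agents[i] = new Agent(i);
}

void draw() {
  for (Agent a : agents) a.update();
  // Coarse nearest-seed pass: each block takes its agent's energy as brightness.
  for (int y = 0; y < height; y += cell) {
    for (int x = 0; x < width; x += cell) {
      Agent nearest = agents[0];
      float best = dist(x, y, nearest.pos.x, nearest.pos.y);
      for (int i = 1; i < numAgents; i++) {
        float d = dist(x, y, agents[i].pos.x, agents[i].pos.y);
        if (d < best) { best = d; nearest = agents[i]; }
      }
      fill(40 + 215 * nearest.energy);
      rect(x, y, cell, cell);
    }
  }
}

In a real version, the noise() call would be replaced by per-band magnitudes from an FFT of the song, and the vocalist-driven stages would instead move the seeds along positions extracted from the animated sequence.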

Behind the Scenes on Skyline’s Generative Music Video

A look at the process behind the processes of one of Raven Kwok’s generative music videos.

I’m fond of generative music visualization. It’s a kind of code-based synesthesia, reacting to the sound and translating it into another form.

Voronoi tessellation comes up a lot, too. It’s one of the basic building blocks for constructing more complex structures.