StyleNet

This is apparently the year for neural networks in algorithmic art. A week or so ago, Leon Gatys, Alexander Ecker, and Matthias Bethge posted a research paper entitled ‘A Neural Algorithm of Artistic Style’.
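At its core, the Gatys et al. approach optimizes a generated image so that its CNN feature maps match the *content* of a photograph at one layer while its Gram-matrix feature correlations match the *style* of a painting across several layers. Here is a rough NumPy sketch of that combined loss, assuming feature maps have already been extracted from a pretrained network and flattened to channels × positions arrays; the layer names and the weights `alpha`/`beta` are illustrative placeholders, not the paper's exact settings:

```python
import numpy as np

def gram_matrix(features):
    """Gram matrix of a (channels, positions) feature map:
    correlations between filter responses, used as the style representation."""
    return features @ features.T

def style_transfer_loss(content_feats, style_grams, generated_feats,
                        content_layer, style_layers,
                        alpha=1.0, beta=1e3):
    """Weighted sum of a content term (feature-map difference at one layer)
    and a style term (Gram-matrix difference averaged over several layers).
    A sketch of the Gatys et al. formulation, not their reference code."""
    # Content loss: how far the generated image's features are from the photo's.
    diff = generated_feats[content_layer] - content_feats[content_layer]
    content_loss = 0.5 * np.sum(diff ** 2)

    # Style loss: how far the generated image's Gram matrices are from the painting's.
    style_loss = 0.0
    for layer in style_layers:
        G = gram_matrix(generated_feats[layer])
        A = style_grams[layer]
        N, M = generated_feats[layer].shape  # channels, spatial positions
        style_loss += np.sum((G - A) ** 2) / (4.0 * N ** 2 * M ** 2)
    style_loss /= len(style_layers)

    return alpha * content_loss + beta * style_loss
```

In the paper this loss is minimized with gradient descent on the pixels of the generated image, starting from noise, so the same machinery that classifies images ends up painting them.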

Alex J. Champandard used the technique to produce the video above. He has also set up a Twitter bot that will take your photos and render them in the style of famous paintings. There is, of course, also a #StyleNet hashtag on Twitter for other people’s experiments.

There are already multiple implementations with source code available on GitHub, by JC Johnson, Brandon Amos (crossed with DeepDream), Kai Sheng Tai, Anders Boesen Lindbo Larsen, and Eben Olson.

At this point, the exploration of the artistic applications of StyleNet is still in its very early days. Some of the painting transformations are astonishingly effective, while others are less so. But so far the experiments are centered on relatively straightforward uses of the algorithm. There is a lot of room for software artists to take this as a starting point and springboard into some totally unexpected applications.

(video via https://www.youtube.com/watch?v=56CoHGxRg7c)