Generated Poetry: X except its Y
Poetry generated using X except its Y on permutations of the lyrics from the first 4 Nine Inch Nails albums.
It was part of enkiv2's National Poetry Generation Month work for 2017. It uses word2vec to combine a source text with a trained style, and it's quite effective.
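I haven't read the project's internals closely, but the general idea of restyling a text with word vectors can be sketched as: for each word in the source, substitute its nearest neighbor (by cosine similarity) drawn from a "style" vocabulary. The vectors and vocabulary below are made up for illustration; a real version would use a trained word2vec model.

```python
import math

# Toy word vectors standing in for a trained word2vec model.
# These values are invented for illustration only.
vectors = {
    "broken": [0.9, 0.1, 0.0],
    "shattered": [0.85, 0.2, 0.05],
    "hollow": [0.1, 0.9, 0.1],
    "empty": [0.15, 0.85, 0.2],
    "machine": [0.0, 0.1, 0.9],
    "engine": [0.05, 0.15, 0.88],
}

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def restyle(line, style_vocab):
    """Replace each known word with its nearest neighbor from the style vocabulary."""
    out = []
    for word in line.split():
        vec = vectors.get(word)
        if vec is None:
            out.append(word)  # unknown words pass through unchanged
            continue
        best = max(style_vocab, key=lambda w: cosine(vec, vectors[w]))
        out.append(best)
    return " ".join(out)

print(restyle("broken hollow machine", ["shattered", "empty", "engine"]))
# → "shattered empty engine"
```

With a real model the style vocabulary would come from the second text, which is how a line keeps the source's shape while taking on the other text's word choices.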
Though I think that even in its most striking imagery it's a bit prone to plagiarism, as a side effect of how it rotates through the combinations. As you can see, some of the lines reproduce the original lyrics word for word.
Plagiarism, in the generative sense, is what happens when an algorithm trained on input data, such as a Markov chain, outputs text verbatim from its source data.
Markov chains with too little training data or too high an order are particularly prone to this problem. But other algorithms are as well; it's particularly tricky with many kinds of machine learning, and it's one reason why it is important to keep the training and validation sets separate from the test set.
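To see why small data makes a Markov chain regurgitate its source, here's a minimal sketch (not the project's code): with a tiny corpus, nearly every length-2 prefix has exactly one possible continuation, so "generation" can only replay the input.

```python
import random
from collections import defaultdict

def build_chain(words, order=2):
    """Map each length-`order` prefix to the list of words that follow it."""
    chain = defaultdict(list)
    for i in range(len(words) - order):
        key = tuple(words[i:i + order])
        chain[key].append(words[i + order])
    return chain

def generate(chain, order=2, length=12, seed=0):
    """Walk the chain from a random starting prefix."""
    random.seed(seed)
    key = random.choice(sorted(chain.keys()))
    out = list(key)
    for _ in range(length - order):
        choices = chain.get(tuple(out[-order:]))
        if not choices:
            break  # dead end: this prefix only appeared at the end of the corpus
        out.append(random.choice(choices))
    return " ".join(out)

# A tiny corpus: most prefixes here have only one continuation,
# so the output is largely a verbatim stretch of the source.
corpus = "i want to know everything i want to be everywhere".split()
chain = build_chain(corpus)
print(generate(chain))
```

Only the prefix ("want", "to") has two continuations here; everything else is forced, which is exactly the verbatim-output failure mode. More data, or a lower order, gives each prefix more continuations and the output more room to diverge from the source.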
These X except its Y results aren’t quite the same as generative plagiarism in that sense. Indeed, the effect of combining the two texts is part of the point, I think. But the concept comes up a lot and it’s worth critiquing results with it in mind.
Does the recurrence of a familiar line sufficiently counterbalance the way it exposes the limitations of the generation? For me, it tends to highlight how the original uses repetition to build a resonance into its structure that the generated poetry is unaware of. But part of generative poetry's draw is exactly how it can take the familiar and recontextualize it.
How do the dissonantly different word choices change the effect of the lyrics?
https://github.com/enkiv2/misc/tree/master/napogenmo2017